August 17, 2017 at 12:55 pm #143601
I was searching online for more answers but can't seem to find any, and when I call the support team for more info they just send me the link to the minimum hardware requirements specs.
We are planning a huge hardware upgrade for our in-store designers and I wanted to know about the hardware needs of 2020.
First of all, I wanted to know whether 2020 Design is more CPU-hungry or GPU-hungry; the designers usually render a lot with it.
I also wanted to know whether 2020 Design will actually use more than 4 cores, since I was planning to build an AMD Ryzen machine with 8 cores and 16 threads.
Also, does 2020 Design perform better with an AMD GPU or an Nvidia GPU?
August 17, 2017 at 1:33 pm #143602
2020 is alternately GPU- and then CPU-hungry. For any renderings that come up inside the software, such as the images shown when accessing attributes or the initial "working" rendering, 2020 uses the GPU. For the final "Hi-Res" rendering it uses the CPU.
I do not believe that 2020 is currently multi-threaded, so it will probably not take full advantage of 8 cores and 16 threads. That said, however, you will get better final rendering performance out of a higher-end CPU. Personally I prefer the Intel i7s over any AMD processor, but that is just me.
It has been my experience as a support tech here at 2020 that the software performs best on the Nvidia Geforce cards but again, that is personal opinion, not corporate policy.
I hope that is what you were looking for.
August 20, 2017 at 9:13 pm #143858
Just FYI, the Ryzen might look like a better processor for the price, and I haven't tested it myself, but in previous testing the AMD processors fell FAR short of the Intel ones when it came to general 2020 use and rendering. IMHO, I'd always go for the fastest Intel i7 I could afford.
August 21, 2017 at 9:43 am #143859
I will give the Ryzen platform a test, since it is a new architecture and untested waters compared to the old AMD processors; I will test it against one of the top i7s in the same range.
Could you provide me a little more info about the GPU? I really don't want to test 4 GPUs, haha.
I was thinking about testing a 1070, a 1080 and a 1080 Ti.
August 21, 2017 at 8:33 pm #144305
I tend to favour Nvidia over AMD, but there really doesn't seem to be a lot in it. As an example, we have just built an i7 7820 desktop with an SSD, 32GB RAM and an Nvidia Quadro 2000 5GB, and it's only about 15-20% faster on renders than my Toshiba notebook with an i7 4710, 16GB of RAM and an AMD R9 M200 with 2GB.
The rendering engine in 2020 is a third party product and really not particularly good.
November 23, 2017 at 1:17 pm #161547 - shulem msokovits (Participant)
I just got a new workstation for 2020 rendering; the specs are as follows:
GTX 1050 Ti
500Gb m.2 SSD
When I do renderings the CPU goes up to 100% and it barely uses the GPU. Are there any settings I need to configure so that it renders on the GPU?
November 23, 2017 at 8:11 pm #161612
I too find that 2020 barely uses the GPU. V11.7 onwards is a bit better at rendering than earlier versions, as the rendering engine has been split off, but it still seems (under Windows 10) to be unable to detect and use the GPU adequately.
Ensure that the Nvidia is set as the default graphics processor for Design.exe and LeRendu.exe (actually, make it the default for everything!).
Let's hope this changes soon, as most of us have invested in good GPUs only to find that it is mainly the main processor that does all the work!
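For anyone who has to do this on a bunch of machines: on newer Windows 10 builds (1803 and later), the per-app GPU preference is stored as a string value under `HKCU\Software\Microsoft\DirectX\UserGpuPreferences`, named after the full exe path, so the Control Panel clicking can be scripted. A minimal sketch; the helper name and the install paths are my assumptions, and actually writing the value needs `winreg` on a Windows box:

```python
# Hypothetical helper for automating the "default GPU per app" setting.
# Windows 10 1803+ stores per-app GPU preference under
# HKCU\Software\Microsoft\DirectX\UserGpuPreferences as a string value named
# after the full exe path, with data like "GpuPreference=2;".
# Writing it needs winreg on Windows; this just builds the entries.

GPU_PREF_KEY = r"Software\Microsoft\DirectX\UserGpuPreferences"

def gpu_preference_entry(exe_path, high_performance=True):
    """Return the (value name, value data) pair for one executable."""
    pref = 2 if high_performance else 1  # 1 = power saving, 2 = high performance
    return exe_path, "GpuPreference={};".format(pref)

# The install paths below are assumptions -- check your own install directory.
for exe in (r"C:\Program Files (x86)\2020\Design\Design.exe",
            r"C:\Program Files (x86)\2020\Design\LeRendu.exe"):
    name, data = gpu_preference_entry(exe)
    print('{}\\{} = "{}"'.format(GPU_PREF_KEY, name, data))
```

On builds older than 1803 (or for the "make it the default for everything" approach), the Nvidia Control Panel's global 3D settings remain the way to go.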
January 4, 2018 at 6:47 pm #165104
Hi everyone – pardon the account. My wife’s the designer and I’m the husband who works in IT.
I repurposed an old server for her a few years ago with a pair of Xeon X5650 (2.66GHz, 6c/12t) processors in it, for a total of 24 process threads. It plowed through renders but had some other issues; namely, it's a server-class system that didn't support things like standby or audio very well. As a workstation it wasn't cutting it, and at nearly 7 years old it was throwing some other odd issues.
I rebuilt the machine on a bit of a budget this time around using an i7 7700K (4.2GHz, 4c/8t); there was a good price on Amazon at the time. All-around this computer is a superior workstation; however, its HQ render times increased about four-fold. Well, I went from 24 threads to 8 threads... makes sense to me.
Today, one of my colleagues had a Dell Precision with a Xeon W-2104 (3.2GHz, 4c/8t) in it delivered. I figured I'd try to do some benchmarking before putting it into production.
Consider all other factors equal: NVMe SSDs, 16-48GB RAM (anything over 16GB has no effect that I can tell), GTX 1060 or Quadro P2000 video cards, etc. I'm only looking at raw compute speed in HQ rendering mode. The attached is a test kitchen she whipped up real fast just as a rendering test (maybe it's a standard demo? I dunno). It only uses the default catalogs, so anyone can open it.
I was completely blown away that 2020 doesn't have a fixed resolution for rendering HQ images and instead relies on window size. As a side question: is there any way to specify the resolution for an HQ render before opening the HQ render screen?
For the purposes of my benchmarking, the window was simply full-screened (so very close to 1920x1200 or 1920x1080, depending on the rig). I've watched clock speeds and utilization levels across the systems and can confirm it really doesn't care what video card you have for an HQ render; that's 100% CPU-based. The GPU is only used for the fast'n'dirty rendering done while you move your view around.
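To put numbers on that GPU claim rather than eyeballing Task Manager, Nvidia's own CLI can be sampled while a render runs. A small sketch; the `nvidia-smi` query flags are real, but the helper name is mine and it only works where the Nvidia drivers are installed:

```python
import subprocess

def gpu_utilization_percent(raw=None):
    """Return per-GPU utilization percentages.

    `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits`
    prints one integer (percent) per GPU. Pass `raw` to parse captured output,
    or leave it as None to shell out to nvidia-smi directly.
    """
    if raw is None:
        raw = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            text=True)
    return [int(line.strip()) for line in raw.splitlines() if line.strip()]

# e.g. call this in a loop during an HQ render and log the values alongside
# CPU utilization from perfmon to confirm which one is doing the work.
```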
Anywho – here’s what I saw on the various platforms:
- 2x X5650: 44s
- i7 7700K: 2m53s
- W-2104: 6m48s
As best I can tell, the move to Xeon – aside from getting high thread counts – has no positive impact at all on 2020. It may even be a detriment, for some reason. All this thing cares about is the number of threads you can give it.
That said, I'll send a beer's worth of money to anyone out there with a 1950X Threadripper who can run this render for me and tell me how long it took. I really don't want to plunk down another $1100 on her computer if it's not going to be a sizeable improvement.
January 4, 2018 at 7:13 pm #165117
I’ve always pushed the Intel i7 for my users (Xeons were too expensive/not easy to find in a notebook) and AMD just didn’t render at anything like the speed. However, I am also very interested in the Threadripper – it’s an excellent price for such a powerful (on paper) processor. Unfortunately, my budget will not justify buying one as a test bed so I am hoping that your post will bring someone out of the woodwork!
I agree with you on the rendering side – the render seems to max out the processor.
January 4, 2018 at 7:28 pm #165126
Thanks! I’m tapping some of my friends to see if anyone has a rig to test on… else I’m just going to find one as a demo at a store and throw 2020 on it to find out there.
I really wish there was a CLI rendering mode that would let me throw a controlled set of commands at 2020 to ensure the same test every time (view, quality, resolution, etc.)... but that doesn't seem to be something that exists. I'll probably throw 2020 on my desktop at home (i7 7820X) and see how that does. It's 8c/16t, so I imagine it'll be a bit faster than the i7 7700K... maybe that'll help reinforce my thought that 2020 simply cares about thread count.
I mean... those X5650s are old but utterly hammered right through renders. Even at a significantly lower clock speed, it was purely more threads, and every single one pegged out while running a render.
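In the absence of a built-in CLI render mode, a generic wall-clock harness is about the best that can be done; this sketch just times an arbitrary command a few times, so it's only useful for tools that do expose a CLI. The command at the bottom is a placeholder:

```python
import statistics
import subprocess
import sys
import time

def time_command(cmd, runs=3):
    """Run `cmd` `runs` times and return the wall-clock seconds of each run."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True)
        samples.append(time.perf_counter() - start)
    return samples

# Placeholder command -- swap in a real CLI renderer invocation if one exists.
samples = time_command([sys.executable, "-c", "pass"])
print("min {:.2f}s  median {:.2f}s".format(min(samples), statistics.median(samples)))
```

Reporting the minimum and the median (rather than a single run) smooths out background-process noise between rigs.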
January 5, 2018 at 12:42 am #165127
Just ran the test on my 7820X – 1m42s… and that was with other stuff running at the time. It really seems like it’s thread count that matters, to me at least.
This made for a rather interesting chart (attached).
i7 7700k (8 threads @ 4.5GHz) – 173s
i7 7820X (16 threads @ 4.2GHz) – 102s
2x X5650 (24 threads @ 2.66GHz) – 44s
So... the processor with the highest clock speed (the 7700K) came in last. That makes sense: it has the least overall compute, as it has the fewest cores. The processor with the highest compute capability (the 7820X) still took more than twice the time of the ancient Xeons with more cores but significantly lower clock speeds.
This is why I’m so interested in the 16c/32t Threadripper! Could this trend line be true and yield a ~20s render?
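For what it's worth, the three data points above can be fitted to a rough power law to sanity-check that hope. This is strictly back-of-the-envelope: it ignores clock speed, IPC and memory entirely, and three points make for a shaky fit.

```python
# Fit time = a * threads**b to the three benchmark points above, then
# extrapolate to a 32-thread Threadripper. Stdlib-only least squares
# on the log-log transformed data.
import math

data = [(8, 173.0), (16, 102.0), (24, 44.0)]  # (threads, HQ render seconds)

xs = [math.log(t) for t, _ in data]
ys = [math.log(s) for _, s in data]
n = len(data)
xbar = sum(xs) / n
ybar = sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
a = math.exp(ybar - b * xbar)

pred_32 = a * 32 ** b
print("fit: time ~= {:.0f} * threads^{:.2f}".format(a, b))
print("predicted 32-thread render: ~{:.0f}s".format(pred_32))
```

On these numbers the fit lands nearer ~36 s than ~20 s at 32 threads, so a Threadripper should still be a solid gain, just perhaps less dramatic than the straight trend line suggests.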
January 6, 2018 at 3:23 pm #165256 - Jeff Carilli (Participant)
Hi Jon, I'll run it. I have a 6950X with all cores running @ 4.2GHz. I have been wrestling with buying a used E5-2699 V4, which would be a drop-in replacement for me, just to get the core count up.
June 6, 2018 at 1:56 pm #190361
If it's worth adding, I came into an old Precision workstation (R5500) with a pair of Xeon E5620s in it (2.4GHz, 4c/8t each). It ran the test render in 1m28s, which actually lines up pretty darn well with that curve line I had drawn.
Still haven't bitten the bullet and bought a Threadripper yet; however, my wife's workload is expected to increase and I'm still debating whether it's worth doing or not.
July 18, 2018 at 11:01 pm #195494 - The Laminex Group (2020 Partner)
From recent experience, the AMD Ryzen 7 1700 eight-core processor falls short in comparison to the Intel Core i7-6700HQ @ 2.60GHz (current test running a Max High Quality perspective rendering in 11.10: 7+ minutes vs 1-2 minutes).
It was noted in this thread that the recommendation was to go Intel i7; that's not far from the truth in light of recent experience.
July 19, 2018 at 8:28 pm #195519
Yeah, I'd love to support AMD, but I'm afraid the claims just don't seem valid in a working environment (or at least as far as 2020 Design is concerned). About 4 or 5 years back, I watched an i5 Surface Pro absolutely thrash a top-of-the-range (for the time) AMD tower which had been overclocked! The Surface Pro rendered a scene in less than half the time. I can't remember the specs of the AMD system, but I know it was touted as 'state of the art' and cost its owner quite a lot more than the Surface Pro.
This was doubly surprising in that as far as I remember, the Surface Pro used the integrated Intel graphics and the AMD had a separate high spec Radeon card.
If money was no object, I’d love to build a 56 core dual Xeon Desktop to see how good it would be. Maybe if I did the lottery…
September 10, 2018 at 12:09 am #202511
Just did some comparative Panoramic 360 renders on a reworked HP server.
My base system is a Toshiba notebook with an i7-4710HQ at 2.5GHz. This did a Panoramic 360 at standard res in 1 minute 13 seconds. This is a 4-core processor.
The HP server is running dual Xeon E5520s at 2.27GHz (8 cores total) and took 3 minutes 10 seconds!!! Admittedly the E5520s are nearly 10 years old, but that is still a heck of a difference.
So later Intel processors are a LOT better at doing this kind of work.
Now if only I could afford a dual i9-7980XE system – 36 cores at 4.2GHz!!
September 11, 2018 at 6:50 pm #203050 - Mike Cook (Participant)
I have an unusual question: can 2020 be installed on a USB 3 thumb drive which can then be plugged into any computer with the 2020 dongle to run 2020? I don't know if the program installs items outside of the 2020 program folder.
September 11, 2018 at 7:27 pm #203060
I did used to run 2020 V9 as a mobile application, and I am about to test making V11.10 mobile, but I don't expect it to run reliably, if at all. Because Windows applications scatter files everywhere, there are lots of issues that arise when you try to make an application mobile.
There are a number of applications out there to virtualise other applications. I am trying VMWare ThinApp but there is also Turbo Studio and others.
I’ll post on how I get on.
September 11, 2018 at 7:39 pm #203061
Never tried it, but I wouldn't expect it to go well. Have you considered installing it on a "server" and then just giving other PCs permission to remote in? You could use Windows Remote Desktop Connection, or even a service like TeamViewer or GoToMyPC.
Probably safer than moving the dongle around all the time anyway.
September 11, 2018 at 8:03 pm #203065
I did consider this method of giving my Designers access to very high spec PCs to render designs as quickly as possible. You could have maybe 3-4 machines supporting 20+ Designers. It’s still something I’d like to play with and I’ve rolled it out in a basic way at one branch where I have a decent spec i7 system supporting a 55″ 4K TV for the client presentation – it works pretty well but I’d like to get an i9 in there.
I understand where Mike Cook is coming from though as it would be nice to be able to ‘hop on’ any PC and run 2020 without the hassle of installing the package with catalogues etc.
For the future, I think that applications will ALL move to a ‘docker’ environment where the app remains completely independent of the hardware running it. Docker apps run a lot more quickly than virtualised apps as they don’t have to ‘simulate’ aspects of the host hardware.
Imagine a time in the near future where almost every system you come across supports docker along with USB3.1 or Thunderbolt. You could walk in with a single USB 3.1 Flash drive (assuming software licensing for 2020 Design), plug it in and off you go at super fast speed.
It's interesting that Redway, who make the rendering module for the 2020 Design package, were (at one point anyway) looking at a distributed rendering environment where an application could steal processor cycles from other hardware. I'm not sure whether this would be restricted to local hardware or extend to the cloud, and I'd have some concerns as to whether it could actually slow things down given the extreme speed of modern GPUs.
Ideally, the 2020 render engine should be set up to take advantage of a setup equivalent to bitcoin mining rigs, which use machines with (for example) 10 top-end Nvidia cards to accelerate calculating operations. In this way, you could have one machine set up like this, and as soon as a designer needs a very high-res render, 2020 Design would throw the render requirements at the rendering server and get the finished render back in just a few seconds. If the render is still too slow, add a few more Nvidia cards!!
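The dispatch side of that render-server idea is simple enough to sketch. The job names and the `render_job()` body here are entirely hypothetical stand-ins; a real farm would call into the render engine, which would need to cooperate:

```python
# Toy sketch of a render farm dispatcher: designers drop jobs on a pool of
# workers (one per GPU/machine in the imagined setup) and collect results.
from concurrent.futures import ThreadPoolExecutor

def render_job(job_name):
    """Stand-in for a real render call executed on the render server."""
    return "{}: done".format(job_name)

def render_farm(jobs, workers=4):
    """Fan jobs out across `workers`; map() returns results in submission order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_job, jobs))

print(render_farm(["kitchen_hq", "bath_hq", "pano_360"]))
```

A thread pool is only the local analogy, of course; spreading jobs across machines would need a network queue in front of it.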
September 12, 2018 at 9:23 am #203154
WoW! Thanks Mike. Some very cool ideas and info in there.
@otherMike 🙂 – 2020 installs to both ProgramData and Program Files, as well as writing to the Windows Registry. It might be possible to convince it to run from a USB stick, but it would not be a quick or easy process to make it happen, and I don't know how reliably it would work.
September 13, 2018 at 7:27 pm #203656
Just FYI, I tried the VMWare ThinApp approach and it works, so long as you only try to run it on a Windows 7 machine (similar to the one it was created on). If you then try the same app on Win10, it fails. I've no doubt that with a bit of time I could sort this out, but I'm flat out on other things at the moment.
To address the way you are doing it using VMWare Workstation: I use this method for running Catalog Tools (V9) and have tried it with V10, and it works very well. There are a few things to do to make it as smooth as possible.
1) Start with a new Win10 VM config. After installing the OS, debloat it as much as possible or try to pick up a 'lite' version of the OS. There are some excellent debloat scripts around. Once you have used the debloater script, go in and uninstall any games and software you don't want that the script has missed.
2) When defining your VM hardware specifications, use a minimum of 4GB of RAM and 50GB of disk space as well as at least 2 processors. You can change this afterwards depending on the system you will run it on.
3) Install V11 as normal. Once installed, remove any unnecessary catalogues to keep the footprint small.
4) When you run the VM up after installing V11, make sure that you pass the USB security key (if used) to the VMWare session: go to Player, Removable Devices, click on the Rainbow key (or whatever it shows up as) and click Connect.
And that’s pretty much it!
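For reference, the hardware side of step 2 maps onto a handful of lines in the VM's .vmx file (memory in MB, vCPU count, and USB pass-through for the dongle). The key names are standard VMware Workstation settings; the disk filename is a made-up example, and you'd normally let the New VM wizard generate the rest of the file:

```
.encoding = "UTF-8"
memsize = "4096"
numvcpus = "2"
usb.present = "TRUE"
scsi0:0.fileName = "win10-2020design.vmdk"
```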
You may want to consider trying to find 'portable' versions of VMWare Workstation or VirtualBox so that you can have EVERYTHING on the SSD and not have to install VMWare Workstation on a machine prior to using V11.
Feel free to email me at [email protected] if you have any questions.