Graphics card company Nvidia now offers a solution for smaller businesses and research teams that need computing power for machine learning or data science workloads. According to the announcement, the Nvidia DGX Station A100 workgroup server delivers 2.5 petaflops of "AI performance" and up to 320 gigabytes of GPU (graphics processing unit) memory, making it four times faster than the previous generation of DGX Station.
The workgroup server also supports Nvidia's multi-instance GPU (MIG) technology. This means a single station can be partitioned into up to 28 separate GPU instances, letting it run parallel jobs and serve multiple users without affecting system performance.
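To give a sense of what that partitioning looks like from a developer's point of view, here is a minimal CUDA sketch (not part of Nvidia's announcement): each MIG slice is exposed to the CUDA runtime as its own device, so a job pinned to one slice simply enumerates a single GPU that reports only that slice's memory. The program just lists the devices visible to the current process.

    // Minimal sketch: enumerate the GPU devices visible to this process.
    // On an unpartitioned A100 this reports the full card; inside a single
    // MIG instance it reports one device with that slice's memory.
    // Build with: nvcc -o list_devices list_devices.cu
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);
        if (err != cudaSuccess) {
            std::printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            // totalGlobalMem is reported per device, so a MIG slice shows
            // only the memory assigned to it, not the whole A100.
            std::printf("Device %d: %s, %.1f GiB\n", i, prop.name,
                        prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
        }
        return 0;
    }

The same binary run inside different MIG instances would each see a single, smaller device, which is how one station can serve several users side by side.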
“Teams of data scientists and artificial intelligence researchers can accelerate their work by using the same software stack as Nvidia DGX A100 systems so they can easily scale from development to deployment,” said Charlie Boyle, vice president and general manager of DGX Systems at Nvidia.
Standard power and cooling requirements
The DGX Station A100 does not require data-center-grade power or cooling and can therefore be set up easily in an office or laboratory. It nevertheless offers the same remote management functions as Nvidia's DGX A100 data center systems, so system administrators should be able to perform all administrative tasks over a remote connection.
"The DGX Station A100 takes AI out of the data center with a server-class system that can be plugged in anywhere," said Boyle.
GPU memory upgrade for the data center version
Nvidia DGX A100 data center systems sold before the DGX Station A100 will also receive an optional upgrade, according to the announcement. The standard version of these systems now ships with 80 gigabyte GPUs instead of the previous 40 gigabytes. Doubling the GPU memory to 640 gigabytes per system is intended to improve accuracy when working with larger data sets and models.
The 80 gigabyte graphics processing unit. (Source: Nvidia)
The DGX Station A100 and the upgraded DGX A100 systems with 640 gigabytes of GPU memory are expected to be available worldwide this quarter from Nvidia partner network resellers. An upgrade option is available for customers with 320 gigabyte Nvidia DGX A100 systems. Prices have not yet been announced. According to Engadget, the original DGX A100 systems cost US$199,000 at launch.
In mid-September it became known that SoftBank was selling British chip designer Arm to Nvidia. Read more about this $40 billion deal here.