Today MLCommons announced new results from two industry-standard MLPerf benchmark suites: Training v3.0, which measures the performance of training machine learning models, and Tiny v1.1, which measures how quickly a trained neural network can process new data on extremely low-power devices in the smallest form factors. To view the results and to find additional […]
SK Telecom Doubles Capacity of ‘Titan’ Supercomputer for ChatGPT-like Language Model
SK Telecom of South Korea has roughly doubled the power of its “Titan” AI supercomputer by expanding its capacity to 1,040 NVIDIA A100 GPUs. According to a story on the Korean news site Pulse, SKT announced the upgrade on Sunday. The supercomputer powers the company’s artificial intelligence model, called […]