San Francisco: Electric vehicle company Tesla has released a white paper describing a new standard for its Dojo supercomputing platform.
CEO Elon Musk teased the paper as “more important than it may seem”, reports Electrek.
In the abstract, the automaker describes a standard designed to work with its computing platform.
“This standard specifies Tesla arithmetic formats and methods for the new 8-bit and 16-bit binary floating-point arithmetic in computer programming environments for deep learning neural network training,” according to the report.
“This standard also specifies exception conditions and the status flags thereof. An implementation of a floating-point system conforming to this standard may be realized entirely in software, entirely in hardware, or in any combination of software and hardware,” it added.
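To illustrate what such a low-precision format involves, here is a minimal Python sketch of an 8-bit binary float with 1 sign bit, 4 exponent bits and 3 mantissa bits (a common "E4M3"-style layout, shown purely as an assumption for illustration). The actual bit layouts, rounding modes and exception handling Tesla specifies are defined in its white paper, not reproduced here.

```python
import math

def encode_fp8(x: float) -> int:
    """Encode x into an illustrative 8-bit float: 1 sign, 4 exponent, 3 mantissa bits.

    This E4M3-style layout with an exponent bias of 7 is only an example of a
    low-precision training format; it is not taken from Tesla's specification.
    """
    sign = 0x80 if x < 0 else 0x00
    x = abs(x)
    if x == 0:
        return sign                      # signed zero
    m, e = math.frexp(x)                 # x = m * 2**e with m in [0.5, 1)
    m, e = m * 2, e - 1                  # renormalise so m is in [1, 2)
    biased = e + 7                       # apply the exponent bias
    if biased <= 0:
        return sign                      # underflow: flush to zero (no subnormals here)
    mant = round((m - 1) * 8)            # quantise the fraction to 3 bits
    if mant == 8:                        # rounding carried into the exponent
        mant, biased = 0, biased + 1
    if biased >= 0b1111:                 # clamp overflow to the largest finite value
        biased, mant = 0b1111, 0b111
    return sign | (biased << 3) | mant

def decode_fp8(b: int) -> float:
    """Decode the 8-bit pattern produced above back into a Python float."""
    sign = -1.0 if b & 0x80 else 1.0
    biased = (b >> 3) & 0b1111
    mant = b & 0b111
    if biased == 0 and mant == 0:
        return sign * 0.0
    return sign * (1 + mant / 8) * 2 ** (biased - 7)

if __name__ == "__main__":
    for v in (0.15, 1.0, -3.7, 100.0):
        enc = encode_fp8(v)
        print(f"{v:>8} -> 0b{enc:08b} -> {decode_fp8(enc):g}")
```

The round trip makes the trade-off visible: 0.15 comes back as 0.15625 and 100.0 as 96, the kind of precision loss that a training-oriented standard has to account for with defined rounding and exception behaviour.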
For years now, Tesla has been teasing the development of a new in-house supercomputer optimised for neural net video training.
Tesla handles an enormous amount of video data from its fleet of over 1 million vehicles, which it uses to train its neural nets.
Over the last two years, Musk has repeatedly referred to this in-house supercomputer as "Dojo".
Last year, he even teased that Tesla's Dojo would have a capacity of over an exaflop, which is one quintillion (10^18) floating-point operations per second, or 1,000 petaFLOPS.
The automaker already has its Dojo chip and tile, but it is still working on building its full rack to create the supercomputer.