TensorFlow

TensorFlow can be used in a wide variety of programming languages, including Python, JavaScript, C++, and Java,[12] facilitating its use in a range of applications in many sectors.

Google assigned multiple computer scientists, including Jeff Dean, to simplify and refactor the codebase of DistBelief into a faster, more robust application-grade library, which became TensorFlow.

In 2009, the team, led by Geoffrey Hinton, had implemented generalized backpropagation and other improvements, which allowed the generation of neural networks with substantially higher accuracy, for instance a 25% reduction in errors in speech recognition.

Its flexible architecture allows for easy deployment of computation across a variety of platforms (CPUs, GPUs, TPUs), and from desktops to clusters of servers to mobile and edge devices.
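A minimal sketch of this portability, assuming the `tensorflow` Python package is installed: the same computation runs unchanged wherever the device string points, and the CPU line below would become `"/GPU:0"` on a machine with a GPU.

```python
# Sketch of explicit device placement (assumes `tensorflow` is installed;
# only the CPU is used here, so it runs on any machine).
import tensorflow as tf

# List the devices TensorFlow can see on this machine.
print(tf.config.list_physical_devices())

# Pin a computation to the CPU; identical code would run on "/GPU:0"
# or a TPU device if such hardware were present.
with tf.device("/CPU:0"):
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    identity = tf.constant([[1.0, 0.0], [0.0, 1.0]])
    product = tf.matmul(a, identity)

print(product.numpy())  # multiplying by the identity returns `a` unchanged
```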

In May 2016, Google announced its Tensor processing unit (TPU), an application-specific integrated circuit (ASIC, a hardware chip) built specifically for machine learning and tailored for TensorFlow.

A TPU is a programmable AI accelerator designed to provide high throughput of low-precision arithmetic (e.g., 8-bit), and oriented toward using or running models rather than training them.
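The trade-off behind low-precision arithmetic can be illustrated with a plain NumPy sketch (this is not TPU code, just an illustration of 8-bit quantization): float weights are mapped to small integers via a scale factor, which are cheap to store and multiply, at the cost of a bounded rounding error.

```python
# Illustrative sketch of 8-bit quantization (plain NumPy, not actual
# TPU code): map float32 weights into int8 with a shared scale factor.
import numpy as np

weights = np.array([0.12, -0.8, 0.55, 0.03], dtype=np.float32)

scale = np.abs(weights).max() / 127.0            # largest magnitude maps to 127
quantized = np.round(weights / scale).astype(np.int8)
dequantized = quantized.astype(np.float32) * scale

print(quantized)     # small integers: cheap to store and multiply
print(dequantized)   # close to, but not exactly, the original values
```

The rounding error per weight is at most half the scale factor, which is why inference (running models) tolerates 8-bit arithmetic better than training does.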

Google announced they had been running TPUs inside their data centers for more than a year, and had found them to deliver an order of magnitude better-optimized performance per watt for machine learning.

In May 2018, Google announced the third-generation TPUs, delivering up to 420 teraflops of performance and 128 GB of high-bandwidth memory (HBM).

Other major changes included removal of old libraries, cross-compatibility between trained models on different versions of TensorFlow, and significant improvements to the performance on GPU.

This distributed computing can often speed up the training and evaluation of TensorFlow models and is a common practice in the field of AI.

Among these operations are variations of convolutions (1D/2D/3D, atrous, depthwise) and activation functions such as Softmax, ReLU, GELU, and Sigmoid.
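A brief sketch of a few of these operations, assuming `tensorflow` is installed: a 1D convolution sliding a two-tap kernel over a signal, plus the ReLU and Softmax activations.

```python
# Sketch: a 1D convolution and two activation functions from tf.nn.
import tensorflow as tf

# Activations operate elementwise (ReLU) or normalize to probabilities (Softmax).
x = tf.constant([-2.0, 0.0, 3.0])
relu_out = tf.nn.relu(x)         # negative values clipped to zero
softmax_out = tf.nn.softmax(x)   # outputs sum to 1.0

# conv1d expects (batch, width, channels) input and (width, in, out) filters.
signal = tf.reshape(tf.constant([1.0, 2.0, 3.0, 4.0]), (1, 4, 1))
kernel = tf.reshape(tf.constant([1.0, 1.0]), (2, 1, 1))
conv = tf.nn.conv1d(signal, kernel, stride=1, padding="VALID")

print(relu_out.numpy())               # [0. 0. 3.]
print(tf.squeeze(conv).numpy())       # sliding pairwise sums: [3. 5. 7.]
```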

TensorFlow offers a set of optimizers for training neural networks, including Adam, Adagrad, and stochastic gradient descent (SGD).
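A single optimizer step can be sketched as follows, assuming `tensorflow` is installed; SGD is used here on the toy loss f(w) = w², and Adam or Adagrad would plug into the same `apply_gradients` call.

```python
# Sketch: one SGD step on f(w) = w^2, whose gradient is 2w.
import tensorflow as tf

w = tf.Variable(3.0)
opt = tf.keras.optimizers.SGD(learning_rate=0.1)

with tf.GradientTape() as tape:
    loss = w * w
grads = tape.gradient(loss, [w])      # df/dw = 2 * 3.0 = 6.0
opt.apply_gradients(zip(grads, [w]))

print(w.numpy())  # 3.0 - 0.1 * 6.0 = 2.4
```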

Third-party language binding packages are also available for C#,[49][50] Haskell,[51] Julia,[52] MATLAB,[53] Object Pascal,[54] R,[55] Scala,[56] Rust,[57] OCaml,[58] and Crystal.

NumPy ndarrays, NumPy's native datatype, are automatically converted to TensorFlow Tensors in TF operations; the reverse is also true.
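This interoperability can be seen in a short sketch, assuming both `numpy` and `tensorflow` are installed: a TF op accepts an ndarray directly, and a NumPy ufunc accepts the resulting Tensor.

```python
# Sketch: automatic conversion between NumPy ndarrays and TF Tensors.
import numpy as np
import tensorflow as tf

arr = np.array([1.0, 2.0, 3.0])
tensor = tf.square(arr)               # ndarray accepted by a TF op -> Tensor
roundtrip = np.multiply(tensor, 2.0)  # Tensor accepted by a NumPy op -> ndarray

print(tensor.numpy())   # [1. 4. 9.]
print(roundtrip)        # [ 2.  8. 18.]
```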

GE Healthcare used TensorFlow to increase the speed and accuracy of MRIs in identifying specific body parts.

Google used TensorFlow to create DermAssist, a free mobile application that allows users to take pictures of their skin and identify potential health complications.

Sinovation Ventures used TensorFlow to identify and classify eye diseases from optical coherence tomography (OCT) scans.

TensorFlow was used to accurately assess a student's current abilities and to help decide the best future content to show based on those capabilities.

The cosmetics company ModiFace used TensorFlow to create an augmented reality experience for customers to test various shades of make-up on their face.