Artificial intelligence (AI) is the hype and buzzword of the moment, and everybody wants to engage with it somehow, not just as a user but as a contributor.
It’s justified hype, too, but AI is not a newly invented topic. A niche group of number-loving, bright individuals, such as Alan Turing (mathematician), John McCarthy (the computer scientist who coined the term “artificial intelligence”), Arthur Lee Samuel (who popularized the term “machine learning” through his checkers-playing program), George Cybenko (known for the universal approximation theorem and for cybersecurity research), and a few more brilliant souls, had already spent a great deal of time exploring AI, long before many of us were born.
Back then, AI was more a mathematicians’ and statisticians’ game than a data scientist’s (as we would call them today). The concept of AI was not new, but running AI workloads with any ease simply was not possible.
Data was not abundant as it is today (we now generate billions of data points across the globe daily), and the hardware was not conducive to such work. The results were called experiments rather than AI model training and testing as we know them.
Mathematicians could get as far as classic machine learning techniques, but deep learning needed far more capable hardware, so improving AI performance and predicting outcomes accurately was highly challenging. This fundamental lack of hardware eventually brought winter to the AI revolution, and although a new morning arrived in the 1980s, the field still belonged to a niche group of enthusiasts.
With the advent (or rather the proliferation) of high-speed internet from 2000 onwards and the rise of social media platforms, we started sharing images, views, and comments, creating a deluge of data that can support statistical analysis of any kind, whether classic machine learning or deep learning. That was further good news.
Still, the hardware had to catch up to process data and statistical models at that scale. Today, when we talk about AI, we are not talking about supporting researchers; we are talking about user experience (UX), because that is how user-driven today’s technology market is. The speed and accuracy of AI outcomes are therefore imperative. A researcher may wait a day to see results, but a user will wait an average of 20 seconds before losing interest in the product, uninstalling it, and perhaps even leaving an unsavoury review.
Training a model on a historical dataset is crucial for a well-performing AI application, but that training can take a long time (even days) before the model is ready to be served to customer-facing apps.
To reduce this latency and keep up with UX expectations, hardware companies (especially the semiconductor/chip industry) started innovating at full speed and with all their might.
Different processing units/chips came onto the market for different AI/ML model workloads.
CPU (Central Processing Unit):
Even though the CPU was initially considered a cheaper option, suited to slower or lighter workloads and not an excellent match for AI efforts, CPU efficiency has evolved to show its strength.
Below are some significant ones:
- Intel Core Ultra and Xeon are multi-core CPU ranges built for AI workloads, including deep learning.
- AMD also offers AI-capable processors, such as EPYC.
- While the above are suited to heavy input-output operations, lower-end Intel series CPUs can still handle smaller datasets and lighter workloads (see the short sketch after this list).
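To make this concrete, here is a minimal sketch of running a light inference workload entirely on the CPU. It assumes PyTorch purely as an example framework, and the tiny model and batch sizes are hypothetical rather than tied to any particular product.

```python
import os

import torch
import torch.nn as nn

# Let PyTorch use the available CPU cores for intra-op parallelism.
torch.set_num_threads(os.cpu_count() or 1)

# A deliberately small model: light workloads like this run comfortably on a CPU.
model = nn.Sequential(
    nn.Linear(32, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)
model.eval()

# A hypothetical batch of 8 samples with 32 features each.
batch = torch.randn(8, 32)

with torch.no_grad():   # inference only, so no gradients are needed
    logits = model(batch)

print(logits.shape)     # torch.Size([8, 2])
```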
GPU (Graphics Processing Unit):
While CPUs are evolving and becoming powerful enough to meet many AI needs, the entry of GPUs makes the scenario more interesting, especially for workloads involving extensive rendering and deep-learning network training, such as image or voice processing.
- The NVIDIA GeForce RTX series, widely used for training neural network models (a minimal sketch of targeting a GPU follows below).
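In practice, frameworks make it straightforward to use whichever GPU is available. Below is a minimal sketch, assuming PyTorch and a CUDA-capable card such as an RTX GPU: it picks the GPU when one is visible, falls back to the CPU otherwise, and runs a single hypothetical training step on random stand-in data.

```python
import torch
import torch.nn as nn

# Pick the GPU if PyTorch can see one, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Training on: {device}")

# A tiny illustrative network; real deep-learning models are far larger.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Hypothetical random batch standing in for real training data.
inputs = torch.randn(64, 128, device=device)
targets = torch.randint(0, 10, (64,), device=device)

# One training step: forward pass, loss, backward pass, parameter update.
optimizer.zero_grad()
loss = loss_fn(model(inputs), targets)
loss.backward()
optimizer.step()
print(f"Loss after one step: {loss.item():.4f}")
```

The key idea is simply moving both the model and the data to the same device with `.to(device)`; the rest of the training loop is unchanged whether it runs on a CPU or a GPU.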
For those who can stretch the budget, or who are building a product at scale, there are further options among niche processing units built for AI.
FPGA (Field-Programmable Gate Array): low-power and reprogrammable for AI workloads; Microsoft Bing uses FPGAs.
ASIC (Application-Specific Integrated Circuit): this type of chip is well suited to scaling in real-life operations. The Tensor Processing Unit (TPU) and the Neural Processing Unit (NPU) are examples of such ICs.
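To show how such accelerators surface to software, here is a small sketch assuming JAX as the framework: it asks the runtime which devices are visible (TPU cores on a TPU host, otherwise GPUs or the CPU) and runs a tiny compiled computation on the default device.

```python
import jax
import jax.numpy as jnp

# List the devices the JAX runtime can see: TPU cores on a TPU host,
# otherwise GPUs or the CPU.
print("Backend:", jax.default_backend())
print("Devices:", jax.devices())

# A tiny jitted computation; XLA compiles it for whichever device is the default.
@jax.jit
def scaled_sum(x):
    return jnp.sum(x * 2.0)

print(scaled_sum(jnp.arange(8.0)))  # 56.0
```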
We expect to see much more revolution in this area very soon and are waiting to be surprised! It’s a magical era of AI.