Building an AI factory
Our world runs on software. Software is undergoing a significant transformation, and the AI era is in its very first days. We all agree on its importance: it is a recent development in human history, yet it is fundamental to our Information Age.
Its history stretches from the 19th century to today's reality, in which our lives rely on it almost entirely. Software runs millions of transactions daily across our banking infrastructure. It manages vast amounts of real-time data and interactions among billions of users. It instantly sets the prices of the products you order.
Now, all the products and systems that revolutionized our way of living are at risk of being disrupted by AI, and that disruption can drive the explosive growth of a newcomer that isn't even all that sophisticated.
AI doesn't need to be sci-fi. We can classify it into two types:
- “Strong AI” - simulations of human reasoning, intelligence, and consciousness;
- “Weak AI” - capable of performing tasks traditionally done by people - answering questions, playing chess, etc.
Most of the software we interact with is “Weak AI,” and it already takes on many critical decisions. Think of self-driving cars, Amazon's warehouse robots, and so on. It's clear that these digital decision factories now handle some of the most critical processes and operating decisions. It's as if software is pushing humans to the edge of decision-making at companies, and it has been doing so for a while already.
How do companies go about building their AI systems?
There are four essential components to any AI system. Think of your company as a factory, with suppliers of raw materials, a production line, and distribution centers. To produce an AI algorithm capable of intelligent output, data must arrive as raw material, pass through research and experimentation, move down the production line to be transformed, and finally be distributed for use.
- Data Pipelines: Raw material (aka data) comes in its purest form. You can picture data pipelines as efficient highways of information: a sophisticated network connecting diverse data sources. Their main job? To gather, clean, combine, and protect your data, like a well-oiled machine that systematically ensures data travels safely and swiftly from point A to point B, ready for your business to use (a first sketch follows this list).
- Algorithms: Imagine them as intelligent, data-savvy guides. They sift through the information your business collects, using patterns and statistics to forecast future scenarios. These tools don't just guess; they calculate probabilities and outcomes, helping you anticipate and prepare for what's coming next in your business landscape (second sketch below).
- Experimentation: This is the testing ground for your new algorithm ideas. Here you check whether a fresh algorithm is hitting the mark: you put your hypotheses to the test, making sure the algorithm is effective at the specific goals you have in mind, so its insights and recommendations are truly beneficial to your business strategy (third sketch below).
- Infrastructure: Once the product comes off the production line, the final step is distribution. AI is no different: you need a system that connects internal and/or external users to your “AI” (final sketch below).
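
To make the pipeline idea concrete, here is a minimal sketch in Python using pandas. The file names and columns (orders.csv-style paths, `user_id`, `amount`, `signup_date`) are hypothetical; a real pipeline would add validation, scheduling, and access controls on top of this shape.

```python
import pandas as pd

def run_pipeline(orders_path: str, users_path: str) -> pd.DataFrame:
    """Gather, clean, combine, and trim two raw data sources into one table."""
    # Gather: raw material arrives from diverse sources.
    orders = pd.read_csv(orders_path)
    users = pd.read_csv(users_path)

    # Clean: drop broken rows and normalize types.
    orders = orders.dropna(subset=["user_id", "amount"])
    orders["amount"] = orders["amount"].astype(float)

    # Combine: join the sources so downstream algorithms see one view.
    combined = orders.merge(users, on="user_id", how="inner")

    # Protect: pass along only the columns the model needs, nothing sensitive.
    return combined[["user_id", "amount", "signup_date"]]
```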
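For the algorithm stage, here is a deliberately tiny predictive model using scikit-learn. The features (past spend, visit count) and labels (did the customer buy again?) are invented for illustration; the point is only that the model computes probabilities rather than guessing.

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical features (past spend, visits) and labels (bought again? 1/0).
X = [[120.0, 3], [15.5, 1], [300.0, 7], [42.0, 2]]
y = [1, 0, 1, 0]

model = LogisticRegression()
model.fit(X, y)

# The model calculates a probability for each outcome, not a blind guess.
print(model.predict_proba([[80.0, 4]]))
```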
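Experimentation, at its simplest, means evaluating candidate algorithms on data they have never seen. The sketch below uses a synthetic dataset as a stand-in for whatever your pipeline produces; real experimentation would add richer metrics, cross-validation, and often live A/B tests.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in data; in practice this comes out of your data pipeline.
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# Hold out data the candidates never saw during training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

for name, candidate in [("logistic", LogisticRegression()),
                        ("tree", DecisionTreeClassifier(random_state=0))]:
    candidate.fit(X_train, y_train)
    score = accuracy_score(y_test, candidate.predict(X_test))
    print(f"{name}: {score:.2f}")  # keep whichever actually hits the mark
```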
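Finally, infrastructure is what puts the result in front of users. The Flask endpoint below is only a sketch: the `/predict` route is hypothetical, and the averaging function is a stand-in so the service runs end to end. A production system would load a real trained model at startup and add authentication, logging, and monitoring.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    features = request.get_json()["features"]
    # A real service would call the trained model here;
    # this average is only a placeholder for demonstration.
    score = sum(features) / len(features)
    return jsonify({"score": score})

if __name__ == "__main__":
    app.run(port=8000)  # internal or external users call this over HTTP
```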
In a very simplistic way, this is like the suggestions Google shows each time you press a key while typing a search term. Each keystroke generates suggestions in the drop-down, predicting your complete search term with an algorithm that draws on your past search habits and those of users like you. With AI, algorithms can use the data collected in the past to identify patterns in current usage versus past usage and generate organic search results. Each click toward or away from a query or result, each key pressed on the keyboard, serves as raw material fed into the front of the “Google factory” to train the algorithm to serve better suggestions next time.
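To ground the example, here is a toy version of that feedback loop: past searches are the raw material, and a simple frequency count ranks suggestions for the next keystroke. The query log is invented, and Google's real system is vastly more sophisticated, but the loop has the same shape.

```python
from collections import Counter

# Invented log of past searches: the raw material fed back into the factory.
past_searches = Counter([
    "ai factory", "ai factory", "ai ethics",
    "air fryer recipes", "air fryer recipes", "air fryer recipes",
])

def suggest(prefix: str, k: int = 3) -> list[str]:
    """Rank past queries that start with what the user has typed so far."""
    matches = [(q, n) for q, n in past_searches.items() if q.startswith(prefix)]
    return [q for q, _ in sorted(matches, key=lambda m: -m[1])[:k]]

print(suggest("ai"))  # ['air fryer recipes', 'ai factory', 'ai ethics']
```

Every new search the user runs updates the log, so tomorrow's suggestions are trained on today's clicks: the factory keeps feeding itself.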