Microsoft creates deep, technical content to help developers enhance their proficiency when building solutions using the Azure AI Platform. Our preferred training partners redeliver our LearnAI Bootcamps for customers around the globe on topics including Azure Databricks, Azure Machine Learning service, Azure Search, and Cognitive Services. Umanis, a systems integrator and preferred AI training partner based in France, has been innovating in Big Data and Analytics in numerous verticals for more than 25 years and has developed an effective methodology for guiding customers into the Intelligent Cloud. Here, Philippe Harel, the AI Practice Director at Umanis, describes this methodology and shares lessons learned to empower customers to do more with data and AI.
2019 is the year when artificial intelligence (AI) and machine learning (ML) are shifting from being mere buzzwords to real-world adoption and rollouts across the enterprise. This year reminds us of the cloud adoption curve a few years ago, when it was no longer an option to stay on-premises alone, but a question of how to make the shift. As you draw up plans on how to best use AI, here are some learnings and methodologies that Umanis is following.
Given the ever-increasing speed of change in technology, along with the variety of sectors and industries Umanis works in, the company focused on building a methodology that could be standardized across AI implementations from project to project. This methodology follows an iterative cycle: assimilate, learn, and act, with the goal of adding value with each iteration.
The Azure platform acts as an enabler of this methodology as seen in the image below.
In most data and AI projects implemented at Umanis, several trends are gaining momentum and are likely to intensify in 2019:
- More unstructured, big, and real-time data.
- An increased need for fast and reliable AI solutions to scale up.
- Increasing expectations from customers.
In this blog post, we will explain how you can address these kinds of projects, and how Umanis maps their approach to the Azure offering to deliver solutions that are easy to use, operationalize, and maintain.
The 3 phases of the AI implementation methodology
Assimilate
In this initial phase, anything can come your way, from the good to the bad and the ugly: databases, text, logs, telemetry, images, videos, social networks, and more are flowing in. The challenge is to make sense of everything so that you can serve the next phase (Learn) successfully. By assimilating, we mean:
- Ingest: The performance of an algorithm depends on the quality of the data. We consider “ingesting” to be checking the quality of the data, the quality of the transmission, and building the pipelines to feed the subsequent parts.
- Store: Since the data will be used by highly demanding algorithms (I/O, processing power) that will mix data from various sources, you need to store the data in the most efficient way for future access by algorithms or data visualizations.
- Structure: Finally, you’ll need to prepare the data for consumption by algorithms and execute as many transformation, preprocessing, and cleaning tasks as you can to speed up the data scientists’ activities and algorithms.
Learn
This is the heart of any AI project: creating, deploying, and managing models.
- Create: Data scientists use available data to design algorithms, train their models, and compare the results. There are two key points to this:
- Don’t make them wait for results! Data scientists are rare resources and their time is precious.
- Allow any language or combination of languages. From that perspective, Azure Databricks is a great solution, as it addresses this natively by allowing different languages to be mixed within a single notebook.
- Use: Once algorithms are deployed as APIs and consumed, the need for parallelization goes up. Defining SLAs and testing the performance of the sending, processing, and receiving pipeline are crucial.
- Refine: Refining the quality of algorithms ensures reliable results over time. The easy part of this activity is automatic re-training on a regular basis. The less obvious part is what we call the “human in the loop” activity: in short, a Power BI report shows the results of predictions, a human re-classifies them quickly as needed, and the machine uses this human expertise to get better at its task.
Act
All of the above phases are useless unless you actually make good use of the algorithm’s added value.
- Inform: Any mistake in code, misunderstanding of requirements, or bug can be devastating, as first user impressions are crucial. Therefore, instead of a “big bang” of visualizations, start very small, iterate very quickly, and on-board a few key users to secure adoption before widening the audience.
- Connect: Systems that use the information from algorithms need to be plugged in. This is called RPA, IPA, or automation in general, and the architectures can vary greatly from project to project. Don’t overlook the need for human monitoring of this activity: consider the impact of the algorithm’s worst possible answer, and you will get a good feel for the need for human supervision.
- Dialog: When dealing with human interaction, so much comes into play that, to be successful, the scope of the interaction needs to be narrowed down to actions that really add value and are not trivial. (This is not easily possible via classic interfaces.)
This methodology will certainly change and adapt over time. Nevertheless, Umanis has found it to be a robust way of rolling out end-to-end data and AI projects while minimizing friction and risk. By using this approach to present a data and AI project to both customers and internal teams, everyone can get a good sense of the activities, technologies, and challenges involved. It’s one way to address the “urgent need to build shared context, trust, and credibility with your team,” as Satya Nadella puts it in his book, Hit Refresh, and a great way to build trust in your relationships.
Learn more about the Azure Machine Learning service
Get started with a free trial of Azure Machine Learning service
Source: Azure Blog Feed