This article reviews the new book, The AI Playbook, by my colleague here at The Machine Learning Times, Executive Editor Eric Siegel.
Free book: Come to Machine Learning Week – June 4-7, 2024 in Phoenix, AZ – to meet author Eric Siegel, the event founder, and receive a complimentary copy.
–Luba Gloukhova, Editor-in-Chief
Eric Siegel’s “The AI Playbook” serves as a crucial guide, offering insights for data professionals and their internal customers on effectively leveraging AI within business operations. The book, which comes out on February 6th, captures its insights in six steps:
— Determine the value
— Establish a prediction goal
— Establish evaluation metrics
— Prepare the data
— Train the model
— Deploy the model
In this article, I’ll unpack Eric Siegel’s recommendations for managing data projects. While seasoned data practitioners may find many of them familiar, they serve as valuable anchors for best practices and should help the next generation, including those who are experimenting with generative AI, understand the lessons learned by vanguard CIOs, CDOs, and data practitioners.
Determine the Value
Siegel emphasizes in the book the need to define a clear value proposition for data initiatives and products. This means determining upfront what the data combined with data models will predict and how these will specifically enhance business operations and customer offerings. Concrete objectives, such as reducing customer churn or cutting manufacturing costs by specific percentages, are vital. Furthermore, organizations must outline the strategies for achieving these business outcomes.
Establish the Prediction Goal
For data practitioners, establishing a detailed prediction goal is crucial: it marks the confluence of technology and business strategy. Siegel believes this goal must be specified in detail, and doing it well demands collaboration and input from business leaders. The aim is to translate business objectives into clear technical actions, because effective data initiatives sit at the intersection of technology and business.
It’s also essential, Siegel argues, to set realistic expectations for precision. This involves determining what is predicted and what is to be done with the prediction. Data practitioners must enlist business leaders to weigh in; done well, this exercise translates business intent into well-defined requirements for technical execution.
Establish Evaluation Metrics
Once what machine learning will predict is defined, data practitioners should shift their focus to the quality of a machine learning model’s predictions. Benchmarking a model’s effectiveness doesn’t require understanding its mechanics. Siegel recommends measuring performance with lift (a factor quantifying how much a model’s predictions improve over random guessing) or cost (which weighs the impact of false positives and false negatives), rather than accuracy alone, which is simply the rate of correct predictions.
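Lift can be made concrete with a small sketch. The function name, the toy data, and the top-20%-of-leads framing below are my own illustrative assumptions, not examples from the book; the idea is simply that lift compares the positive rate among the model’s highest-scored cases to the overall positive rate.

```python
import numpy as np

def lift(y_true, y_score, top_fraction=0.1):
    """How much better the model's top-ranked cases are than random guessing.

    Compares the positive rate among the top `top_fraction` of cases
    (ranked by predicted score) against the overall positive rate.
    """
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    n_top = max(1, int(len(y_true) * top_fraction))
    top_idx = np.argsort(y_score)[::-1][:n_top]   # highest-scored cases first
    top_rate = y_true[top_idx].mean()             # positive rate in the top slice
    base_rate = y_true.mean()                     # positive rate overall
    return top_rate / base_rate

# Toy example: the model ranks the actual positives near the top.
y_true  = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
y_score = [0.9, 0.8, 0.7, 0.4, 0.3, 0.3, 0.2, 0.2, 0.1, 0.1]
print(lift(y_true, y_score, top_fraction=0.2))
```

Here both positives land in the model’s top 20%, so the top slice converts at five times the base rate: a lift of 5.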
In the pre-big data era, decisions were often made based on gut feeling or on simple linear extrapolation from backward-facing data. With more, and more current, data available, lift makes sense for comparison since it measures the improvement a model provides over random guessing. Metrics are crucial for assessing both the training and operational phases of a model. The value of an imperfect prediction lies in its utility, such as predicting customer behavior in marketing campaigns. Here Siegel shows a profitability curve illustrating that neither contacting every lead nor contacting none is optimal; instead, it’s about finding the profitable middle ground.
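The profitability-curve idea can be sketched numerically. The campaign economics below (revenue per conversion, cost per contact) are illustrative numbers I have assumed, not figures from the book; the point is only that cumulative profit peaks somewhere between contacting no one and contacting everyone.

```python
import numpy as np

# Hypothetical campaign economics (illustrative assumptions, not from the book):
REVENUE_PER_CONVERSION = 50.0   # profit when a contacted lead converts
COST_PER_CONTACT = 2.0          # cost of contacting one lead

def profit_curve(y_true, y_score):
    """Cumulative profit as leads are contacted in descending score order."""
    order = np.argsort(np.asarray(y_score))[::-1]
    conversions = np.cumsum(np.asarray(y_true)[order])
    contacts = np.arange(1, len(order) + 1)
    return conversions * REVENUE_PER_CONVERSION - contacts * COST_PER_CONTACT

# Toy data: real buyers ranked mostly (not perfectly) near the top.
y_true  = [1, 1, 0, 1, 0, 0, 0, 0, 0, 0]
y_score = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1, 0.05]
profits = profit_curve(y_true, y_score)
best = int(np.argmax(profits)) + 1
print(f"Contact the top {best} of {len(y_true)} leads; profit = {profits[best-1]:.0f}")
```

In this toy run, profit peaks after contacting the top four leads; contacting everyone erodes the peak with wasted contact costs, which is exactly the “profitable middle ground” the curve illustrates.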
Prepare the Data
Next, Siegel emphasizes the primacy of data over algorithms, highlighting that the main return on investment comes from data. It’s long been understood that data is the cornerstone of predictive strength. Siegel shares an important point: existing data wasn’t created for machine learning or generative AI, which is why data preparation is so often overlooked and undervalued. Building on this, there’s a call for organizations to resolve data issues and undertake what Stephanie Woerner at MIT-CISR refers to as “industrialization” of data. Firms that achieve this “combine data collected from customer interactions and elsewhere to become a single source of truth that anyone with permission in the firm can use for decision making.”
Siegel underscores the importance of data in driving machine learning outcomes, indicating that most of a project’s effort lies in data preparation, typically the realm of data engineers. According to Jennifer Redmon, “New data science graduates have a false sense of security that the data they receive will be sound.” Effective data prep aims to create comprehensive datasets that are extensive (long), encompassing a wide array of representative scenarios; detailed (wide), providing rich information across variables; and well-organized (labeled), with known outcomes the model can learn from.
Train the Model
Siegel highlights the importance of validating a model’s sensibility and debugging it before data practitioners and business users draw lessons from the data. Here, it is critical to determine what you can learn from the data: does it produce unexpected outcomes? At the same time, Siegel aims to demystify data science for novices, making it accessible to business users. However, the book could be clearer on how traditional data science models differ from generative AI models, which carry distinct considerations and applications.
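One common way to validate a model’s basic sensibility, sketched below with synthetic data of my own invention (not an example from the book), is to confirm on held-out data that it beats a naive majority-class baseline. Failing that check usually signals a bug, leakage, or an uninformative feature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: one informative feature plus noise (illustrative assumption).
n = 400
x = rng.normal(size=n)
y = (x + rng.normal(scale=0.3, size=n) > 0).astype(int)

# Hold out data before looking at outcomes, so the check is honest.
train, test = slice(0, 300), slice(300, None)

# Trivial "model": predict 1 when the feature exceeds the training-set mean.
threshold = x[train].mean()
pred = (x[test] > threshold).astype(int)

model_acc = (pred == y[test]).mean()
baseline_acc = max(y[test].mean(), 1 - y[test].mean())  # majority-class guess

print(f"model={model_acc:.2f}  baseline={baseline_acc:.2f}")
assert model_acc > baseline_acc, "model fails to beat naive guessing: investigate"
```

Even this crude threshold model should clear the baseline comfortably on data with real signal; a sophisticated model that cannot is producing exactly the kind of “unexpected outcome” worth debugging before deployment.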
Deploy the Model
Siegel emphasizes that deploying a model, moving it from development to practical use, demands organizational commitment, involving not just executives but also operational staff. As the CIOs I know would insist, data leaders must ensure staff are prepared for the changes that come with new digital processes. Resistance to change is a significant hurdle; thus, altering legacy processes is crucial. As noted by the CIO of the American Cancer Society, buy-in from all levels, especially those directly affected, is essential. At the end of the day, digital transformation changes what someone does or doesn’t do in their day. Deployment, if well-planned and executed, can be the most substantial phase of a data project, altering daily routines and workflows.
Parting Words
Siegel’s book serves as a reminder of the key stages in a data-centric digital transformation. Every successful project should go through the steps suggested by Siegel. While initial planning is vital, engaging mid-level stakeholders is equally important to ensure buy-in throughout the organization. The example of Kodak illustrates the risks of resistance from middle management, even when a company holds valuable patents, as in the case of digital cameras. For data practitioners, adhering to Siegel’s outlined steps is advisable for successful data transformation. For those who want to hear Siegel speak, he will be a keynote speaker at #CIOChat Live 2024!
The Machine Learning Times © 2020 • 1221 State Street • Suite 12, 91940 •
Santa Barbara, CA 93190
Produced by: Rising Media & Prediction Impact