Why Analytics Projects Fail – 5 Reasons

With the news full of successes in analytics, machine learning, and artificial intelligence, it is easy to lose sight of the high failure rate of analytics projects. McKinsey recently reported that only 8% of large companies (revenue > $1 billion) have successfully scaled and integrated analytics throughout the organization. In some ways, the very visible successes of analytics and data science contribute to the high failure rate, as ill-prepared organizations flock to implement projects of their own. The reasons for failure vary, and all are instructive.

A respected voice in data science, John Elder, shared his experience a number of years ago in a popular lecture called “The Top Ten Mistakes in Data Mining.” When it first appeared, technical statistical errors ranked high on the list: overfitting the data used to train a model, ignoring outliers, failing to try multiple methods, and letting information from the future leak into model training. As analytics projects have become widespread across many kinds of organizations, attention has shifted to the organizational climate surrounding their implementation.

 

Here are five principal reasons analytics projects fail:

1. Shiny toy syndrome: The “shiny toy syndrome” occurs when top leadership sees other organizations implementing analytics and data science projects and wants a project of its own. This leads to a number of problems:

  • Top management’s attraction is to the shininess of the toy, not to a broader analytics strategy.

  • A clear-cut problem to solve, and a corresponding analytics objective, are typically missing.

  • The rest of the organization sees that top management’s attention is shallow, and not likely to last long enough to make sustained coordination and effort worthwhile.

 

2. Focus on vendors’ tools – the “plug-and-play” illusion: Analytics is known to require sophisticated software and algorithms, so vendors with these tools find a ready reception. The problems?

  • The software vendor is not usually in a position to define the organizational problem.

  • The result is a tool in search of a problem.

  • A further outcome is disillusionment with analytics, since no organizational problems were solved and the organization is left with an expensive but unused tool.

3. Rely solely on a specialized team of super-experts: It is well known that successful deployment of analytics requires expertise in the disciplines that make up data science: statistics, computer science, and IT. The problem comes when a company assembles such a team but fails to integrate it with the rest of the organization.

  • Strategic direction from the top will be lacking, and there is no assurance that important problems will be worked on.

  • The team’s elegant and technically advanced models do no good if they do not deliver, in concert with the rest of the organization, solutions to the problems that actually need solving.

 

4. Fail to spread analytics education broadly: If analytics is to escape its silo and become a strategic focus of the organization, some basic analytics education must be spread broadly throughout the organization. If the various functional units do not know what analytics and data science can do, it is difficult to get everybody pulling in the same direction.

  • Strategic leaders, not really understanding analytic concepts and details, will fail to identify and frame useful problems.

  • Data gatekeepers may impede access to data.

  • Useful knowledge about the data from functional and domain experts may remain unexplored.

  • Those at the deployment end may not trust the analytics and may carry on with the old ways.

5. Technical reasons: The technical errors that John Elder identified over a decade ago in his “Top Ten Mistakes” lecture remain common. One is overfitting the data, or “fitting the noise, not the signal.” A related error is running huge numbers of model variations and believing the best one. In both cases, random chance is being misinterpreted as something interesting (a common phenomenon: people readily latch on to seemingly meaningful chance patterns); the first sketch after the list below makes this concrete. Other technical errors can be mitigated by avoiding the first four organizational failures described above. For example:

  • Mistaken inclusion of future information in training a prototype model (illustrated in the second sketch below) can be avoided if domain experts are better trained in analytics and included in the process.

  • Unwarranted extrapolation beyond the range of the data used to build a model can be avoided in the same way.

  • Technical experts often turn a poorly specified scenario into a problem they can solve, and yet the solution has little meaningful utility; this can be avoided if top decision-makers who set strategy are better versed in analytics.
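
To make the “fitting the noise” and “believing the best model” mistakes concrete, here is a minimal, self-contained sketch (the sample sizes, number of candidates, and data are all invented for illustration). It generates purely random features and a purely random target, screens hundreds of candidates on the training data, and keeps the single “best” one; the winner looks promising on the data used to select it, but a held-out set shows it was nothing but chance.

    import numpy as np

    rng = np.random.default_rng(0)
    n_train, n_test, n_candidates = 100, 100, 500

    # Purely random data: there is NO real relationship to discover.
    X_train = rng.normal(size=(n_train, n_candidates))
    X_test = rng.normal(size=(n_test, n_candidates))
    y_train = rng.normal(size=n_train)
    y_test = rng.normal(size=n_test)

    # "Run huge numbers of model variations and believe the best model":
    # score every candidate feature by its correlation with the target
    # on the training data, then keep the single strongest one.
    train_corrs = np.array([np.corrcoef(X_train[:, j], y_train)[0, 1]
                            for j in range(n_candidates)])
    best = int(np.argmax(np.abs(train_corrs)))

    # The winner looks meaningful on the data used to select it...
    print(f"best of {n_candidates} candidates, training correlation: {train_corrs[best]:+.2f}")

    # ...but held-out data exposes it as noise.
    test_corr = np.corrcoef(X_test[:, best], y_test)[0, 1]
    print(f"same candidate, held-out correlation: {test_corr:+.2f}")

On a typical run the selected candidate shows a training correlation of roughly ±0.3 while its held-out correlation sits near zero; honest out-of-sample evaluation, via a held-out test set or cross-validation, is the standard defense.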
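
As one illustration of the “leaks from the future” mistake, here is a second small sketch, again with made-up data and invented field names (for example, a days_since_closure field that is only recorded after a customer has already churned). A feature created downstream of the outcome predicts it perfectly in testing and not at all in production, because it will not exist at prediction time.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000

    # Hypothetical churn data: no legitimate feature here carries signal.
    churned = rng.random(n) < 0.2
    usage = rng.normal(size=n)  # legitimate feature (pure noise in this toy example)

    # Leaky feature: "days since the account was closed" is only recorded
    # AFTER a customer churns, so it quietly encodes the answer.
    days_since_closure = np.where(churned, rng.integers(1, 90, size=n), 0)

    def rule_accuracy(feature, threshold, y):
        """Accuracy of the one-split rule `feature > threshold`."""
        return np.mean((feature > threshold) == y)

    print("legitimate feature:", rule_accuracy(usage, 0.0, churned))             # about 0.5
    print("leaky feature:     ", rule_accuracy(days_since_closure, 0, churned))  # 1.0

The leaky rule scores a perfect 1.0 in this toy setup, which is exactly the kind of too-good-to-be-true result that domain experts embedded in the project are well placed to catch.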
