The Future of Imaging Part 5 - The Services and MPS Era

Virtulytix Thought Leadership Blog Series – Post 5

From Typewriters to the Data Era .. The Services & Commoditization Era

 Ed Crowley, Virtulytix, Chief Thought Leader (CTO)

In exploring the past – and future – of the imaging industry, we have looked at the transition from scribes to mechanical devices (typewriters), to the impact era, and most recently to the ‘heyday’ of office imaging products, the page era. Through much of the industry’s history, progress has been driven by tangible products and by the technology innovations within them. It has been an amazing history of innovation, with price/performance improving at fantastic rates and features being incorporated at little or no price premium (remember when duplex cost extra?). Of course, the evolution continues.

At the same time, the office environment has always been subject to non-technology forces such as demographic shifts, the adoption of new business models (such as industrialized manufacturing), and general economic trends. During the most recent era of services and commoditization, changes in the industry were heavily influenced by all of these factors.

The economic recession of 2008 drove companies to focus on containing the cost of office equipment, accelerating demand for Managed Print Services (MPS) and the appeal of its much-touted potential to save up to 30% of a firm’s office printing and copying costs. The shift of the workforce from baby boomer to millennial dominance is changing the demand for print: millennials are the first generation to have had the internet available from adolescence forward (in essence, to be ‘raised on the internet’), and they are now becoming the largest portion of the workforce. Lastly, the shift from ‘transactional ownership’ to ‘pay for use’ is becoming a dominant business model trend.

All of these forces combined to drive two dominant themes during the last era: commoditization and servitization. Customers perceive less and less differentiation between office imaging products as they converge on similar feature sets and similar price/performance points. In most cases, products have also far surpassed usage requirements. How often do you see a printer or MFD sitting in an office printing 2,000 or 3,000 pages a month when it is rated as a 50 ppm or greater device? Think about it. A 50-ppm device is capable of printing 480,000 pages a month if it prints 8 hours a day, 5 days a week, for a full 4 weeks. Data from our Page Volume Index (based on actual operating data for over 20,000 devices spread across a large number of end customers) tells us that most are operating below a 3,000-page-a-month average. That 3,000 pages per month represents less than 1% of the capacity of a 50-ppm printer. You might say that’s an unfair comparison – 50 ppm is not the typical office device. Okay – how about 20 ppm? That’s a pretty common performance point, and it equates to 192,000 pages per month at 100% capacity. Our 3,000 pages per month still represents well under 2% of capacity.
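For the arithmetic-minded, the utilization math is simple enough to sketch in a few lines of Python (the device speeds and the 3,000-page monthly volume are the figures quoted above; the business-hours schedule is the same assumption):

```python
# Back-of-the-envelope device utilization, using the assumptions above:
# 8 hours a day, 5 days a week, 4 weeks a month, printing flat out.
def monthly_capacity(ppm, hours_per_day=8, days_per_week=5, weeks=4):
    """Pages a device could print at its rated speed during business hours."""
    return ppm * 60 * hours_per_day * days_per_week * weeks

observed_pages = 3000  # typical monthly volume from the Page Volume Index
for ppm in (50, 20):
    capacity = monthly_capacity(ppm)
    print(f"{ppm} ppm: {capacity:,} pages/month capacity, "
          f"{observed_pages / capacity:.1%} utilization")
# 50 ppm: 480,000 pages/month capacity, 0.6% utilization
# 20 ppm: 192,000 pages/month capacity, 1.6% utilization
```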

The point is, most products offer far more performance and, in most cases, more capabilities than any customer ever uses. Customers are beginning to realize this. As a result, the plethora of features thrown at customers is met with a steady yawn. As someone who spent over 20 years as an office products product manager at major manufacturers, this fact of life really hurts. But it is reality, and it is reflected in the steady decline of hardware prices as vendors increasingly rely on price leadership to win customers. This is what drives commoditization, and the reality is that it is not only here to stay but will only get worse – relying on product feature differentiation is a losing game.

At the same time, vendors are attempting to move to services to add value and to drive revenue from usage rather than from a transactional sales model. This has been achieved with varying degrees of success by different resellers and manufacturers. Lexmark attempted to shift to a services-led model with a large software-based value-add component but was unable to successfully merge its software and hardware business models. Xerox has led in services for years (one could argue it began when they introduced the cost-per-copy concept in the '60s) but has been struggling in recent years to show results that keep its investor base happy. Some resellers, such as M2 in Europe and EO Johnson in the USA, have made impressive strides in offering services as a core business model for engaging customers. However, a large portion of the market still struggles to move beyond a cost-per-copy arrangement to true management of the customer’s fleet, a core concept of MPS.

 And this leads us to where the industry is evolving now – the data era.  This is where things really start to get interesting. In our upcoming webinar, I will be discussing what this next era looks like, how things are changing, and "what's next".  So, click here (LINK) to sign up for our webinar on Thursday, March 28 at 10:00 am EDT.

 I encourage you to share your thoughts and opinions on the industry’s journey through the Virtulytix website blog, the Virtulytix LinkedIn site or the LinkedIn discussion group at Imaging Industry Transformation.


The Future of Imaging Part 4 - The Page Era

Virtulytix Thought Leadership Blog Series – Post 4

From Typewriters to the Data Era .. The Page Era

Ed Crowley, Virtulytix, Chief Thought Leader (CTO)

In 1984 I was graduating from college (yes – that makes me OLD!). In that same year, IBM sold 2 million PCs, Apple introduced the 128K Mac, and the total PC market (including Atari/Commodore) had grown to a massive 6 million units! The office technology market for PCs was just beginning to take off (for comparison, ten years later PCs were selling at a rate of 40 million units per year). In 1984 and the following year, 1985, two key office imaging products were introduced that would set the stage for the next 20 years of massive growth: the HP LaserJet and the Apple LaserWriter.

In 1984, when the LaserJet was introduced, typewriters and dot matrix printers dominated the office. Within a very short period of time, however, both of these technologies would fade into the background as ‘old technologies’ with limited relevance in the new office market. The LaserJet and LaserWriter provided high-quality and (relatively) affordable printing, and this set the stage for what I would consider the ‘golden era’ of the imaging market. These new page printers (called that because they imaged an entire page at one time) rapidly evolved to include multiple resident typefaces (anybody remember the old font cartridges?), more finishing options, and ultimately even scanning and faxing. And it was ‘personal’: as prices fell, it became reasonable to put a laser printer on an individual’s desk. Printers proliferated.

By the early ’90s, photocopiers began to add networking and digital controllers, setting the stage for the real estate grab between copier and printer companies and leading to the ultimate convergence of these technologies, where the only differences between ‘copier’ and ‘printer’ were A3 vs. A4 and the business model used to sell and service the products. From a customer’s perspective, the products were increasingly similar.

The growth of this market saw the rise of new leaders (HP and Lexmark come immediately to mind) that fought for control of the distributed office imaging market. Competitors who dominated the dot matrix era (Toshiba and Epson, for example) struggled to gain traction with electrophotographic technologies. By the late ’90s, inkjets had become increasingly capable and were moving from the consumer market into the office market. HP, Canon, and Epson fought for control of this market as well, along with Brother and Lexmark (although Lexmark’s business inkjets failed to gain significant traction).

The real story of this era was price/performance evolution. One way to measure price/performance is to take the street price of a printer and divide it by its pages-per-minute (PPM) speed. This gives you the cost per PPM for the device, makes it a little easier to compare products across different speed ranges, and, very importantly, gives you a way to see how price/performance changes over time. As the graph below shows, when the HP LaserJet was introduced, the price was $374 per PPM. Nine years later, when the TI MicroLaser was introduced, the price per PPM had fallen to $77. Now, in 2019, the price per PPM for an HP PageWide Pro 577 has fallen to $9.99. Granted, the PageWide Pro is an inkjet, not a laser – but the point is, for office imaging products, the name of the game has been better, faster, cheaper. (Can you say “race to the bottom”?)
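The metric itself is trivial to compute. A minimal sketch in Python (the $499, 50-ppm device is hypothetical, chosen to land near the PageWide figure quoted above):

```python
def price_per_ppm(street_price_usd, rated_ppm):
    """Street price divided by rated speed in pages per minute."""
    return street_price_usd / rated_ppm

# Hypothetical device: $499 street price at a rated 50 ppm
print(f"${price_per_ppm(499, 50):.2f} per PPM")  # -> $9.98 per PPM
```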

In the race for more ‘footprints’, or ‘machines in field’ (MIF), the industry actually began selling color printers at a hardware loss in the early 2000s, in anticipation that the profitability of color supplies would more than make up the difference. The stage was set for the next era, the era of commoditization and services, as the industry began to look for long-term, viable business models that didn’t rely on simply lowering prices and offering more features or performance.

 

At the conclusion of this series of five blogs (each blog will cover a distinct era), I will be holding a complimentary webinar on how to thrive in the new Data Era, as well as providing access to a free white paper covering this critical topic. I encourage you to share your thoughts and opinions on the industry’s journey on this blog, on the Virtulytix LinkedIn site, or in the LinkedIn discussion group Imaging Industry Transformation.

Pages Per Minute Comparison

The Future of Imaging Part 3 - The Impact Era

Ed Crowley, Virtulytix, Chief Thought Leader (CTO)

In the prior two blogs I sped through over 300 years of technology evolution, from scribes and clerks through the typewriter era. While I remember (not too fondly) typing papers for college on my Brother typewriter, by my graduation in 1984, with the advent of the IBM PC and word processing software, high-speed 24-pin dot matrix printers were becoming standard office tools and displacing typewriters. As early as the 1960s, impact printers were pervasive in large computing centers. A number of technologies were available, including chain (IBM), drum (DataProducts), and line matrix (Printronix). These were the printers that used tractor-fed ‘greenbar’ paper (if you know what that is, you are already dating yourself). Their little cousin, the desktop dot matrix printer, didn’t really rise to prominence until the late ’70s and early ’80s.

As with the major office technology transitions before impact printers, the print technology was not necessarily the ‘driver’ of the transformation. Rather, demographic changes (led by a highly educated baby boomer workforce), computing changes (the advent of real desktop PCs such as the IBM PC and Apple), and software changes (for example WordPerfect – remember those fun <B> codes for formatting documents?) created a need for low-cost, digitally driven desktop printing devices. Hence the rapid rise of the dot matrix printer.

However, compared to the scribe era and the pre-electronic/typewriter era, the dot matrix era was actually rather short. The scribe era could arguably be defined as 300 years long. The pre-electronic/typewriter era lasted 124 years. The dot matrix era lasted 25 years. The rate of change was clearly accelerating and, as we have seen, continues to accelerate today.

The change between eras became more dramatic and disruptive with each succeeding era. The overlap between the scribe and pre-electronic/typewriter eras was lengthy – arguably at least 50 to 75 years. The overlap between when the impact era started and the pre-electronic/typewriter era ended was 35 years. However, the overlap between the impact era and the page era (and by the end of an era I mean when its technology shifted from leading to rapidly declining) was only a few years. The first real desktop laser printer was the HP LaserJet, introduced in 1984 – the same year that the leading printer product in terms of unit volume was a Toshiba 24-pin dot matrix printer. Yet within two short years, dot matrix printer volumes were in free fall as laser technology became the standard.

Shorter life cycles for technology eras, less overlap between eras, and rapid demographic and complementary office technology changes have been the overriding trends in the office as we journey from the scribes of 300 years ago to the impact era of 25 years ago. And for the most part, with each transition between eras, the leaders in the office products space have changed. The dominant names in typewriters (IBM, Smith Corona) were not the leaders in the impact era. The leaders in the impact era (Toshiba, Centronics, and DataProducts) were not the leaders in the next era, the page era. The moral of the story: when companies become more focused on protecting their existing technology and market leadership position than on understanding the external market forces and dynamics shaping the office market, they are vulnerable to rapid and significant technological change.

So what is your company’s position? Are you protecting your existing ‘piece of the pie’? Or are you focused on what is happening externally to your firm and even to the industry? In the next blog I will discuss the Page Era, arguably the ‘golden age’ of the office imaging market when vendors were able to practically print money due to the rapid growth and high profitability of laser and inkjet printers.

At the conclusion of this series of five blogs (each blog will cover a distinct era), I will be holding a complimentary webinar on how to thrive in the new Data Era, as well as providing access to a free white paper covering this critical topic. I encourage you to share your thoughts and opinions on the industry’s journey on this blog, on the Virtulytix LinkedIn site, or in the LinkedIn discussion group Imaging Industry Transformation.

The Future of Imaging.... Part 2

From Typewriters to the Data Era .. The Typewriter Era

Ed Crowley, Virtulytix, Chief Thought Leader (CTO) 

Have you ever really thought about how disruptive typewriters were? Can you imagine a time when clerks literally handwrote letters and kept handwritten entries in accounting journals? And this isn’t the dark ages we are talking about. Right up until the 1860s, this was the norm.

The scribe era was dominated by a relatively small educated class that could read and write. Illiteracy rates were as high as 90% in the 1500s, falling to 60-70% in the early 1700s. It wasn’t until the late 1700s that literacy rates began increasing significantly as education improved and the industrial era began. During that time, society began its first major shift away from a trades-based and agrarian model toward the industrial society we know today. These socio-economic changes set the stage for the dramatic, technology-driven shift that we call the typewriter era.

Christopher Latham Sholes introduced the first commercially successful typewriter in 1867. Suddenly, the legibility of a letter wasn’t dependent on penmanship. Every letter was consistent in quality, regardless of the individual skill of the operator. What a huge change.

At almost the same time, the peak of industrialization and the rise of education levels drove an increasing need for, and availability of, clerical workers along with standardized communication. The typewriter era brought a way for these clerical workers to create business correspondence in a standardized way faster and more accurately than ever before. The beginnings of the modern office were born.

Sholes Typewriter

In 1961, IBM further revolutionized the office environment by introducing the Selectric electric typewriter, with innovations such as the interchangeable ‘golf ball’ type element and, later, proportional spacing. By 1983, typewriters had enjoyed a long and successful run of well over a century, and IBM was the leader in the market with the number one share position in the USA. Dataquest (a leading market forecaster of the time) was predicting the typewriter market would more than double by 1987. This forecast was made despite the emerging PC market and low-cost dot matrix printers, and it was ultimately proven wrong. By 1987 the typewriter market was in free fall due to the increasing use of dedicated word processors from companies like Wang and Lanier, along with the growing use of desktop personal computers (PCs) from IBM and Compaq. In 1991 IBM divested its typewriter division in a spin-off that would ultimately become Lexmark (which was acquired roughly 25 years later by the Chinese firm Apex/Ninestar).

The typewriter era had an incredible run. It created and then changed the modern office. However, it was ultimately displaced by new, emerging technologies. And no one saw it coming. This was the end of what we call the first modern office era - the Typewriter era. It lasted 124 years compared to over 300 years for the scribe era.

In this transition we see a foreshadowing of the increasing rate of change and disruption that would play out over the next thirty-five years, enabled by technology advances and changing socio-economic trends. In the next blog I will walk us through the ‘Impact Era’, which overlapped with the typewriter era but had a relatively short peak as the market rapidly shifted to page printers.

At the conclusion of this series of five blogs (each blog will cover a distinct era), I will be holding a complimentary webinar on how to thrive in the new Data Era, as well as providing access to a free white paper covering this critical topic. I encourage you to share your thoughts and opinions on the industry’s journey on this blog, on the Virtulytix LinkedIn site, or in the LinkedIn discussion group Imaging Industry Transformation.

What Xerox Should Really Be Worried About

Ed Crowley – Chief Thought Leader, Virtulytix

Right now, there is a lot of discussion about what is going to happen with Xerox.  How will the new management team achieve their objectives of expanding margins, and growing EPS by at least 7% annually in order to return 50% of free cash flow back to shareholders as outlined in their Investor Day meeting? Will they survive the fight with Fuji? How much cost can they shave by cutting sales and marketing expenses?  I think this is missing the real point.

 

A significant shift that has been underway for years is now really gaining steam, and nobody seems to be noticing as two key market leaders ‘win’ in the office. First, HP has now shipped over 1,500,000 inkjet printers based on its PageWide technology. That’s pretty amazing for a product line introduced a little over four years ago. Meanwhile, the installed base of Segment 3 continuous ink supply system (CISS) devices has already exceeded 3,500,000, with Epson owning 70% market share (details available in our 2018 hardware forecast). By the way, these are not just low-cost CISS devices for emerging markets. These are high-performance MFPs designed to displace many existing office A3/A4 color MFPs.

 

We have been talking about the coming shift to inkjet in the office, but we were wrong. It’s not coming – it’s here, or at least well underway! Fundamentally, inkjet has a cost advantage over laser, and this is manifesting itself in a clear price/performance advantage. So why should Xerox (or Ricoh, or Konica Minolta) be scared? Simply because they do not have this technology. Okay, Ricoh has its GelJet and Xerox has solid ink technology, but neither is really competitive against the PageWide and Epson EcoTank products.

Today’s office market isn’t growing. It’s all about making more profit from your piece of the pie (through increased efficiency and effectiveness) or getting a bigger share of the pie, which is exactly what HP and Epson are doing – and that share can only come at others’ expense. We made the argument several years ago that ‘laser bias’ in the office was dead. It is just taking some laser engine manufacturers longer than others to figure that out!

This is all a part of the increasing commoditization in the industry as we shift to the next era in the office imaging industry which we call the “Data Era”.  Learn more about this dynamic in our blog series and upcoming webinar on the future of Imaging.

The Future of Imaging... Part 1

From Typewriters to the Data Era .. The Beginning

 Ed Crowley, Virtulytix, Chief Thought Leader (CTO)

The world of imaging in the office is changing at an accelerating pace. Across the five previous major eras of office imaging, each successive era has become shorter, with increasing technological, economic, and social change. The pre-electronic era, when typewriters ruled, lasted 124 years. The last era, the services and commodity products era, lasted twelve years. We propose that we are entering the next era, the data era, where the focus shifts from technology, services, and products to using data to improve business processes, increase profits, and ensure the viability of the business. This new era has dramatic implications for who is at work, how they will work, and what their workplace is. Led by a new generation of workers, the millennials, it will be fundamentally different from the eras that preceded it in almost every respect.

In this series of blogs, I will trace the history of office imaging and discuss the major technologies, user dynamics, and even economic drivers associated with each transition by breaking the industry’s evolution down into six distinct eras. This is not meant to be the definitive document about our industry, but rather a way to drive discussion and dialogue about where our industry is heading (and what it means for your business). As such, I encourage your comments, disagreement, acknowledgement, and/or debate – just be sure to engage.

At the conclusion of this series of five blogs (with one blog for each era), I will be holding a complimentary webinar on how to thrive in the new Data Era, as well as providing access to a free white paper covering this critical topic. So please join me on this journey and share your thoughts and opinions in our LinkedIn discussion group, Imaging Industry Transformation.


Python or Java for ETL?

by Allison Zhang, Data Engineer, Virtulytix

In my prior blog, we discussed the differences between Python and R for data scientists. But before data scientists can analyze the data, one more important process has to happen: ETL (Extract, Transform, and Load).

 

So which language should you choose for ETL – Python or Java? As always, the answer is: it depends.

 

From a novice’s view, Python is easier than Java. If this is your first dive into the world of data or programming, Python can give you a quick introduction to key ETL concepts. Python is one of the most readable languages available; its simple syntax is approachable for everyone from experts to novice programmers, letting you focus on making the program produce the output you want. This simplicity has made Python extremely popular in both the business world and academia. A survey from the Association for Computing Machinery (ACM) found that Python has surpassed Java as the most popular language for introducing students to programming.
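To make that concrete, here is a minimal sketch of the extract-transform-load pattern in Python, using only the standard library; the file name and columns (meter_reads.csv, device_id, pages) are invented for illustration:

```python
# Minimal ETL sketch: extract rows from a CSV, transform them, load into SQLite.
import csv
import sqlite3

def extract(path):
    """Extract: stream rows out of a CSV file as dictionaries."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Transform: drop rows without a device id, coerce page counts to int."""
    for row in rows:
        if row.get("device_id"):
            yield (row["device_id"], int(row.get("pages") or 0))

def load(records, db_path="pages.db"):
    """Load: write the cleaned records into a SQLite table."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS page_counts (device_id TEXT, pages INTEGER)")
    con.executemany("INSERT INTO page_counts VALUES (?, ?)", records)
    con.commit()
    con.close()

load(transform(extract("meter_reads.csv")))
```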

Once you understand basic programming concepts, it is time to consider other, more complex languages. This does not mean Python cannot handle advanced work – on the contrary, Python can tackle complex projects too. But more factors need to be considered at this stage.

Python is flexible. It is dynamically typed, which means Python performs type checking at runtime: you do not need to declare a variable’s type when creating it. Java, on the other hand, is statically typed: variables must be declared with a type before a value can be assigned. Python’s flexibility can save time and code when writing scripts, but it can cause issues at runtime, and the runtime type checking makes it slower than Java. Java is strict. Strict typing makes it easier to provide autocompletion, and the compiler can prevent you from mixing different kinds of data together. This is very helpful in the data engineering field: when a program consists of hundreds of files, it is easy to get confused and make mistakes, and the more checks we have on our programs, the better off we are.
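A small Python example illustrates the trade-off (the monthly_total function is invented for illustration):

```python
# Dynamic typing: a name can be rebound to any type, and mistakes
# only surface when the offending line actually runs.
pages = 3000       # an int...
pages = "3,000"    # ...now a str, and nothing objects

def monthly_total(daily_counts):
    return sum(daily_counts)

print(monthly_total([150, 200, 175]))  # works: 525

try:
    monthly_total(["150", "200"])      # same call shape, wrong types
except TypeError as err:
    print(f"Only caught at runtime: {err}")

# The equivalent Java would not compile: a List<Integer> parameter
# cannot receive strings, so the mistake is caught before the program runs.
```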


When it comes to performance, Java is the better choice. Java is more efficient in execution speed thanks to its compiler optimizations and virtual machine execution, while Python’s flexibility slows it down. Java’s long history in the enterprise and its slightly more verbose coding style also mean that Java legacy systems are typically larger and more numerous than Python’s. The “write once, run anywhere” design philosophy adopted by Java makes it unique, and it is extremely scalable, making it a go-to choice for enterprise-level development. For large-scale data, Java is generally the better choice – faster and more efficient. Many of the Apache big-data projects, such as Hadoop, are written in Java.

In the end, which language to choose depends on the scale, performance, and purpose you are targeting. For very large datasets, Java performs better than Python for the reasons discussed above. When performance is less critical, both languages are suitable for data engineers.

Is the ACLU right about facial recognition?

Facial recognition is back in the spotlight again thanks to a recent test of Amazon’s “Rekognition” product performed by the American Civil Liberties Union (ACLU).  Rekognition is a system, provided as a service by Amazon, used to identify a person or object based on an image or video (commonly known as ‘Facial Recognition’). This facial recognition technology is one of many new products to take advantage of new advancements in machine learning.

The goal of the ACLU’s test was to gauge the accuracy of the Rekognition product. To perform the test, a database was created from 25,000 publicly available arrest images. Headshots of all 535 members of Congress were then compared, using the Rekognition service, against the arrest image database. The result: Rekognition produced 28 incorrect matches. In other words, according to Rekognition, 28 members of Congress had also been arrested or incarcerated (which is not true). Immediately afterward, the ACLU started sounding alarm bells about the accuracy of facial recognition and the ethical implications of its use by law enforcement. A 5% error rate from any law enforcement tool is an understandable cause for concern; errors like these could add up to thousands of innocent Americans’ lives being ruined.

The problem is that while the ACLU’s concerns are valid, their testing of the Rekognition service is not. When using the Rekognition service, there is a parameter for a confidence threshold: the level of certainty the service must have before it considers a match correct. If the threshold is set to 50%, the service must be at least 50% confident that two pictures contain the same individual or object for it to be considered a match. During the ACLU’s testing, the confidence threshold was set to 80%. This is a major flaw in the ACLU’s methodology. An 80% threshold is normally used for recognizing basic objects, such as a chair or a basketball, not a human face. Amazon’s own recommendation to law enforcement is to use a 95% confidence threshold when looking to identify individuals with a reasonable level of certainty.
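For reference, here is roughly what setting that threshold looks like when calling Rekognition through the boto3 SDK; a minimal sketch, assuming AWS credentials are configured and using placeholder image files:

```python
import boto3

client = boto3.client("rekognition")

with open("headshot.jpg", "rb") as src, open("arrest_photo.jpg", "rb") as tgt:
    response = client.compare_faces(
        SourceImage={"Bytes": src.read()},
        TargetImage={"Bytes": tgt.read()},
        SimilarityThreshold=95,  # the 95% level, not the 80% used in the ACLU test
    )

# Only matches at or above the threshold are returned.
for match in response["FaceMatches"]:
    print(f"Match at {match['Similarity']:.1f}% similarity")
```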

The debate the ACLU is trying to start regarding facial recognition and law enforcement is an important one. As new machine learning technologies are implemented by law enforcement, exercises such as these will be extremely important, but they must be done correctly. If the ACLU had used a 95% confidence threshold, this test would have been a much better exercise for determining the validity of facial recognition technology as a law enforcement tool. Another major improvement the ACLU could make is to increase the sample size: using the members of Congress makes for a compelling story, but a sample of 535 individuals is still small. Overall, the ACLU is demonizing a technology it does not know how to use correctly, which will only lower people’s confidence in the ACLU, not in facial recognition technology.

Keeping a Close Eye on Your MPS program

Sales translate into profits for your company. However, in today’s cutthroat competitive environment, is that enough?

Managed Print Services (MPS) programs are set up to curb client costs while enabling smooth operation of the fleet, paid for based on usage. Virtulytix uses advanced analytics to deliver the same benefits to MPS dealers.

What are dashboards?

Dashboards are information visualization tools that can be used to monitor, analyze and help in decision making. There are 3 types of dashboards:

1)     Operational dashboards

2)     Strategic dashboards

3)     Analytical dashboards

In this blog we will analyze operational dashboards in the imaging industry with the help of examples.

An operational dashboard is used to monitor the day to day business. These dashboards are viewed anywhere from every minute to a couple of times a day. In terms of the imaging industry, dashboards would help:

1)     Supervisors to track service representatives out on duty

2)     Dealerships with manual toner fulfillment to make informed shipping decisions

3)     Warehouse managers to track supply and demand

What does Virtulytix bring to the table?

Aside from real-time dashboards to monitor the current transactions in the MPS environments, Virtulytix adds predictive and prescriptive insights to these dashboards to improve operational efficiency. Let us consider a couple of the above-mentioned use-cases.

1)     What if supervisors knew about a fuser failure that will occur tomorrow at the same site where a service representative is fixing a paper jam today? Predictive analytics can help curb those costs.

Supervisor operations dashboard

This dashboard is intended to give supervisors a high-level view of the day’s operations. At a glance, the supervisor can see the current load on each service representative, the locations they will cover, the types of incidents, and the volume. Predicted failures close to the service sites are visualized at the bottom right, along with the probability of each occurring within the next 5 days.
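Under the hood, the logic pairing today’s service calls with nearby predicted failures can be quite simple. A hypothetical sketch (all data structures and field names are invented for illustration):

```python
# Flag predicted failures at sites a technician is already visiting today,
# so one trip can cover both the current incident and the coming failure.
todays_calls = [
    {"site": "Acme HQ", "tech": "J. Lee", "incident": "paper jam"},
    {"site": "Baker LLC", "tech": "R. Diaz", "incident": "toner replace"},
]
predicted_failures = [
    {"site": "Acme HQ", "part": "fuser", "probability": 0.82, "days_out": 1},
    {"site": "Omega Inc", "part": "pickup roller", "probability": 0.65, "days_out": 4},
]

sites_visited_today = {call["site"]: call["tech"] for call in todays_calls}
for failure in predicted_failures:
    tech = sites_visited_today.get(failure["site"])
    if tech and failure["probability"] >= 0.6:
        print(f"{tech} is at {failure['site']} today -- bring a "
              f"{failure['part']} ({failure['probability']:.0%} chance "
              f"of failure within {failure['days_out']} days)")
```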

2)     What if, at dealerships with manual toner fulfillment, the supplies department knew which cartridges are predicted to run empty in the next "n" days and could ship replacements without clients having to raise requests?

 

Toner fulfillment dashboard

This dashboard lists the cartridges predicted to run empty in the next 10 days and compares the estimated days remaining to the days required for shipping. Cartridges that require immediate shipping are highlighted in red to avoid any further delay.
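The underlying urgency rule is straightforward. A hypothetical sketch (field names and values are invented for illustration):

```python
# Flag cartridges whose predicted days-to-empty fall within shipping lead time.
cartridges = [
    {"id": "CTG-1041", "client": "Acme HQ", "days_to_empty": 3, "shipping_days": 5},
    {"id": "CTG-2210", "client": "Baker LLC", "days_to_empty": 9, "shipping_days": 4},
]

for c in cartridges:
    urgent = c["days_to_empty"] <= c["shipping_days"]
    flag = "SHIP NOW" if urgent else "ok"
    print(f"{c['id']} ({c['client']}): {c['days_to_empty']} days left, "
          f"{c['shipping_days']}-day shipping -> {flag}")
```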

Each MPS dealer has different requirements, and operational dashboards can be customized to help dealers keep a close eye on their assets and increase operational efficiency. Analytical insights can be visualized on these dashboards to solve problems or meet specific objectives. How to guide decision making with strategic and analytical MPS dashboards will be covered in future blogs.

From Sensor to Solution: How to be Successful with Advanced Analytics

Advanced analytics is among today’s top buzzwords because of the results the technology can potentially deliver. However, from the statistics we’ve seen, efforts to develop and deploy these solutions fail as often as they succeed. Based on our experience, I’d like to review how we approach our customers and projects, to give you some guidance on maximizing your chance of success.

In any advanced analytics project, there are three main areas that have to be addressed. These are:

·      Business case

·      Solution architecture, technology and integration

·      Change management

For the first installment in this series, I will focus on the business case. This should be the start of any new advanced analytics endeavor. You would think that every company engages in an advanced analytics project with a clear business purpose in mind, but you’d be wrong. Too often, these projects are started with little focus, leading to minimal results. With our customers, we begin with what we call a use case workshop. This typically involves a nominal fee, to ensure the client has skin in the game. The workshop is designed to achieve multiple essential results, including:

·      Identifying key pain points in the customer’s business

·      Identifying what data the customer has to support the project and what gaps exist

·      Identifying key client stakeholders and decision makers

·      Determining key use cases for advanced analytics

·      Assessing the financial impact of each use case

·      Developing a decision matrix to determine which use case to focus on first

·      Developing success criteria for the project

·      Determining the client’s comfort level with, and ability to, change

When we conduct these workshops, we work to ensure all of the major stakeholders and decision makers are in the room so that consensus can be built around the overall objective as well as the success criteria. It is very important at an early stage to identify possible objections and work to overcome them. This is also a key opportunity to ensure that everyone is speaking the same language with regard to the solutions and technology, and to gauge the client’s ability to cope with significant process change. Value will not be delivered unless the client is able to leverage the solution created: if they are unwilling or unable to adapt their processes to act on the new insights the solution delivers, they will never see the value, and the project as a whole will be unsuccessful.

Once we’ve explored the pain points and identified a few use cases, typically 3-5, we do a financial analysis in collaboration with the customer. We use the customer’s costs, along with the expected outcomes, to identify the net benefit the customer can expect. We then weigh this analysis against other variables such as feasibility of success, time to market, additional data sourcing and availability, and integration difficulty (both technical and process focused) to determine the best path forward.

From there, we build out the proposal. This includes the key discoveries from the workshop – project objective, success criteria, schedule, roles and responsibilities, and so on – along with pricing and methodology. The key benefit of this approach is that you have had the opportunity to uncover what’s most important to the client and build consensus with the key stakeholders before presenting a proposal. This prep work takes time, but it gives you a clear picture of what is to be achieved, what the challenges are, and what the path to success looks like. Having this in place before you start developing models or cleaning data puts you on the right foot to maximize your chance of success.

In the next installment, I will discuss how to architect these solutions and the key pitfalls to look out for. Stay tuned!