The Future of Imaging.... Part 2

From Typewriters to the Data Era .. The Typewriter Era

Ed Crowley, Virtulytix, Chief Thought Leader (CTO) 

Have you ever really thought about how disruptive typewriters were? Can you imagine a time when clerks literally hand-wrote letters and kept handwritten entries in accounting journals? And this isn't the dark ages we are talking about. Right up until the 1860s, this was the norm.

The scribe era was dominated by a relatively small educated class that could read and write. Illiteracy rates were as high as 90% in the 1500s, falling to 60-70% in the early 1700s. It wasn't until the late 1700s that literacy rates began increasing significantly as education improved and the industrial era began. During that time, society began its first major shift away from a trades-based, agrarian economy toward the industrial society we know today. These socioeconomic changes set the stage for the dramatic technology-driven shift that we call the typewriter era.

Christopher Latham Sholes introduced the first commercially successful typewriter in 1867. Suddenly, the legibility of a letter wasn't dependent upon penmanship. Every letter was consistent in quality, regardless of the individual skill of the operator. What a huge change.

At almost the same time, the peak of industrialization and the rise of education levels drove an increasing need for, and availability of, clerical workers along with standardized communication. The typewriter era brought a way for these clerical workers to create business correspondence in a standardized way faster and more accurately than ever before. The beginnings of the modern office were born.

Sholes Typewriter


In the early 1960s, IBM further revolutionized the office environment with its electric typewriters, which brought innovations such as the Selectric's interchangeable typing element and proportional spacing. By 1983, typewriters had enjoyed a successful run of well over a century, and IBM held the number one market share position in the USA. Dataquest (a leading market forecast provider of the time) was predicting that the typewriter market would more than double by 1987. That forecast was made despite the emerging PC market and the arrival of low-cost dot-matrix printers, and it was ultimately proven wrong. By 1987 the typewriter market was in free fall, driven by the increasing use of dedicated word processors from companies like Wang and Lanier along with the growing use of desktop personal computers (PCs) from IBM and Compaq. By 1991 IBM had divested its typewriter division in a spin-off that would ultimately become Lexmark (which was acquired 26 years later by the Chinese firm Apex/Ninestar).

The typewriter era had an incredible run. It created and then changed the modern office. However, it was ultimately displaced by new, emerging technologies. And no one saw it coming. This was the end of what we call the first modern office era - the Typewriter era. It lasted 124 years compared to over 300 years for the scribe era.

In this transition we see a foreshadowing of the increasing rate of change and disruption that would play out over the next thirty-five years, enabled by technology advances and changing socioeconomic trends. In the next blog I will walk us through the 'Impact Era', which overlapped with the typewriter era but had a relatively short peak as the market rapidly shifted to page printers.

At the conclusion of this series of five blogs (each blog will cover a distinct era), I will be holding a complimentary webinar on how to thrive in the new Data Era, as well as providing access to a free white paper covering this critical topic. I encourage you to share your thoughts and opinions on the industry's journey here on this blog, on the Virtulytix LinkedIn site, or in the LinkedIn discussion group Imaging Industry Transformation.

What Xerox Should Really Be Worried About


Ed Crowley – Chief Thought Leader, Virtulytix

Right now, there is a lot of discussion about what is going to happen with Xerox. How will the new management team achieve their objectives of expanding margins and growing EPS by at least 7% annually in order to return 50% of free cash flow to shareholders, as outlined at their Investor Day meeting? Will they survive the fight with Fuji? How much cost can they shave by cutting sales and marketing expenses? I think all of this misses the real point.

 

A significant shift has been underway for years, and it is now really gaining steam. Yet nobody seems to be noticing as two key market leaders are 'winning' in the office. First, HP has now shipped over 1,500,000 inkjet printers based on its PageWide technology. That's pretty amazing for a product that was introduced just a little over four years ago. And the installed base of Segment 3 CISS devices has already exceeded 3,500,000, with Epson owning 70% market share (details available in our 2018 hardware forecast). By the way, these are not just low-cost CISS devices for emerging markets. These are high-performance MFPs designed to displace many existing office A3/A4 color MFPs.

 

We have been talking about the coming shift to inkjet in the office, but we were wrong. It's not coming – it's here. Or at least well underway! Fundamentally, inkjet holds a cost and price advantage over laser, and that is manifesting itself as a clear price/performance edge in the office. So why should Xerox (or Ricoh, or Konica) be scared? Simply because they do not have this technology. Okay, Ricoh has GelJet and Xerox has solid ink technology, but neither is really competitive against HP's PageWide and Epson's EcoTank products.

Today's office market isn't growing. It's all about making more profit from your piece of the pie (through increased efficiency and effectiveness) or getting a bigger share of the pie, which is exactly what HP and Epson are doing. That growth can only come at others' expense. We made the argument several years ago that 'laser bias' in the office was dead. It is just taking some laser engine manufacturers longer than others to figure that out!

This is all a part of the increasing commoditization in the industry as we shift to the next era in the office imaging industry which we call the “Data Era”.  Learn more about this dynamic in our blog series and upcoming webinar on the future of Imaging.

The Future of Imaging... Part 1

From Typewriters to the Data Era .. The Beginning

 Ed Crowley, Virtulytix, Chief Thought Leader (CTO)

The world of imaging in the office is changing at an accelerating pace. Across the five previous major eras of office imaging, each successive era has become shorter, with accelerating technological, economic, and social change. The pre-electronic era, when typewriters ruled, lasted 124 years. The last era, the services and commodity products era, lasted twelve. We propose that we are entering the next era, the data era, where the focus shifts from technology, services, and products to using data to improve business processes, increase profits, and ensure the viability of the business. This new era has dramatic implications for who is at work, how they will work, and what their workplace is. Led by a new generation of workers, the millennials, it will be fundamentally different from the eras that preceded it in almost every respect.

In this series of blogs, I will trace the history of office imaging and discuss the major technologies, user dynamics, and even economic drivers associated with each transition by breaking the industry’s evolution down into six distinct eras. This is not meant to be the definitive document about our industry, but rather a way to drive discussion and dialogue about where our industry is heading (and what it means for your business). As such, I encourage your comments, disagreement, acknowledgement, and/or debate – just be sure to engage.

At the conclusion of this series of five blogs (one blog for each era), I will be holding a complimentary webinar on how to thrive in the new Data Era, as well as providing access to a free white paper covering this critical topic. So please join me on this journey and share your thoughts and opinions in our LinkedIn discussion group, Imaging Industry Transformation.

[Infographic: the eras of office imaging]

Python or Java for ETL?

by Allison Zhang, Data Engineer, Virtulytix

In my prior blog, we discussed the differences between Python and R for data scientists. But before data scientists can analyze the data, there is one more important process: ETL (Extract, Transform and Load). So which language should a data engineer use for ETL, Python or Java?

 

As always, the answer is it depends.

 

From a novice's view, Python is easier than Java. If this is your first dive into the world of data or programming, Python can give you a quick introduction to key ETL concepts. Python is one of the most readable languages around. Its simple syntax is straightforward for everyone from experts to novice programmers, so all you need to focus on is making the program produce your desired output. This simplicity has made Python an extremely popular language in both the business world and academia. A recent survey from the Association for Computing Machinery (ACM) found that Python has surpassed Java as the most popular language for introducing students to programming.
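
To make that concrete, here is a minimal sketch of the extract-transform-load pattern in plain Python. The CSV file, column names, and SQLite table are hypothetical, but the three-step shape is the essence of any ETL job.

```python
# A minimal ETL sketch in plain Python: extract rows from a CSV file,
# transform them, and load them into a SQLite table.
# The file name and column names here are hypothetical.
import csv
import sqlite3

def extract(path):
    """Read raw rows from a CSV file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Normalize types and drop rows with missing values."""
    cleaned = []
    for row in rows:
        if not row.get("device_id") or not row.get("page_count"):
            continue  # skip incomplete records
        cleaned.append((row["device_id"], int(row["page_count"])))
    return cleaned

def load(records, db_path="usage.db"):
    """Insert the transformed records into a SQLite table."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS usage (device_id TEXT, page_count INTEGER)")
    con.executemany("INSERT INTO usage VALUES (?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("device_usage.csv")))
```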

Once you understand basic programming concepts, it is time to consider other, more complex languages. This does not mean that Python is not an advanced programming language; on the contrary, Python can handle complex projects as well. But more factors need to be considered at this stage.

Python is flexible. It is dynamically typed, which means Python performs type checking at runtime, so you do not need to declare a variable's type when you create it. Java, on the other hand, is statically typed: variable types must be declared before values can be assigned. Python's flexibility can save time and code when writing scripts, but it can also allow type errors to surface only at runtime, and the runtime type checking makes it slower than Java. Java is strict. Strict typing makes it easier to provide autocompletion for Java, and the compiler can prevent you from mixing different kinds of data together. This is very helpful in data engineering: when a program consists of hundreds of files, it is easy to get confused and make mistakes, and the more checks we have on our programs, the better off we are.
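
A small, hypothetical Python example illustrates the difference: the variable's type is never declared, and a type mismatch only surfaces when the offending line actually runs, whereas a Java compiler would reject it up front.

```python
# Dynamic typing in Python: the same name can hold values of different
# types, and type errors only surface when the code actually runs.
record = "42"          # starts life as a string (e.g., read from a CSV)
record = int(record)   # rebound to an int; no declaration needed
print(record + 1)      # prints 43

# In Java the type is fixed at compile time, e.g.:
#   int record = Integer.parseInt("42");
# and mixing incompatible types is rejected by the compiler rather than at runtime.
try:
    total = "100 pages" + 5   # only fails here, at runtime
except TypeError as err:
    print(f"Caught at runtime: {err}")
```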


When it comes to performance, Java is generally the better choice, thanks to its compiler optimizations and virtual machine execution. Python's flexibility comes at the cost of speed, which makes Java more attractive from this perspective. Java's long history in the enterprise and its more verbose coding style mean that Java legacy systems are typically larger and more numerous than Python's. The "write once, run anywhere" design philosophy behind Java, together with its scalability, has made it a leading choice for enterprise-level development. For very large-scale data work, Java is usually the better choice: it is faster and more efficient, and many of the Apache Foundation's big-data frameworks, such as Hadoop, are written in Java.

In the end, which language to choose depends on the scale, performance, and purpose you want to achieve. For very large datasets, Java performs better than Python because of the factors discussed above. However, when performance is not that important, both languages are suitable for data engineers.

Is the ACLU right about facial recognition?

Facial recognition is back in the spotlight thanks to a recent test of Amazon's "Rekognition" product performed by the American Civil Liberties Union (ACLU). Rekognition is a system, provided as a service by Amazon, used to identify people or objects in an image or video; its face-matching capability is commonly known as "facial recognition." It is one of many new products taking advantage of recent advancements in machine learning.

The goal of the ACLU's test was to gauge the accuracy of the Rekognition product. To perform the test, a database was created from 25,000 publicly available arrest images. Headshots of all 535 members of Congress were then compared, using the Rekognition service, against the arrest image database. The result was that Rekognition produced 28 incorrect matches; in other words, according to Rekognition, 28 members of Congress had also been arrested or incarcerated (which is not true). Immediately afterward, the ACLU started sounding alarm bells about the accuracy of facial recognition and the ethical implications of its use by law enforcement. A roughly 5% error rate from any law enforcement tool is an understandable cause for concern. After all, such errors could add up to thousands of innocent Americans' lives being ruined.

The problem is that while the ACLU's concerns are valid, their testing of the Rekognition service is not. When using the Rekognition service there is a parameter for a confidence threshold. This threshold is the level of certainty that the Rekognition service must have before it considers a match to be correct. If the threshold is set to 50%, the service must have 50% or more confidence that two pictures contain the same individual or object for it to be considered a match. During the ACLU's testing, the confidence threshold was set to 80%. This is a major flaw in the ACLU's testing methodology. An 80% threshold is normally used for recognizing basic objects, such as a chair or a basketball, not a human face. Amazon's own recommendation to law enforcement is to use a 95% confidence threshold when looking to identify individuals with a reasonable level of certainty.
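
To show where that threshold actually lives, here is a sketch of searching a face collection with the AWS SDK for Python (boto3). The collection ID and image file are hypothetical; the point is the FaceMatchThreshold parameter.

```python
# Sketch: searching a face collection with Amazon Rekognition via boto3.
# "arrest-photos" and "headshot.jpg" are hypothetical; the key detail is
# FaceMatchThreshold, which the ACLU set to 80 and which Amazon recommends
# setting much higher (e.g., 95+) when trying to identify individuals.
import boto3

rekognition = boto3.client("rekognition")

with open("headshot.jpg", "rb") as image:
    response = rekognition.search_faces_by_image(
        CollectionId="arrest-photos",      # hypothetical collection of mugshots
        Image={"Bytes": image.read()},
        FaceMatchThreshold=95,             # only return matches at >= 95% similarity
        MaxFaces=5,
    )

for match in response["FaceMatches"]:
    face = match["Face"]
    print(face.get("ExternalImageId", face["FaceId"]), match["Similarity"])
```

Rerunning the same search with FaceMatchThreshold=80 instead of 95 is exactly the methodological difference the ACLU's result hinged on.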

The debate that the ACLU is trying to start regarding facial recognition and law enforcement is an important one. As new machine learning technologies are implemented by law enforcement, exercises such as these will be extremely important, but they must be done correctly. If the ACLU had used a 95% confidence threshold, this test would have been a much better exercise for determining the validity of facial recognition technology as a law enforcement tool. Another major improvement the ACLU could make is to increase the sample size. Using the members of Congress makes for a compelling story, but a sample of 535 individuals is still small. Overall, the ACLU is demonizing a technology that it does not even know how to use correctly. That will only lower people's confidence in the ACLU, not in facial recognition technology.

Keeping a Close Eye on Your MPS program

Sales translate into profits for your company. However, in today's cutthroat competitive environment, is that enough?

Managed Print Services (MPS) programs are set up to curb client costs while keeping the fleet running smoothly, and they are paid for based on usage. Virtulytix uses advanced analytics to deliver the same benefits to MPS dealers.

What are dashboards?

Dashboards are information visualization tools that can be used to monitor, analyze and help in decision making. There are 3 types of dashboards:

1)     Operational dashboards

2)     Strategic dashboards

3)     Analytical dashboards

In this blog we will analyze operational dashboards in the imaging industry with the help of examples.

An operational dashboard is used to monitor the day-to-day business. These dashboards are viewed anywhere from every minute to a couple of times a day. In the imaging industry, such dashboards would help:

1)     Supervisors to track service representatives out on duty

2)     Dealerships with manual toner fulfillment to make informed shipping decisions

3)     Warehouse managers to track supply and demand

What does Virtulytix bring to the table?

Aside from real-time dashboards to monitor the current transactions in the MPS environments, Virtulytix adds predictive and prescriptive insights to these dashboards to improve operational efficiency. Let us consider a couple of the above-mentioned use-cases.

1)     What if a supervisor knew about a fuser failure that will occur tomorrow at the same site where a service representative is fixing a paper jam today? Predictive analytics of this kind would help curb service costs.

[Dashboard: supervisor's daily operations overview]

This dashboard is intended to provide supervisors with a high-level view of the day's operations. At a glance, the supervisor can see the current load on each service representative, the locations they will cover, and the types and volume of incidents. Predicted failures close to the service sites are visualized in the bottom right-hand corner, along with the probability of each occurring within the next five days.

2)     In the case of a dealership with manual toner fulfillment, what if the supplies department knew which cartridges are predicted to run empty in the next "n" days and could ship them to clients without the clients having to raise requests?

 

[Dashboard: cartridges predicted to run empty within the next 10 days]

This dashboard lists the cartridges predicted to run empty in the next 10 days and compares the estimated days remaining to the days required for shipping. Cartridges that require immediate shipping are highlighted in red to avoid any further delay.
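
A minimal sketch of the comparison behind that highlighting rule, using hypothetical field names, looks like this:

```python
# Sketch of the dashboard's highlighting rule: compare the predicted days
# until a cartridge runs empty against the shipping lead time, and flag
# cartridges that need to ship immediately. Field names are hypothetical.
cartridges = [
    {"device_id": "MFP-0012", "days_to_empty": 3, "shipping_days": 5},
    {"device_id": "MFP-0347", "days_to_empty": 9, "shipping_days": 4},
]

for cartridge in cartridges:
    # If shipping takes longer than the remaining supply, it is already urgent.
    urgent = cartridge["days_to_empty"] <= cartridge["shipping_days"]
    status = "SHIP NOW (highlight red)" if urgent else "on schedule"
    print(cartridge["device_id"], status)
```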

Each MPS dealer has different requirements, so operational dashboards can be customized to help dealers keep a close eye on their assets and increase operational efficiency. Analytical insights can be visualized on these dashboards to solve problems or meet specific objectives. How to guide decision making with the help of strategic and analytical MPS dashboards will be covered in future blogs.

From Sensor to Solution: How to be Successful with Advanced Analytics

Advanced analytics is among the top buzzwords today because of the potential results this technology can bring. However, from the statistics we've seen, efforts to develop and deploy these solutions fail as often as they succeed. Based on our experience, I'd like to review how we approach our customers and projects and give you some guidance to maximize your chance of success.

In any advanced analytics project, there are three main areas that have to be addressed. These are:

·      Business case

·      Solution architecture, technology and integration

·      Change management

For the first installment in this series, I will focus on business case. This should be the start of any new advanced analytics endeavor. One would think that all companies engage in an advanced analytics project with a clear business purpose in mind, but you’d be wrong. Too often, these projects are started with little focus leading to minimal results. With our customers, we begin with what we call a use case workshop. This typically involves a nominal fee to ensure the client has skin in the game. This workshop is designed to achieve multiple essential results including:

·      Identifying key pain points in the customer’s business.

·      Identifying what data the customer has to support the project and what gaps exist

·      Identifying key client stakeholders and decision makers

·      Determining key use cases for advanced analytics

·      Assessing the financial impact of each use case

·      Developing a decision matrix to determine which use case to focus on first

·      Developing success criteria for the project

·      Determining the client's comfort level with, and ability to handle, change

When we conduct these workshops, we work to ensure all of the major stakeholders and decision makers are in the room so that consensus can be built around the overall objective as well as the success criteria. It's very important at an early stage to identify possible objections and work to overcome them. Additionally, this is a key opportunity to ensure that everyone is speaking the same language with regard to the solutions and technology. It's also an opportunity to gauge the client's ability to cope with significant process change. Value will not be delivered to the client unless they are able to leverage the solution created. If they are unwilling or unable to adapt their processes to account for the new insights delivered by the solution, they will never see the value and the project as a whole will be unsuccessful.

Once we've explored the pain points and identified a few use cases, typically 3-5, we then do a financial analysis in collaboration with the customer. We use the customer's costs as well as the expected outcome to identify what net benefit the customer can expect. We then take this analysis and weigh it against other variables such as feasibility of success, time to market, additional data sourcing and availability, and integration difficulty (both technical and process focused) to determine the best path forward.

From there, we build out the proposal. This includes the key discoveries from the workshop: project objective, success criteria, schedule, roles and responsibilities, and so on, along with pricing and methodology. The key benefit of this approach is that you have had the opportunity to uncover what is most important to the client and to build consensus with the key stakeholders before presenting a proposal. This prep work takes time, but it gives you a clear picture of what is to be achieved, what the challenges are, and what the path to success looks like. Having this in place before you start developing models or cleaning data puts you on the right foot to maximize your chance of success. In the next installment I will discuss how to architect these solutions and the key pitfalls to look out for. Stay tuned!
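
For illustration only, here is the kind of weighted scoring a simple decision matrix might use to rank candidate use cases. The use cases, criteria, weights, and scores below are all hypothetical; in practice they come out of the workshop discussion and the financial analysis.

```python
# Illustration of a weighted decision matrix for ranking candidate use cases.
# All names, weights, and scores are hypothetical placeholders.
criteria_weights = {
    "net_financial_benefit": 0.40,
    "feasibility": 0.25,
    "time_to_market": 0.20,
    "integration_difficulty": 0.15,  # scored so that higher = easier to integrate
}

use_cases = {
    "Predictive service dispatch": {
        "net_financial_benefit": 8, "feasibility": 7,
        "time_to_market": 6, "integration_difficulty": 5,
    },
    "Automated supplies fulfillment": {
        "net_financial_benefit": 6, "feasibility": 9,
        "time_to_market": 8, "integration_difficulty": 7,
    },
}

def weighted_score(scores):
    """Combine per-criterion scores into a single weighted total."""
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

# Rank the use cases from highest to lowest weighted score.
for name, scores in sorted(use_cases.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name}: {weighted_score(scores):.2f}")
```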

Advanced Analytics in the Horse Capital of the World

by Allison Traynor, Project Manager, Virtulytix

Here, in the Horse Capital of the World, you can find a perfect combination of advanced analytics and horse racing. Horse racing, the sport of kings, has become more popular in recent years, attracting more fans thanks in part to the improving economy and new efforts by the industry to reach up-and-coming fans. Those in this industry are people who love the sport and the magnificent horses who run the races. Recently, Keeneland Racetrack hosted a Welfare and Safety Summit for the horse racing industry, covering research and measures taken to ensure safety and integrity at the racetrack, along with other common-sense measures. One of the most interesting presentations was given by Tim Parkin, a professor of veterinary epidemiology who is conducting research with advanced analytics. Parkin is attempting to create and deploy a model that predicts, with a high level of accuracy, which horse in which race will suffer a non-fatal or fatal injury.

I was surprised to find how prevalent predictive analytics already is within horse racing. Since 2009, fatal injuries during Thoroughbred races have declined 20%; however, the industry can still do better. The Equine Injury Database (EID) contains data gathered from 3.1 million starts by nearly 150,000 horses between 2009 and 2017. However, as Parkin discussed, they are missing data, to the point where nearly a quarter of their factors are deemed undetermined. To deploy the model in a production setting, it is important to identify what is missing: they are missing about 50% of non-fatal injuries in racing and will also need data on training injuries. Within an industry that questions all possible scenarios, I wonder what other factors could have an effect. Could it be the horse's determination on race day, or the consistency of the dirt? Could it be its physical build? A horse's physical measurements are very important to how it races: the width and angle between its legs, the depth of its hindquarters, and the height of its withers. All of these matter when a horse runs, and not having those features may mean it is more prone to injuries.

Predictive analytics is proving, in a remarkably short amount of time, just how invaluable the technology is. By using predictive algorithms, individuals and organizations can analyze their operations with capabilities they never had before. As Dr. Tim Parkin's presentation showed, there are endless use cases for advanced analytics. At Virtulytix, we are discovering that the applications of advanced analytics are virtually limitless, here in the Horse Capital of the World.

What does the future hold for office printer manufacturers?

Most of you know that the printing market is declining and will continue to decline as more and more printed pages are replaced by electronic documents. This has an impact not only on new printer shipments but, more importantly, on the printer installed base, which is what drives consumables and maintenance revenues. Why is this significant? Because consumables represent about 65% of total print revenue and an even higher percentage of printer manufacturers' profits, which is why printer manufacturers spend huge amounts of money in legal fees defending their printer cartridge patents.

Are You Operating At Only 85%?

Being a successful office products provider is getting tougher. Historically, if you did a reasonably good job of providing customer service, maintained a steady stable of good salespeople, and managed your expenses well, you could make really good money owning an office products dealership. Let's face it: all that printing and copying happening in offices provided you with a steady, annuity-based revenue stream. Life was good.