What does intelligence taste like? Raspberries, of course

by Eric Crowley, Data Engineer

As technology improves, more and more devices are being built ‘smart’. Refrigerators receive weekly software updates, ovens send text message alerts, and even garbage cans require a Wi-Fi connection. Thanks to the internet of things (IoT), the newest generation of devices is not just efficient but also intelligent. As these smart devices become more common in the home, the dumb devices of yesterday will start to look obsolete, leaving many households debating the merit of replacing them. After all, if an appliance does not ask you for your Wi-Fi password, is it even worth keeping?

The answer is yes! Any device or appliance can be made intelligent; all you need to do is give it a brain. In the world of IoT, one of the best ‘brains’ out there is the small, low-cost computer known as the Raspberry Pi.

Figure 1

The Raspberry Pi (seen above in Figure 1) has a few key attributes that make it one of our favorite devices for making things smart. The first attribute is price. The Raspberry Pi is cheap. How cheap? Well, for 35 dollars you get a 1.2GHz 64-bit quad-core ARM CPU with 1GB of RAM, built-in Wi-Fi and Bluetooth connectivity, four USB ports, a full-size HDMI port, 40 GPIO pins, a 3.5mm audio jack, and a microSD card slot. That’s a lot of computing power for less than 50 dollars! The second thing that makes the Raspberry Pi stand out is its size; it’s tiny. At 4.8 x 3 x 1.3 inches, the device is small enough to be attached to just about anything, which opens it up to a plethora of IoT applications. The last, but definitely not least, key attribute of the Pi is its 40 general purpose input/output (GPIO) pins. These pins, which are highlighted in Figure 2, allow the Raspberry Pi to interact with the outside world through various sensors and components. If there is something in the environment that you would like to measure, such as temperature, humidity, light, sound, or vibration, there is probably a sensor out there that can measure it. Cost, size, and the ability to interact with the world are the key attributes that make the Raspberry Pi one of the best tools out there for adding intelligence to our dumb things.

Figure 2

Almost anything can be made into a smart device using this incredible tool. But instead of taking my word for it, let’s look at a practical example. One dumb thing that we have in the Virtulytix office is a space heater. This space heater, as seen in Figure 3, has two knobs on top: one controls fan speed and the other controls the maximum heater temperature.

Figure 3

The heater will keep heating until the maximum temperature is reached. You might have noticed from the picture that there is no actual temperature scale on the heater, just a bar that gradually increases in width. Wouldn’t it be nice if the heater were smart enough to know what we wanted the room temperature to be and then adjusted its output accordingly? Thanks to the Raspberry Pi, a few sensors, and a smart plug (all shown in Figure 4), it can be!

Figure 4

The first thing our smart heater will need to know is the temperature of the room it is in. The best way to acquire this information is with a sensor; in this case, a combined temperature and humidity sensor, as seen in Figure 5.

Figure 5

To attach the temperature sensor to the Raspberry Pi we will use the GPIO pins mentioned earlier. The sensor can be connected directly to the GPIO pins, but to make things easier we will be using a breakout board and a breadboard. A breadboard is a base that lets us build circuits without any soldering, and the breakout board is what connects the GPIO pins to the breadboard. Functionally, there is no difference between the data produced by a sensor connected directly to the GPIO pins and one connected through a breadboard, but the breadboard provides a nice base for the electronic components and keeps everything organized.

Figure 6

Now that our temperature sensor is attached (as seen in Figure 6), our Pi has the hardware necessary to know the room temperature. However, just knowing the current temperature is not enough: the Pi will also need the ability to turn the heater off when the room becomes too hot and on when it becomes too cold. To accomplish this we will use a radio-controlled outlet and a 433MHz radio frequency (RF) transmitter. The wireless outlet comes with a remote. Using a 433MHz RF receiver we can figure out what signal the remote sends to the outlet, and then have the RF transmitter connected to our Pi replicate that signal.
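The exact signal depends on the outlet, but many inexpensive 433MHz outlets use simple on-off keying, where each bit of the captured code becomes a pair of timed high/low pulses on the transmitter pin. As a rough sketch (the pulse timings and codes below are invented for illustration; you would capture your outlet’s real ones with the RF receiver):

```python
# Sketch of how a captured remote code might be replayed through a 433MHz
# transmitter. The timings and codes are illustrative assumptions, not the
# values for any particular outlet.

# A common one-bit encoding for cheap OOK outlets: a '0' is a short high
# pulse followed by a long low gap; a '1' is the reverse.
SHORT_US = 350    # microseconds (assumed)
LONG_US = 1050    # typically about 3x the short pulse

def pulse_train(code):
    """Translate a bit string like '0101...' into (level, microseconds)
    pairs that a GPIO loop could play back on the transmitter pin."""
    pulses = []
    for bit in code:
        if bit == '0':
            pulses += [(1, SHORT_US), (0, LONG_US)]
        else:
            pulses += [(1, LONG_US), (0, SHORT_US)]
    # Long sync gap so the outlet can recognize the start of the next repeat.
    pulses.append((0, 31 * SHORT_US))
    return pulses

ON_CODE = '010100010101'    # hypothetical captured codes
OFF_CODE = '010100010100'

if __name__ == '__main__':
    train = pulse_train(ON_CODE)
    print(len(train))  # 2 pulses per bit plus the sync gap -> 25
```

In the real script, a loop would set the transmitter GPIO pin high or low for each pair in the list, repeating the train several times since these outlets expect the code to be sent repeatedly.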

Figure 7

With both our temperature sensor and our RF transmitter attached (see Figure 7), our Pi now has all the hardware it needs to control the temperature in the room. For this project I decided to add two additional pieces of hardware. The first is an LED, which we will use as an indicator that the Pi is recording the current temperature. The second is a small LCD screen, which allows our Pi to display information relevant to the current room temperature.

Figure 8

The Pi now has all the hardware (see Figure 8) it needs to control our dumb heater. Hardware cannot function on its own, though, so we will need to write some software that tells the Pi how to interact with the various sensors we have attached. Since the Raspberry Pi runs a full Linux OS, we have many programming languages available. For this project I decided to use Python. Python is an excellent programming language for the Raspberry Pi thanks to both its extensive support libraries and its large community of active users, which speeds up development since we do not have to reinvent the wheel. Another software component we will use is a relational database. Since the Pi has limited computational power, I decided to use SQLite. SQLite’s minimal footprint and Python libraries make it an ideal database for Python scripts running on the Raspberry Pi.
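To give a feel for how SQLite fits in, the sketch below creates a readings table and inserts a reading only when it differs from the last one recorded. The table and column names are my own illustrative choices, not the actual project schema:

```python
import sqlite3

# Minimal sketch of the shared database. Table and column names are
# assumptions for illustration; the real scripts would use a file on the
# Pi's SD card rather than an in-memory database.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE readings ('
             'ts DATETIME DEFAULT CURRENT_TIMESTAMP, temp_f REAL)')
conn.execute('CREATE TABLE settings (preferred_f REAL)')

def record_if_changed(conn, current_temp):
    """Insert the reading only if it differs from the last recorded one."""
    row = conn.execute(
        'SELECT temp_f FROM readings ORDER BY ts DESC, rowid DESC LIMIT 1'
    ).fetchone()
    if row is None or row[0] != current_temp:
        conn.execute('INSERT INTO readings (temp_f) VALUES (?)',
                     (current_temp,))
        conn.commit()
        return True
    return False

record_if_changed(conn, 68.5)
record_if_changed(conn, 68.5)   # unchanged -- not inserted
record_if_changed(conn, 70.1)
print(conn.execute('SELECT COUNT(*) FROM readings').fetchone()[0])  # 2
```

Because sqlite3 ships with Python’s standard library, nothing extra needs to be installed on the Pi for this part.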

Our smart heater will use two Python scripts. The first script provides a web page where users can input their preferred temperature. It uses a web framework called Flask, a simple and lightweight microframework, which makes it perfect for powering our temperature input page. The page itself is written in HTML; see Figure 9 to get an idea of what it looks like.

Figure 9

Now that our users have a way to input their temperature preference, we can add our second Python script. This script is designed to run once a minute and uses the RPi.GPIO library, which allows us to easily interact with the sensors connected to our GPIO pins. The logic is fairly simple. The script first queries the database for the most recent recorded temperature. The temperature sensor then reads the current room temperature, and if it differs from the last recorded temperature, the current temperature is inserted into the DB. Next, the current temperature is compared to the user-provided temperature setting. If the current temperature is less than the preferred temperature, the Pi sends the ‘ON’ signal to our wirelessly controlled outlet using the RF transmitter; if it is greater than the preferred temperature, the Pi sends the ‘OFF’ signal. One thing to note: if the heater is already on and the outlet receives an ‘ON’ signal, nothing happens, and the same goes for the off signal. The last thing the script does is output the current temperature to the small LCD screen attached to the Pi.
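Stripped of the sensor reads, database access, and RF transmission, the once-a-minute decision comes down to a simple comparison. A minimal sketch of just that logic:

```python
def decide_signal(current_temp, preferred_temp):
    """Decide which RF signal to send, mirroring the script's logic.
    Because sending 'ON' to an outlet that is already on (or 'OFF' to
    one that is off) does nothing, we don't need to track outlet state."""
    if current_temp < preferred_temp:
        return 'ON'
    if current_temp > preferred_temp:
        return 'OFF'
    return None  # exactly at temperature -- leave the outlet alone

# A simulated few minutes of readings against a 70-degree preference:
for reading in (65.0, 69.5, 70.0, 72.3):
    print(reading, decide_signal(reading, 70.0))
```

In the real script, the return value would be handed to the RF transmitter routine, and the reading would also be written to the LCD.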

After creating both scripts we are able to deploy our smart heater! It can be used anywhere an outlet and a Wi-Fi connection are available. See Figure 10 to view the complete smart heater setup.

Figure 10

While it sounds complex, the overall process of converting a heater into a smart heater is relatively simple, and the same approach can be used to turn almost any device into a smart device. In our current environment, more and more of the things we use are becoming smart. Thanks to the Raspberry Pi, we can make any device an intelligent device. All it takes is some sensors and a few lines of code!

CRISP-DM the Scrum Agile way. Why not!

by Nameeta Raj

Do you often find yourself in the middle of an infinite data preparation, modeling and testing loop? How about utilizing the rapid-delivery agile software development methodology for your analytics projects?

Figure 1: Phases of CRISP-DM

What is CRISP-DM?

The cross-industry standard process for data mining (CRISP-DM) is a framework used for creating and deploying machine learning solutions. The process involves the phases as shown in Figure 1.

There have been times when I found myself stuck in a never-ending data preparation, modeling and testing phase, which left me pondering the minimum viable product concept of scrum agile.

What is Agile and What is Scrum?

Agile is an iterative software development methodology intended to reduce the time to market (the time it takes from a product being conceived until it is available for sale). Scrum is one of many frameworks that can be used to implement agile development. In scrum agile, development is done in sprint cycles, and at the end of each sprint a minimum viable product is deployed. Typically, a sprint ranges anywhere from 1 to 4 weeks.

Extending the agile software development approach to analytics projects


Figure 2: CRISP-DM the scrum agile way

Let us see how the merger can be accomplished. Any new requirement is prioritized and added to the product backlog by the product owner. The typical time-bound scrum meetings are listed below:

Product Backlog Refinement Meeting:

The meeting should take place a few days before the start of a new sprint. The aim of the meeting is to understand the basic business, analyze cost benefit, and check the data scope. Initial estimation and finalization of the definition of ready and the acceptance criteria are included in the meeting agenda. Business success criteria and data accessibility are some of the factors that can contribute to the definition of ready.

Sprint Planning Meeting:

The meeting should take place right before the start of a new sprint. By the end of this meeting, the team members have a thorough understanding of the requirement, which covers a substantial portion of the business understanding phase of CRISP-DM. Items in the product backlog are re-estimated if required. The few days’ lag between the backlog refinement meeting and the sprint planning meeting ensures that all activities required to meet the definition of ready have been completed. The acceptance criteria are finalized. The first sprint with a new requirement will aim at creating a minimum model fit to be demonstrated at the end of the sprint; each subsequent sprint will include further data preparation, data cleansing, and model enhancement activities. Taking the team’s past velocity into consideration, finalized requirements from the top of the product backlog are moved into the sprint backlog. The team is now committed to delivering the items on the sprint backlog and is ready to step into the next sprint.

Daily Scrum Meeting:

The 15-minute daily standup meeting is conducted to answer three main questions. What work was completed the previous day? What is the work planned for the day? Are there any issues obstructing progress?

Sprint Review / Customer Review / Demo meeting:

The meeting is scheduled on the last day of the sprint. During this meeting the work committed by the team is compared to the work delivered. A brief demo of the completed work is done during this meeting. An overview of the data engineering activities along with the model created can be demonstrated to obtain feedback and new ideas from the team and stakeholders. These ideas can be implemented to improve the data engineering / modeling process in upcoming sprints. Any potential flaw in business understanding or irrelevant hypothesis testing can also be caught very early on during the demo session.

Sprint Retrospective Meeting:

The good, the bad, and the ugly of the completed sprint are discussed in this meeting.


I see a few probable advantages of using the scrum agile methodology. Those advantages include all stakeholders being well informed of the project progress right from the beginning. Potential never-ending modeling cycles can be eliminated, thus saving time. The sprint demo facilitates healthy team discussions and sharing of ideas. Technical bugs or mistakes in understanding the requirements can be detected very early during the lifecycle.






Will the Real Predictive Analytics Please Stand Up

by Scott Hornbuckle and Nameeta Raj

As an entrepreneur that focuses on utilizing leading edge technology to improve my clients’ businesses, I am often faced with people and companies using buzzwords carelessly, with little to no substance behind their claims. Predictive analytics along with big data, IoT, etc. are all the rage, but what is real, and what is just marketing fluff?

Let’s take predictive analytics as an example. Wikipedia defines predictive analytics as:

“Predictive analytics encompasses a variety of statistical techniques from predictive modeling, machine learning, and data mining that analyze current and historical facts to make predictions about future or otherwise unknown events.”

By this definition, predictive analytics is essentially the utilization of statistical algorithms combined with machine learning and data mining to predict a future event based on patterns discovered in historical data. Here’s the key: for a solution to be considered predictive analytics, it must include all of these components. Frequently, when meeting with prospective clients, we are told that they are already using predictive analytics. When we probe a bit deeper, we discover that the client has created a spreadsheet that uses a simple linear regression equation, or is using the linear algorithm included in a SQL database. While this is all fine and good, that’s just statistics, not predictive analytics.

Let’s go over an example from the office products industry. We developed a solution called SuppliesIQ to help printer/copier dealers reduce the cost of toner wasted when cartridges are changed out before they are empty. SuppliesIQ makes use of a time series modeling technique to ensure just-in-time (JIT) delivery of cartridges. SuppliesIQ is highly dynamic and chooses the best-fitting model from a wide range of seasonal and non-seasonal time series models, not for each device, but for each cartridge within the device. An autoregressive integrated moving average (ARIMA) model forms the base model for SuppliesIQ. The models are created with the help of IBM’s Predictive Maintenance and Quality platform, which enables switching between ARIMA and exponential smoothing models to find the best-fitting model for the toner cartridge. Historical data permits the model to identify quarterly, monthly, and weekly seasonality and adjust the predictions accordingly.
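The actual SuppliesIQ models are built inside IBM’s platform, but to give a feel for the exponential smoothing family it can switch to, here is a minimal pure-Python sketch of simple exponential smoothing (the usage numbers are invented for illustration):

```python
def exp_smooth_forecast(series, alpha=0.5):
    """Simple exponential smoothing: the next-period forecast is a
    weighted blend of the latest observation and the previous forecast,
    so recent behavior is weighted more heavily than old behavior."""
    forecast = series[0]
    for y in series[1:]:
        forecast = alpha * y + (1 - alpha) * forecast
    return forecast

# Hypothetical daily toner usage (percent of cartridge consumed per day).
# The last day shows a jump in printing activity.
usage = [1.0, 1.2, 0.9, 1.1, 3.0]
print(round(exp_smooth_forecast(usage), 3))  # 2.025
```

Because the forecast leans toward recent observations, the jump on the final day pulls the next-day estimate well above the earlier average, which is exactly the kind of responsiveness a static fit over the whole history lacks.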

The graph below shows SuppliesIQ in comparison to a basic linear regression model present in the market. The orange line represents the actual toner levels; the blue line represents the predicted toner levels by SuppliesIQ, and the green line represents the estimated empty date according to the linear algorithm. The SuppliesIQ model accurately captures the straightforward weekly seasonality and the graph is relatively flat on weekends.

Figure 1: SuppliesIQ vs Linear Regression Prediction

This cartridge ran empty on 11/14/2017. The linear regression model predicted the empty date 6 days after the cartridge actually ran empty, whereas the SuppliesIQ model predicted it a day after. Due to the short length of the latest cycle, the linear regression model could not fully adapt to the increased printing behavior.
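To see why a straight-line fit lags behind a usage change like this, consider a toy example (the numbers are invented, not SuppliesIQ data): toner drains slowly for ten days, then printing triples. An ordinary least-squares line fitted to the whole history is dominated by the slow early readings and predicts an empty date far later than the true one:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

def predicted_empty_day(days, levels):
    """Day at which the fitted line crosses 0% toner."""
    a, b = fit_line(days, levels)
    return -a / b

# Toner drains 2%/day for ten days, then printing triples to 6%/day.
days = list(range(15))
levels = [100 - 2 * d if d < 10 else 80 - 6 * (d - 10) for d in days]

# At the new 6%/day rate the cartridge actually empties around day 23,
# but the straight-line fit, anchored by the slow early cycle, says:
print(round(predicted_empty_day(days, levels), 1))  # 36.2
```

A model that reweights toward the recent cycle, as SuppliesIQ’s time series approach does, would instead track the steeper recent slope and land much closer to the true empty date.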

Figure 2: Six-month Toner Cycles

So, what are the key things to look for when determining whether or not a solution truly uses predictive analytics? Here are three:

1.     More than just statistics: Advanced statistics are a key component of any predictive analytics solution. However, if one is simply using an out-of-the-box linear algorithm from a tool like Excel, SQL, etc., I wouldn’t consider this to be predictive analytics. It can make fairly rudimentary predictions, but the two are not the same.

2.     It’s dynamic and adjusts to changes in environment: This is one of the key components that separates true predictive analytics from the posers. Business environments are continuously in flux. This is due to business cycles, seasonality, scaling up/down, etc. An example of this is a school. If a printer is low on toner, when should the cartridge be shipped? Well, this depends on the context of the device. If it’s May, and the school is getting ready to dismiss students for break, the toner cartridge may be able to last until the start of the next term. A static model wouldn’t take this usage change into account. True predictive analytics looks at how each cartridge in each device is used and adapts to the user behavior. We will explore this topic further in a future blog.

3.     The model gets better over time: Machine learning is a key component of predictive analytics. Using machine learning enables the model to improve over time automatically, while static regression models must be updated manually and applied broadly. This quickly shows the benefit of true predictive analytics, which takes into account the device’s history and the accuracy of predictions made in the past, and adjusts accordingly. This dynamic improvement is essential in the rapidly evolving business environment we all work in.

In conclusion, there are a lot of companies claiming to offer predictive analytics. The technology is powerful and can enable companies to dramatically improve their businesses and evolve business models. However, the technology is complex, and the skill sets required to use the technology is in short supply. When you are looking to employ this technology, use the tips above to separate the real from the rest.

Virtulytix launches SuppliesIQ to address a $1 billion a year problem for imaging device dealers and resellers

Virtulytix, an Industrial IoT development firm based in Lexington, Kentucky, has announced a Software as a Service (SaaS) product to address a long-standing problem for the office products industry. Within the office products industry, the loss of toner due to early cartridge replacement has been a not-so-well-kept secret that costs fleet managers (dealers, resellers, and OEMs) over $50 per device* per year. For an individual dealer who manages thousands of devices, this cost quickly adds up. This problem costs the industry as a whole over $750 million each year in wasted toner and additional shipping.

According to Scott Hornbuckle, President of Virtulytix, “We are utilizing predictive and prescriptive analytics tools to develop a ‘pay for use’ solution for dealers to determine when the cartridge will be empty, and thus avoid early replacement of cartridges. Our field testing has shown that we can reduce this waste by up to 80%, resulting in a significant savings for dealers!”

Virtulytix’s solution captures data from the fleet management software the dealer is currently using, processes the data in a secure private cloud, and then provides the outputs directly to the dealer’s ERP system, or in the form of dashboards or secure notifications. The cost savings associated with this model are significant even when compared to existing toner-saving approaches such as linear regression algorithms; in fact, they can be twice as large!


The first implementation of this tool was developed for use with the PrintFleet™ fleet management software; additional connectors can be provided for other major fleet management tools.

In order to make it easy for dealers to receive the benefit of SuppliesIQ, Virtulytix is providing a one-month free trial. If the dealer is satisfied with the results, they can sign up for an annual subscription. To help dealers understand how much this tool can save them, Virtulytix offers an Impact Calculator tool, available at http://www.virtulytix.com/impact.

According to Ed Crowley, CEO of Virtulytix, “The office products equipment market is one of the largest existing industrial IoT markets with over 100 million, digitally enabled, connected devices installed worldwide. Over 15 million of these devices are actively managed under some type of pay for use or cost per page service agreement. By introducing this solution we are providing our clients with a way to significantly improve the profitability and efficiency of their fleet management operations.”

The SuppliesIQ solution and free trial are available now to dealers, resellers, and OEMS. For more information please contact Mario Diaz at mario.diaz@virtulytix.com or +1 602.571.6530.

About Virtulytix
Virtulytix provides advanced analytics solutions which enable industrial IoT device manufacturers and fleet managers to realize the full potential of their IoT enabled devices and fleets.

Virtulytix develops solutions using IBM Watson Cognitive tools, Tableau, Python, and other advanced analytics technologies. The firm’s team of project managers, data engineers, and data scientists help clients reduce service costs, minimize replenishable item wastage, and avoid unplanned maintenance for industrial IoT enabled products by developing predictive and prescriptive analytics based solutions.  The company’s clients include energy companies, semiconductor firms, and office equipment manufacturers. Virtulytix also provides market intelligence and consulting to the office equipment industry.

Web Page: www.virtulytix.com
Email: info@virtulytix.com
269 West Main Street, Suite 400
Lexington, Kentucky 40507

*For simplification purposes, costs are based on monochrome printing only, color printing costs are significantly higher. This is based on typical usage for printers, copiers, and MFPs in managed fleets and has been validated using empirical testing of the remaining toner in ‘empty’ cartridges which were captured as part of managed fleet engagements.

PrintFleet is a trademark of PrintFleet Inc.

Media Contact:  Scott Hornbuckle
President, Virtulytix, Inc.
Phone: 502-664-0733
Email:  scott.hornbuckle@virtulytix.com

Why Analytics Projects Fail – And It’s Not The Analytics!

Being in a highly technical, complex field, it is easy to sometimes lose the ‘human aspect’ of the solutions we are developing. We focus on applying edge computing concepts, or on whether a seasonality model works better for our predictive accuracy than some other approach. Don't get me wrong, these are all important activities. However, in working with many firms on developing, deploying, and supporting advanced analytics solutions, particularly in the Industrial IoT space, it’s often the people side that fails – not the technology. Read More . . .

We make your Industrial IoT fleet smart!

For 11 years, Photizo Group has been at the forefront of helping clients transform their business models to maximize their market growth in the rapidly evolving and dynamic imaging market. The firm advocated that to survive and thrive, imaging firms need to transform their business models to new service centric models. Photizo Group established a leadership position by being at the forefront of MPS.

During the last two years, Photizo Group invested heavily in its own transformation through the development of a Predictive Analytics division led by Scott Hornbuckle. Today we are announcing that Photizo Group is completing its transformation by being acquired by Virtulytix, an advanced analytics firm based in Lexington, Kentucky. Virtulytix provides both “on premise” and “as a Service” advanced analytics solutions that optimize industrial Internet of Things (IoT) enabled fleets. Using predictive modeling and other advanced analytics technologies, these solutions reduce waste in supply items and logistics chains, reduce service and warranty costs through highly accurate advanced failure prediction, and optimize revenue by maximizing fleet uptime through fewer unplanned service and maintenance events. Virtulytix tools and solutions are currently deployed in the Semiconductor, Imaging, and Nuclear Power industries.

Virtulytix enables clients to optimize their IoT enabled fleets by applying a full range of solution design and development capabilities, including needs analysis and use case development, solution design and development, and deployment services such as integration into production systems.

Through this acquisition, Virtulytix is able to leverage the significant investment and intellectual property development that Photizo Group Inc. made into predictive analytics including the patent pending model for dramatically reducing wasted toner due to early replacement in managed imaging fleets. The management team of Virtulytix includes the industry veterans from Photizo Group including Scott Hornbuckle (President), Mario Diaz (VP Consulting), and Ron Iversen (VP Market Intelligence Services).

Virtulytix will continue to support Photizo Group’s imaging industry clients by providing consulting and market intelligence services specifically for the imaging industry. The services provided by Photizo Group will continue under Virtulytix including:

  • Market Intelligence Services, led by Ron Iversen, including the Office Supplies Advisory Service, MPS Advisory Service, Wide Format Forecasts, Ink in the Office Advisory Service, and the Navigator Advisory Service. Virtulytix will continue the model of wrapping high value advisory services including briefings, inquiries, and forecasts, around critical topic areas for the imaging industry; and,
  • Consulting, led by Mario Diaz, addressing strategic and tactical issues relevant to our clients including traditional subjects such as developing and implementing market penetration and growth plans, strategy development and reviews, new product strategy and development and channel strategies and development. Virtulytix will continue to provide a personalized ‘boutique’ approach to projects we are engaged in around the globe. In addition, Virtulytix brings strong data science and data engineering skills to solving our client’s big data and industrial IoT challenges.

Our goal is to make this transition as seamless as possible for Photizo Group clients. Clients who are in the midst of a project will not see any impact and will continue to work with the same project team. Please contact your Client Executive with any questions regarding your specific project. 

For questions regarding this acquisition, please feel free to contact Scott Hornbuckle at Virtulytix Inc. at either scott.hornbuckle@virtulytix.com or +1 (502) 664-0733. This is really an exciting chapter in our working relationship with you and I look forward to our team continuing to serve your needs.


Scott Hornbuckle
President, Virtulytix, Inc.

Industrial IoT Solutions Firm, Virtulytix Launches From Lexington, Kentucky

New company with over 120 years of applied industry experience

Virtulytix is a new company focused on providing advanced analytics to clients across a broad range of industries. According to the company’s President, Scott Hornbuckle, Virtulytix is on the leading edge of advanced analytics with the specific goal of “making the Industrial IoT smart!”. While the company is new, it is leveraging the experience of a group of seasoned industry executives including:

Scott Hornbuckle, President, built the Advanced Analytics team at Photizo Group as the group’s director. Scott has a unique combination of advanced analytics expertise, customer understanding, technical expertise, and business management skills which make him the logical choice to be ‘face’ of Virtulytix. 

Mario Diaz, VP Consulting, another 30+ year veteran of the technology industry with key management roles at QMS, Toshiba, Apple Computer, Avnet, and IO Datacenters. Mario leads the consulting practice for Virtulytix where he applies his deep insight in customer experience and marketing to helping clients address their specific transformational and management challenges.

Ron Iversen, VP Market Intelligence Services, rounds out the management team. Another 30+ year executive, he brings exceptionally strong depth in the imaging industry along with deep expertise in product strategy and product development.

Virtulytix has acquired the assets of Photizo Group including their intellectual property and current client base. In addition to the Photizo Group alumni, a number of new faces have been added to the analytical bench including Nameeta Raj and Junying (Allison) Zhang:

Nameeta Raj graduated from the University of Cincinnati with a Master of Science in Information Systems with a Data Analysis Certification in SQL, SAS, IBM DB2, Tableau, SPSS, Python, R, and Microsoft Excel. She has a rich background in Data Analytics and work experience with ABN AMRO bank in India. Besides her certifications, she has strengths in statistical concepts like multi-variate regression, decision trees, and clustering. 

Allison Zhang recently graduated from the University of Cincinnati, Carl H. Lindner College of Business with a Master of Science in Information Systems. Along with her Master of Science in Accounting and software skills in SAP, SPSS, SAS, SQL and Tableau, Allison will be a specialist in developing advanced analytics solutions.

Allison and Nameeta both are joining an existing team of project management and data engineering professionals who have been successful in building industrial IoT solutions using a variety of platforms and technologies including Mike Huster (Project Management Director) and Eric Crowley (Sr. Data Engineer).

Virtulytix serves clients in the Semiconductor, Nuclear Energy, Imaging, and other industrial markets which are becoming IoT enabled. In addition to developing predictive analytics solutions (both cloud based and on premise), Virtulytix was one of the first firms to develop an ‘as a service’ offering for accurately predicting Industrial IoT enabled device needs using the IBM Watson Cognitive platform called Predictive Maintenance and Quality (PMQ) in partnership with SIS. In addition to Advanced Analytics, the firm provides market data, consulting, and other analytical intelligence services.

For more information, contact Scott Hornbuckle: scott.hornbuckle@virtulytix.com or (502) 664-0733