Improving Access to Care through Decision Support Algorithms

ATTN: Business and company leaders who aspire to excel,

Meet Freddie Weiss (B.S. industrial and systems engineering, Georgia Tech), Beth Garcia, Anita Ying, M.D., Laura Burke and Gwen Tate

MD Anderson Cancer Center

The University of Texas MD Anderson Cancer Center, based in Houston, Texas, provides cancer patient care, research, education and prevention.  MD Anderson has been ranked the leading cancer hospital for 11 of the past 14 years in U.S. News & World Report’s “Best Hospitals” survey.  The hospital provides more than 1 million outpatient clinic visits, treatments and procedures each year.  In FY15, the hospital processed approximately 68,000 new patient referrals, with 39,000 new patients seen.

The goal of this project was to improve the patient experience through ease of access and to reduce the time between initial contact and the scheduling of a new patient appointment where appropriate.  Previously, an appointment could not be offered until medical records were received and reviewed by the clinical team to determine whether it was medically appropriate for the patient to come to MD Anderson and, if so, which service and provider the patient should see.  This process could take anywhere from 3 to 15 days.  Data on cancellation rates showed that the top reasons for cancellation were “Unknown,” “Shopping for Options,” and “In Treatment Elsewhere.”  Although there were some inconsistencies in how the cancellations were labeled, these reasons pointed to the same issue: by the time a staff member contacted the patient to schedule the appointment, the patient had often already elected to seek treatment elsewhere.  Historically, the hospital had a 35%-40% cancellation rate.

The team used a systems engineering approach to evaluate the new patient process.  Using a process map, the team identified key areas for improvement and focused on providing a patient with a new patient appointment during the initial call.  A pilot study was conducted in the Endocrine Center to establish proof of concept.  Because the access staff were not clinical personnel, an algorithm (or decision tree) was needed to help the staff determine when an appointment could be given without additional medical review and with which service/provider the patient should be scheduled.  The team worked with the provider team to document the clinical decisions made when reviewing medical records.  Algorithms were designed to allow access staff to gather relevant medical information from the patient.  Where appropriate, the staff could then schedule a new patient appointment for patients who did not require a more detailed review.  When developing the questions for the algorithms, the medical staff focused on information that the patient would readily know, for example, “Have you had surgery to treat your cancer?” and “Have you been told that your cancer has spread to other parts of your body?”
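A decision tree like the one described can be sketched in a few lines of code. This is a hypothetical illustration only: the questions, routing labels and rules below are assumptions for the sketch, not MD Anderson's actual algorithm.

```python
# Hypothetical sketch of a new-patient scheduling decision tree.
# The answer keys and routing outcomes are illustrative assumptions.

def triage_new_patient(answers: dict) -> str:
    """Route a new-patient call based on answers the patient can readily give."""
    if answers.get("had_cancer_surgery"):
        # Post-surgical history: send for clinical records review before scheduling.
        return "route_to_clinical_review"
    if answers.get("told_cancer_spread"):
        # Possible metastatic disease: route to the appropriate specialty service.
        return "schedule_specialty_service"
    # Straightforward case: an appointment can be offered during the initial call.
    return "schedule_endocrine_center"

# A patient with no surgery and no reported spread can be scheduled immediately.
print(triage_new_patient({"had_cancer_surgery": False, "told_cancer_spread": False}))
# → schedule_endocrine_center
```

The point of encoding the clinical decisions this way is that non-clinical access staff can follow the branches consistently, with ambiguous cases always falling through to a clinical review.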

Additional tools were implemented to enhance the benefits of the algorithms, including:

  1. A scheduling priority and timeframe that indicated the timeline for scheduling appointments (e.g., within five business days).
  2. A physicians’ diagnosis preference list that identified which physicians see which diagnoses.
  3. A terminology guide that defined which diagnoses were included in the algorithm and common terminology used by patients and referring providers (e.g., types of brain tumors).
  4. A standardized medical record request by diagnosis that outlined the medical records needed for the appointment (this document is sent to the referring physician’s office).

These documents were combined with the algorithms into a toolkit for the access staff to use throughout the new patient process.  They were also used to train new employees on the process and to measure performance.

The team monitored the percentage of patients who were given an appointment within one business day of referral initiation, as well as the cancellation rate for new patient referrals.
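The two monitored metrics are simple rates over referral records. Below is a minimal sketch of how they might be computed; the record layout is an assumption, and calendar days stand in for business days to keep the example short.

```python
from datetime import date

# Illustrative referral records (layout is an assumption for this sketch).
referrals = [
    {"initiated": date(2016, 5, 2), "scheduled": date(2016, 5, 2), "cancelled": False},
    {"initiated": date(2016, 5, 2), "scheduled": date(2016, 5, 6), "cancelled": True},
    {"initiated": date(2016, 5, 3), "scheduled": date(2016, 5, 4), "cancelled": False},
]

# Percentage of referrals with an appointment created within one day of initiation
# (calendar days here; the team used business days).
within_one_day = sum(
    1 for r in referrals if (r["scheduled"] - r["initiated"]).days <= 1
) / len(referrals) * 100

# Cancellation rate for new patient referrals.
cancellation_rate = sum(r["cancelled"] for r in referrals) / len(referrals) * 100

print(f"{within_one_day:.0f}% within one day, {cancellation_rate:.0f}% cancelled")
```

Tracking both metrics together matters: scheduling faster is only a win if it also pulls the cancellation rate down, as it did in the pilot.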

After the implementation of the algorithm toolkit, the percentage of Endocrine Center patients with an appointment created within one day of referral rose to 69%.  The cancellation rate for new patient referrals to the center fell from 53% to less than 15%.

Based on the results of the pilot study, the project scope expanded to include the remaining access centers.  The institutional rollout began in November 2014, and as of May 2016, 10 centers have been completed and four are in progress.  Preliminary data analysis has shown a similar trend, with an increase in the percentage of patients with an appointment created within one day of referral.  The team continues to monitor this metric as well as the referral cancellation rate.  In addition to improving patient satisfaction and creating a more consistent experience, the project is expected to aid staff training and retention, give physicians more time to focus on patient care, and help the institution capture a higher percentage of targeted patients.

Organization: The University of Texas MD Anderson Cancer Center

Team Members:

Freddie Weiss (B.S., ISE, Georgia Tech)
Healthcare Systems Engineer, Quality Measurement and Engineering

Beth Garcia
Director, Process Improvement & Quality Education

Anita Ying, M.D.
Associate Professor, Endocrine Neoplasia and HD

Laura Burke
Healthcare Systems Improvement Specialist, Quality Measurement and Engineering

Gwen Tate
Clinical Administrative Director, Brain & Spine Center

Industrial and systems engineers provide incredible value to any organization in any industry, and I am really excited to share these stories and inspire you and your company to hire ISEs.

Blessings to you all!

Best Regards,
Michael Foss
President, Institute of Industrial and Systems Engineers


AI and IISE – Why AI Can’t Do Our Jobs

As an engineer with experience in IT, I’ve watched artificial intelligence grow from spell checkers to the AIs running latent semantic indexing of all content indexed by Google.

Use of AI in product design via evolutionary mixing and competition is already being tried, and AI is being used to solve problems in the real world. Artificial intelligence is being tasked with studying the symptoms of unusual cases to try to find potential causes. Think of the rare syndromes that most doctors will never encounter, and the patients who visit dozens of specialists before getting a diagnosis. Worse are the ones who never receive a proper diagnosis and struggle through treatments for their symptoms because they don’t know the real root cause.

Will AIs be asked to solve manufacturing- and process-related problems by searching databases? Let’s look at the major reasons why they won’t in the near future.

Intellectual Property Concerns

Few companies are going to share information on how they make their products, the problems they’ve encountered and how they solved those issues. Outsourcing to Asian nations with weaker IP protections has already led to factories taking designs, work instructions and parts lists, carrying them to the factory across the street and making a rival product without paying the royalties owed.

Liability Concerns

Companies would rather bring in many experts bound by confidentiality agreements than use an artificial intelligence tool whose usage could be leaked, Wikileaks-style, or reported by accident as part of someone’s exhibition of its abilities. They certainly won’t share data that admits potential liability on their part, and any examples shared will never point the finger at themselves. This will result in large omissions of useful data and skewed root cause results in queries against it.

Incomplete Data

The data most likely to be shared are the happy stories, the industry white papers that say how great your company did solving this problem. The failures that provide the most useful lessons learned, the advice on what not to do, are least likely to be shared unless the failures are long past. Many other lessons learned won’t be published at all because the companies that experienced them have closed. The negative outcomes that are reported may be sanitized to minimize root causes that make the company look bad. Incomplete data will limit the effectiveness of any data mining to solve manufacturing- and process-related problems.

Think about how long it took for the severity of medical errors to be fully realized, identified in 2016 as the third leading cause of death in the United States. Then imagine the under-reporting of data to make various companies look good, because they don’t have the public good of saving lives as a motivator for fully detailed and honest reporting of problems they’ve encountered along with the solutions.

Poor Quality of Data

We’d also face the risk of data quality being degraded by crediting the wrong solutions. For example, you’d see case studies crediting improved team function to diversity training instead of quality circles and inter-departmental knowledge sharing. The latest management fad would receive the credit instead of classic Lean engineering principles applied after value stream mapping and cutting the waste.

The ability to correlate the best solution would also be hindered by the renaming of classic problem-solving methods. In one interview, I was asked if I knew CIP, a continuous improvement project methodology. Yes, I know Six Sigma and have completed dozens of projects. “But is it CIP?” I was asked. That this concept goes back to Deming’s plan-do-study-act cycle was irrelevant to the questioner. The name had changed, so it must be different and better. The mistaken belief that newer is better and the novel superior to the common leads to re-branding of classic methods – and to reports that credit the newly named methods management sent staff to training on, rather than the old names. The end result is that an AI could fail to associate all these different names for the same concepts going back decades to Dr. W. Edwards Deming and Frederick Taylor.
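The association problem can be made concrete with a small normalization table. The alias list below is an illustrative assumption, simply showing how several re-branded names for the same methodology could be mapped back to one canonical concept before any correlation is attempted.

```python
# Illustrative alias table: many names, one underlying methodology.
# The entries are assumptions for the sketch, not an authoritative taxonomy.
CANONICAL = {
    "cip": "continuous improvement (PDSA)",
    "continuous improvement": "continuous improvement (PDSA)",
    "pdsa": "continuous improvement (PDSA)",
    "plan-do-study-act": "continuous improvement (PDSA)",
    "kaizen": "continuous improvement (PDSA)",
}

def normalize(term: str) -> str:
    """Map a reported methodology name to its canonical concept, if known."""
    return CANONICAL.get(term.strip().lower(), term)

print(normalize("CIP"))     # → continuous improvement (PDSA)
print(normalize("Kaizen"))  # → continuous improvement (PDSA)
```

Without this kind of curation, done by someone who knows the field's history, reports filed under "CIP," "kaizen" and "PDSA" would look like three unrelated methods to a data miner.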


Combine under-reporting of why problems occur and the scarcity of problem reports with deliberate misclassification of causes driven by public relations and liability concerns, plus incorrect identification of solutions, and it is unlikely artificial intelligence can mine this data as effectively for solving industrial and systems engineering problems as it is being used in other areas.

On the upside, it does mean industrial and systems engineers can look forward to a long and productive career even as AIs challenge many other areas of knowledge work.


Meet Carl & Jake Kirpes

Industrial engineering success stories
A series of blogs presented by IISE President Michael Foss


ATTN: Business and company leaders who aspire to excel,

Meet Carl & Jake Kirpes. Carl is VP of Operations at GENESYS, with a B.S. in Industrial & Mechanical Engineering and an M.S. in Systems Engineering from Iowa State University. Jake is a Business Strategist & Engineer at TPG Companies, with a B.S.E. and B.A. in Industrial Engineering and Finance from the University of Iowa.

How Industrial Engineering Saved a Company!

“Industrial and systems engineering has the ability to transform companies, industries, and society as a whole. In 2012, Carl Kirpes arrived at GENESYS to find a company fighting its way back out of the recession, but doing so without the tools that could dramatically increase the rate of success. GENESYS, an engineer-procure-construct firm that designs, builds, and installs manufacturing and assembly equipment/lines/facilities for blue chip manufacturers, had survived the downturn in the economy and was looking for innovative approaches to improve the business’s future. The application of industrial and systems engineering was the answer.

Although industrial and systems engineering has a long history in multiple industries, others such as healthcare and construction offer emerging opportunities for industrial and systems engineers. In such industries, the application of basic industrial and systems engineering principles is multiplicative (rather than additive) in its enhancement of the business. At GENESYS, Carl applied basic industrial and systems engineering principles such as process flow mapping, establishing a work measurement system, and developing forecasting resource allocation methods to drive the book value of the company up by 400%, shift 80% of the projects to positive cash flow, and achieve over one million man hours without a lost-time incident.

You can read the full article about this transformation titled “How industrial engineering saved a company.”

Carl has been a member of the Institute of Industrial and Systems Engineers for five years. He has learned a great deal at the annual conferences and has been able to apply the lessons learned to achieve increased results in his business and community ventures. In addition, he has met a number of other great individuals working across various industries with multiple job titles, all applying the concepts of industrial and systems engineering to do the same within their circles of influence. As a member of the Council on Industrial and Systems Engineering and the chair of the Industry Advisory Board, Carl would be happy to talk with any individual interested in becoming more involved in how we can utilize industrial and systems engineering to transform industry as a whole through the collaboration of each of our circles of influence. Carl can be reached through a message on LinkedIn.”

Industrial and systems engineers provide incredible value to any organization in any industry, and I am really excited to share these stories and inspire you and your company to hire ISEs.

Blessings to you all!

Best Regards,
Michael Foss
President, Institute of Industrial and Systems Engineers


Dilbert and the Art of Lean

In his book “How to Fail at Almost Everything and Still Win Big,” Scott Adams argues for the value of lean as a principle in everything from decision making to the systems one uses in business and private life.

Scott Adams’ advice on lean decision making begins with choosing the simple plan whenever you have a choice: “If you can’t tell whether to pick a simple plan or complicated plan, choose the simple one … simple tasks are easier to manage and control.”

Scott Adams uses the term simplicity where industrial and systems engineers would say lean. But he’s spot on that simple, or lean, processes reduce risk simply by reducing the number of possible failure points in the decision-making process. Add poka-yoke, or mistake proofing, to the plan, and you set up something akin to statistical process control chart limits that say, “when it goes outside these lines, do X.”
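The "outside these lines, do X" rule is exactly how a Shewhart-style control chart works. Here is a minimal sketch, assuming conventional 3-sigma limits and made-up process numbers:

```python
# Minimal Shewhart-style control check: a point outside the 3-sigma
# limits triggers action. Mean and sigma values are illustrative.

def out_of_control(measurement: float, mean: float, sigma: float) -> bool:
    """Return True when a point falls outside the 3-sigma control limits."""
    upper = mean + 3 * sigma  # upper control limit
    lower = mean - 3 * sigma  # lower control limit
    return measurement > upper or measurement < lower

# With a process mean of 10.0 and sigma of 0.2, the limits are 9.4 and 10.6.
print(out_of_control(10.7, mean=10.0, sigma=0.2))  # outside the limits → True
print(out_of_control(10.1, mean=10.0, sigma=0.2))  # inside the limits → False
```

The appeal is the same as Adams' simplicity argument: the operator doesn't need to understand the statistics, only to follow one unambiguous rule when a point crosses the line.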

“Human nature … we’re good at following simple systems and not so good at following complicated systems.”

This fact is too often overlooked in process flows. What do you do when you’re trying to fill out government forms and take them to all the right people for approval? What do you do when the process flow chart for a PDM system looks like a spaghetti chart? In any application where you can design (or are redesigning) a system, aim for simplicity. This means making user interfaces, and the workflows the interfaces rely on, as simple as possible.

Customer service processes need to be simple, with clearly defined off-ramps for how and when someone is handled as an exception. Simpler processes mean shorter training times for staff and users, less confusion when exceptions arise and fewer opportunities for errors or for abuse by someone going around the approval process just to get things done.

“Simple systems are probably the best way to achieve success. Once you have success, optimizing begins to have value.”

This truism is closely related to the saying that the most successful complex systems arise from successful simple ones. First make a simple system that works most of the time, then add the off-ramps for various exceptions or escalation points for when something doesn’t fit the standard workflow. Building a very complicated system to handle every exception or concern up front risks creating one that doesn’t work at all.

First, design a system that works right most of the time, one as simple as possible. When you have a lean process, it is easier to adjust and adapt to real-world conditions than a complicated one. Only after you have a system as simple as possible that works in the real world is optimizing a good option – and it is more likely to be effective because you aren’t trying to optimize a complex system that is inefficient because it wasn’t a good fit to the users’ needs in the first place.

“Successful people and businesses have the luxury of being able to optimize to perfection over time.”

The first implication of this statement is that your perfect process isn’t perfect if you spend so much time planning it that you never implement it. The second is that a very complex plan that fails in implementation is a failure. In contrast, think of all the successful businesses that began as a simple flow chart on a restaurant napkin, such as Southwest Airlines’ triangle showing flights from Dallas to San Antonio/Austin to Houston. When you have a simple working system, you’re working – and generating revenue. When you have money flowing in, you have the resources to devote to improving the process or expanding on it. When you are struggling for survival, perfection is a luxury.

That concept is why Lean has grown in importance over Six Sigma and other quality standards since the Great Recession – perfection is a luxury, while using less material and labor is a cost-saving measure, and eliminating waste in operations may increase production for relatively little money. Scott Adams concurs, stating in his book, “Another advantage of simplicity is that it frees up time, and time is one of your most valuable resources in the world.”

Lean’s attraction is the potential savings and impact on the bottom line, whereas spending money to make better items or services in a market that may not pay more for the higher quality doesn’t make economic sense.