
Rick Ford, BS, RRT, FAARC

The Pitfalls of Benchmarking

EDITOR'S NOTE This is the first article in a new column designed by the AARC Benchmarking Committee to inform and educate both RC staff and managers on the use and benefits of benchmarking the delivery of respiratory services.

Benchmarking in respiratory care is easily misinterpreted because the units of service are not well defined and qualitative information about the department or the nature of its services is not readily available.



So what's an administrator to do? Labor, pharmaceutical, and supply costs spiral upwards while reimbursement is unable to keep pace with inflationary pressures in health care. Hospitals struggle with difficult decisions; but when the cost per patient day exceeds net revenue per patient day, something has to change. In the past 15 years, hospital decision makers have increasingly relied on benchmarking measures to assess where and how deep to cut labor expense. In addition, consultant firms typically use benchmarking metrics to identify opportunities to reduce both staff and supply costs. The good news is that the era of across-the-board cuts is largely a thing of the past. The bad news is that the use of benchmarking data can result in even deeper cuts to well-run respiratory care departments.

I recently attended one of those budget-planning meetings where we learned about the tenuous financial condition my organization was anticipating. We were informed that next year poses the challenge of several million dollars of expense reductions and that benchmarking data would be a key tool in determining department-specific cuts. The administrator distributed a benchmarking summary report of "savings opportunity" based on bringing our paid hours per unit of service metric to the 50th percentile within our compare group. For respiratory services, the data indicated we should eliminate 48 of our 56 FTEs; and for the pulmonary function lab, eliminate 6.7 FTEs in a department that employs only 5.5 FTEs. It was apparent that the "savings opportunity" for both respiratory services and the pulmonary function lab was not feasible and that there were "problems" with the way comparative data was being captured and reported.

While our administration realized that reasonable budget decisions could not be made based on the benchmarking program being reported for respiratory care, others may not realize when such data is suspect. In some situations this can result in decisions that place the provision of quality services at great risk. This scenario unfortunately is played out across the country, and caregivers are paying the price when misdirected reductions in resources are made.
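In case the arithmetic behind such a report is opaque, here is a minimal sketch of how a percentile-based "savings opportunity" is typically derived. The numbers, function names, and peer values are hypothetical assumptions for illustration only (2,080 paid hours per FTE, a simple median as the 50th percentile), not any vendor's actual product.

```python
from statistics import median

# Minimal sketch of a percentile-based "savings opportunity" calculation.
# All names and numbers are hypothetical, not from any vendor's product.

PAID_HOURS_PER_FTE = 2080  # assumed annual paid hours for one FTE


def savings_opportunity(paid_hours, units_of_service, peer_hours_per_uos):
    """Return (current FTEs, FTEs the report says to eliminate) when the
    department is pushed to the 50th percentile of its compare group."""
    target_rate = median(peer_hours_per_uos)        # peer paid hours per UOS
    allowed_hours = target_rate * units_of_service  # hours "justified" by volume
    excess_hours = paid_hours - allowed_hours
    return paid_hours / PAID_HOURS_PER_FTE, excess_hours / PAID_HOURS_PER_FTE


# Peers that report units of service inconsistently (e.g., hours of continuous
# therapy vs. days) drag the median down and inflate the "opportunity."
current_fte, cut_fte = savings_opportunity(
    paid_hours=56 * PAID_HOURS_PER_FTE,             # a 56-FTE department
    units_of_service=40_000,                        # its reported annual UOS
    peer_hours_per_uos=[0.4, 0.5, 0.6, 2.9, 3.2],   # hypothetical compare group
)
print(f"{current_fte:.0f} FTEs on staff; report says eliminate {cut_fte:.1f}")
```

Nothing in this calculation checks whether the peers counted their units of service the same way, which is exactly the problem described in the sections that follow.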

The problem

Benchmarking in respiratory care can easily be misinterpreted: the units of service are not well defined, qualitative information about the department or the nature of its services is not readily available, and mechanisms to collect accurate information are labor intensive, subjective, and often confusing. Further, existing benchmarking products may not gather respiratory care demographic data that is specific enough to enable the manager to select appropriately similar facilities for comparison. In fact, the comparison groups used by hospital finance departments are typically set up using hospital demographics rather than the characteristics of respiratory care departments.

"

Benchmarking for Success

More problems

Metrics that use patient days as a driver of expenses have been a disservice in benchmarking respiratory care, considering that the clinical acuity and demand for respiratory services vary greatly from patient to patient. Benchmarking firms have, therefore, shifted to counting billable services and/or capturing specific bedside activity. Multiple concerns have emerged in the attempts to count these "units of service." The issues range from the inconsistent application of Current Procedural Terminology (CPT) coding, variations in the billing system, and the inability to capture activity not charged, to the failure to capture the information being requested. In addition, even when the data is available and captured, it is often reported differently between centers. A primary example is in the reporting of continuous procedures. Most benchmarking firms request that data be reported as "patient days"; however, charge systems may capture continuous modalities per hour. In such cases, the hours are converted to days by dividing the total by 24, resulting in a significant undercounting of days reported.1 While some departments (which bill continuous modalities by the hour and convert by dividing by 24) will undercount their activities, other departments bill per hour and simply do not convert their hourly information into days. They will report actual hours of continuous services rather than days and will grossly over-report continuous units.
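To see how far apart the reporting conventions land, here is a small illustration using hypothetical episode data. The "calendar days touched" rule is one common reading of patient days for continuous therapy, offered only as an assumption; it is not the normalization procedure from the reference.

```python
from datetime import datetime, timedelta

# Hypothetical continuous-ventilation episodes: (start time, hours delivered).
episodes = [
    (datetime(2006, 5, 1, 22, 0), 30),  # crosses two midnights
    (datetime(2006, 5, 3, 6, 0), 10),   # contained in one calendar day
]

# Convention 1: report raw hours (grossly over-reports "days").
raw_hours = sum(hours for _, hours in episodes)

# Convention 2: divide total hours by 24.
hours_div_24 = raw_hours / 24

# Convention 3: count each calendar day on which any continuous service ran.
days_touched = set()
for start, hours in episodes:
    end = start + timedelta(hours=hours)
    day = start.date()
    while day <= end.date():
        days_touched.add(day)
        day += timedelta(days=1)

print(f"reported as hours:        {raw_hours}")          # 40
print(f"reported as hours / 24:   {hours_div_24:.2f}")   # 1.67
print(f"reported as patient days: {len(days_touched)}")  # 3
```

The same two episodes land in a benchmarking report as 40, 1.67, or 3 units depending purely on the convention a department happens to use.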

Another issue that becomes apparent when assessing benchmark data is that the centers that are most progressive in achieving efficiencies and best practices have difficulty "stacking up" against those that have not adopted new practices. For example, the department checking ventilators every two hours will report 12 units of service for each ventilator day, while the progressive department that has redesigned the care delivery model and checks ventilators every eight hours will report only three units of service in the same period. The department that has protocols fully implemented may perform only six to eight CPT treatments on a patient over the post-op course, while the department without protocols may perform 16 to 20 CPT treatments in a similar situation. Doing a lot of unnecessary and frequent activity can produce a large number of "units of service," resulting in a lower cost per unit for a department that is actually less efficient. Achieving new efficiencies, lowering cost, and improving quality is not always apparent in the metrics being reported to your administration. In order to sort out these situations, you need access to your compare group's profiles and the actual data they reported. Unfortunately, the commercial systems may not make it easy to access this detail, and it becomes difficult to determine whether someone is really doing something better or whether there is a perfectly good explanation for the variance.
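A back-of-the-envelope comparison makes the distortion concrete. Assuming, purely for illustration, that both departments spend the same on labor per ventilator day, the department that checks more often reports four times the units and one quarter of the cost per unit:

```python
# Hypothetical comparison of two departments with the same daily labor cost.
# Frequent, possibly unnecessary activity inflates units of service and
# makes the less efficient department look cheaper per unit.

def units_and_cost_per_unit(daily_labor_cost, checks_per_vent_day, vent_days):
    units = checks_per_vent_day * vent_days
    return units, (daily_labor_cost * vent_days) / units


for label, checks in [("checks every 2 hours", 12), ("checks every 8 hours", 3)]:
    units, cpu = units_and_cost_per_unit(
        daily_labor_cost=1200, checks_per_vent_day=checks, vent_days=100
    )
    print(f"{label}: {units} units of service, ${cpu:.2f} per unit")
```

By the cost-per-unit metric alone, the department doing the redundant work appears to be the better performer.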

Test Drive the AARC Benchmarking Program for Free

Department managers are invited to register for and use the AARC Benchmarking Program free of charge until July 1, 2006. This is a great opportunity to take this extraordinary system for a test drive at no cost. To register for the Benchmarking Beta Testing Group, please go to this page: http://services.aarc.org/scriptcontent/CM_Application_Bench/CMA_Login.cfm.

Start with your AARC member number

Since you are an AARC member, simply enter your AARC member number. Your record will be accessed and will be associated with a new record that you're creating for your facility. This new number, called the "Benchmark ID," is your facility's ID number for this project and will be your login to access the benchmarking program. You will be assigned a password, which will be your login plus your first and last initials.

Read your e-mail

First, you will receive an e-mail from the AARC confirming your registration. Second, you will also receive an e-mail from DeVore Technologies (the developers of the technology platform for this program). This e-mail will confirm that they have received your registration data and will provide you with the URL where you can access the benchmarking program using your assigned login and password. It will also have an attached PDF file containing a "Benchmarking User's Manual." This manual provides you with step-by-step instructions on how to use the program to:
· Create/edit your hospital profile
· Enter your benchmarking data
· Set up comparison reports utilizing comparison groups
· Generate reports using selected comparison groups.

Plus, you will receive a spreadsheet to help you organize your data for entry, along with instructions from DeVore about reporting problems or requesting assistance during the beta-testing phase using their "Bug Tracker" program. For more information, contact Bill Dubbs at [email protected] (aarc.org) or (972) 243-2272.



A new tool for managers

The AARC Benchmarking Project is underway to address many of the current concerns. The purpose of the project is to develop a Web-based tool to facilitate the collection and dissemination of benchmarking information among respiratory care departments in hospitals and other health care organizations. Specific goals of this project include, but are not limited to, providing the following:
· A means for collecting, comparing, and reporting data on key metrics of operations, productivity, and patient care outcomes to identify benchmarks and their underlying best practices
· Access for subscribers of the service to "User Profiles" that detail the structure and practice of other departments
· A nexus for communication related to quality improvement activities among users
· Educational resources on the subject of benchmarking and process improvement, along with links to related sites
· Resources for data mining to support related research efforts.

The Benchmarking Committee consists of Karen J. Stewart, MS, RRT, FAARC; Wade Jones, BS, RRT, FAARC; Robert Chatburn, BS, RRT-NPS, FAARC; Rick Ford, BS, RRT, FAARC (chair); and Bill Dubbs, MHA, MEd, FAARC (AARC staff liaison). Keep a watch at www.aarc.org and in future issues of AARC Times for additional information on the AARC Benchmarking Program and how to utilize the information to achieve best practice in the provision of respiratory care services.

Rick Ford is director of respiratory services at the University of California San Diego Medical Center in San Diego, CA, and chairs the AARC Benchmarking Committee.

REFERENCE 1. Chatburn, R.L., & Ford, R.M. (2006). Procedure to normalize data for benchmarking. Respiratory Care, 51(2), 145-157.
