While healthcare has been evolving for centuries, no accelerant has been quite as powerful as health technology. Let’s take a little journey through time to review that history – and maybe even peer into the future a bit.
Medical advances such as the stethoscope, the autoclave, vaccines, and the Human Genome Project have had a profound impact on patient outcomes. However, the integration of computer technology has taken healthcare to new heights, transforming not only the way we practice medicine, but also how we collaborate, research, share information, get paid, and even spend our time at work. In fact, it’s hard to imagine what healthcare was like before computers became an integral part of our lives.
A Little Trip Down Health Technology Memory Lane
1960s
One thing medical practices did before computers was spend a lot of time filing and retrieving information.
In the 1960s, computers entered the scene in health information management (HIM) as a way to standardize and share medical records. Part of this push for standardization was driven by the introduction of Medicare and Medicaid in 1965, which required reliable record keeping for payments. At this point, computers were huge room-sized mainframes, and their expense prevented all but a few of the largest institutions from using them. One of the first adopters was the Mayo Clinic.
1980–2000
These record-keeping systems, while useful, would soon be revolutionized by the advent of the internet. Over the next two decades, from 1980 to 2000, the internet evolved from the first website (hosted at CERN in Switzerland) to AOL and 56K modems, and then to laptop portability and wireless networking.
Some of the important developments in medicine that can be attributed to the internet are:
- Widespread access to health information. Physicians were no longer the only source of information – WebMD and other sites offered a broad range of topics for consumers to learn from. Part of this evolution was learning to discern accurate information from unreliable sources.
- The Master Patient Index (MPI) was introduced in the 1980s to keep track of patient data across systems (see the sketch after this list). Other standardization efforts soon followed, including the Health Information Exchange (HIE), still in use today.
- The World Health Organization (WHO) adopted the ICD-10 coding system in 1990, and member states began using it in 1994, standardizing diagnosis codes for medical records (and billing); for example, E11.9 denotes type 2 diabetes without complications.
- In 2000, the Human Genome Project (launched in 1990) announced a working draft of the human genome sequence, accelerating the rise of personalized medicine that we are seeing now. Mapping and identifying the roughly 25,000 genes in the human genome would have been an insurmountable task without computing technology.
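To make the MPI idea concrete, here is a minimal Python sketch of how such an index might link the different medical record numbers (MRNs) a patient accumulates at different facilities to one enterprise-wide identifier. The class, the identifiers, and the naive name-plus-birthdate matching rule are all hypothetical; production MPIs use probabilistic matching across many demographic fields.

```python
# Minimal sketch of a Master Patient Index (MPI): it links the different
# medical record numbers (MRNs) a patient accumulates across facilities
# to a single enterprise-wide identifier. All names and IDs are made up.

class MasterPatientIndex:
    def __init__(self):
        self._next_eid = 1
        self._by_local_id = {}   # (facility, local MRN) -> enterprise ID
        self._demographics = {}  # enterprise ID -> demographic record

    def register(self, facility, local_mrn, name, dob):
        """Return the enterprise ID for this patient, creating one if needed."""
        key = (facility, local_mrn)
        if key in self._by_local_id:
            return self._by_local_id[key]
        # Naive match on name + date of birth; real MPIs use probabilistic
        # matching across many demographic fields to avoid false merges.
        for eid, demo in self._demographics.items():
            if demo == (name, dob):
                self._by_local_id[key] = eid
                return eid
        eid = self._next_eid
        self._next_eid += 1
        self._demographics[eid] = (name, dob)
        self._by_local_id[key] = eid
        return eid

mpi = MasterPatientIndex()
a = mpi.register("Clinic A", "MRN-1001", "Jane Doe", "1980-04-02")
b = mpi.register("Hospital B", "MRN-77", "Jane Doe", "1980-04-02")
assert a == b  # the same patient is recognized across two facilities
```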
Together, these developments in information access and standardization laid the foundation for future advances in healthcare.
2000
In its report To Err Is Human, the Institute of Medicine (IOM) estimated that “between 44,000 and 98,000 hospitalized Americans die each year as a result of preventable medical errors.” One of the report’s immediate recommendations was “Error-Reducing Technology,” as it found that “computerized physician order entry (CPOE) can reduce errors by 55 to 86 percent.” This helped prompt the federal government to support planning to give Americans access to electronic health records within 10 years.
2008
Electronic health records were becoming more common, with varying degrees of success. Issues identified at the time included weak security, high cost, and a lack of consistent standards. While EHRs certainly saved on filing costs and made records more searchable, the true benefits of digital information had yet to be realized.
2009
The American Recovery and Reinvestment Act (ARRA) laid the groundwork for the Meaningful Use program, focusing on improving the affordability of healthcare, and set a goal of nationwide adoption of electronic records by 2014.
The Health Information Technology for Economic and Clinical Health (HITECH) Act was enacted as part of ARRA. It addressed privacy and security concerns associated with the electronic transfer of protected health information (PHI) and strengthened the civil and criminal enforcement of HIPAA rules.
ARRA also promoted health information technology (HIT) directly, funding HIT testing programs, grants, and loans, and providing for monetary incentives through CMS.
2015
By 2015, EHR adoption was nearly universal, with 96% of hospitals and 87% of physician practices using electronic medical records. However, issues remained with data sharing, safety, and preventing medical errors, bringing the topic of interoperability to the forefront.
Where We Are Now
The healthcare system has become an ecosystem – one whose balance relies on health technology to thrive.
- Portability of healthcare data and interoperability remain challenges, especially as telehealth becomes more common and patients are less tied to a single organization.
- Tools to analyze big data are becoming more common as AI and machine learning applications enter the market.
- Patients and clinicians want access to health records on the go – via any device, anywhere, and at any time.
- EHR interfaces should be more intuitive, with data flowing smoothly from one application to another via secure APIs (a minimal sketch follows below).
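To make that last point concrete, here is a minimal Python sketch of retrieving a patient record over a secure API. It uses the FHIR standard as the example, since FHIR is the dominant healthcare API standard today – though the article above doesn’t name it, and the server URL, access token, and patient ID below are all hypothetical.

```python
# Minimal sketch: fetch a patient record from a (hypothetical) FHIR server
# over HTTPS, authenticating with a bearer token. FHIR defines standard
# RESTful endpoints such as /Patient/{id} and a standard JSON media type,
# which is what lets data flow between independent applications.
import requests

FHIR_BASE = "https://fhir.example-hospital.org/R4"  # hypothetical server
TOKEN = "access-token-from-an-oauth-flow"           # hypothetical credential

response = requests.get(
    f"{FHIR_BASE}/Patient/12345",  # hypothetical patient ID
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Accept": "application/fhir+json",  # FHIR's standard JSON media type
    },
    timeout=10,
)
response.raise_for_status()

patient = response.json()
print(patient.get("name"))  # structured data any conforming app can parse
```

Because the response is structured, standards-based JSON rather than a proprietary export, any conforming application can consume it – which is exactly the interoperability goal described above.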