Healthcare colored with blockchain’s open-source foundation

Self-sovereignty and identity anonymity hold the code to unlock the potential for blockchain to change patient health. The architecture of the internet has changed forever.

Technological change forces economic growth. Technology extends the science of discovery and produces artifacts used in everyday life. It’s the small technical discoveries that make larger scientific endeavors possible. It’s also these seemingly unrelated breakthroughs that make their way into our daily lives.

Apparently insignificant discoveries become significant

In the 1960s, NASA conducted an extensive test program to investigate the effects of pavement texture on wet runways. The goal was to better understand braking performance and reduce the incidences of aircraft hydroplaning. The result of years of technical scientific studies was that, in 1967, grooving of pavement became an accepted technique for improving the safety of commercial runways.

One of the first runways to feature safety grooving was the Kennedy Space Center’s landing strip. However, the applications of this technique extended well beyond NASA facilities. According to the Space Foundation, safety grooving was later included on such potentially hazardous surfaces as interstate highway curves and overpasses; pedestrian walkways, ramps and steps; playgrounds; railroad station platforms; swimming pool decks; cattle holding pens; and slick working areas in industrial sites such as refineries, meat-packing plants and food processing facilities.

If you asked a cattle rancher in 1970 if his work would be affected by NASA’s research on braking patterns exploring ground vertical load, instantaneous tire ground friction coefficient or free-rolling wheel angular velocity, the answer would probably have been an emphatic “not a chance.” Likewise, if you had told the workers on a road crew in 1970 that they’d be spending many years of their lives adding grooves to the surfaces of existing highways, bridges, and exit ramps, their response would have been less than welcoming. It would have been impossible to convince these professionals of the coming changes.

The impact of technology on daily life starts with scientific and technological discoveries that initially appear isolated or narrow in context. But we know better.

5 MIT projects to watch

The MIT Internet Trust Consortium, established in 2007, focuses on developing interoperable technologies around identity, trust and data. The consortium’s mission is to develop open-source components for the Internet’s emerging personal data ecosystem, in which people, organizations, and computers can manage access to their data more efficiently and equitably. That goal fits in nicely with the growth of blockchain technologies.

Currently, there are five cutting-edge MIT projects that could change the future of the internet: MIT ChainAnchor (permissioned blockchains), Project Enigma (autonomous control of personal data), OpenPDS2.0 (a personal metadata management framework), DataHub (a platform for collaboratively analyzing data) and Distributed User Managed Access Systems (DUMAS) (a protocol for authorizing and accessing online personal data).

The white papers for each project are interesting to read, and viewed with a healthcare mindset, their applications to health and wellness are easy to imagine. Here is a closer look at three of them:

MIT ChainAnchor

The proposed permissioned blockchain system stands in contrast to permissionless, public blockchains such as Bitcoin. The system addresses identity and access control within shared permissioned blockchains, providing anonymous but verifiable identities for entities on the blockchain.

When applied to healthcare, ChainAnchor could, for example, enable participants in a medical study to maintain their privacy by letting them use a verifiable anonymous identity when contributing (executing transactions on the blockchain).
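
As a greatly simplified sketch of that idea (not ChainAnchor’s actual zero-knowledge protocol), a permissioning authority could verify a participant off-chain and issue a pseudonym that stands in for any real identity on the shared ledger. The class, method names, and enrollment flow below are hypothetical and only illustrate the “anonymous but verifiable” pattern.

```python
# Greatly simplified illustration (not ChainAnchor's actual protocol):
# a permissioning authority verifies a participant off-chain, then issues a
# pseudonym that appears on the shared ledger in place of any real identity.
import hashlib
import secrets

class PermissioningAuthority:
    """Hypothetical registrar for a permissioned research blockchain."""
    def __init__(self):
        self._registry = {}          # real identity -> pseudonym (kept off-chain)

    def enroll(self, real_identity: str) -> str:
        salt = secrets.token_hex(16)  # random salt so pseudonyms can't be guessed
        pseudonym = hashlib.sha256(f"{real_identity}:{salt}".encode()).hexdigest()
        self._registry[real_identity] = pseudonym
        return pseudonym

    def is_enrolled(self, pseudonym: str) -> bool:
        # The chain can verify the pseudonym belongs to *some* approved member
        # without learning which one.
        return pseudonym in self._registry.values()

authority = PermissioningAuthority()
alias = authority.enroll("Jane Doe, study participant #12")
print(alias)                          # what the other chain members see
print(authority.is_enrolled(alias))   # True: verifiable, yet anonymous on-chain
```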

Project Enigma

Enigma is a peer-to-peer network that allows users to share their data with cryptographic privacy guarantees. The decentralized computational platform enables “privacy by design.” The white paper says that, for example, “a group of people can provide access to their salary, and together compute the average wage of the group. Each participant learns their relative position in the group, but learns nothing about other members’ salaries.” Sharing information today is irreversible; once shared a user is unable to take that data back. With Enigma, data access is reversible and controllable. Only the original data owners have access to raw data.
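
To make the salary example concrete, here is a minimal additive secret-sharing sketch. It is illustrative only: Enigma’s real protocol is far more sophisticated, and the salaries, node count, and helper names below are invented for the example. The point is that each compute node sees only meaningless shares, yet the group average can still be computed.

```python
# Minimal additive secret-sharing sketch (illustrative only; Enigma's real
# protocol is far more involved). Each salary is split into random shares,
# shares are summed per node, and only the aggregate is ever reconstructed.
import random

PRIME = 2**61 - 1                     # arithmetic is done modulo a large prime

def split(value: int, n_nodes: int) -> list[int]:
    shares = [random.randrange(PRIME) for _ in range(n_nodes - 1)]
    shares.append((value - sum(shares)) % PRIME)   # shares sum to the value
    return shares

salaries = [72_000, 95_000, 61_000, 88_000]        # never revealed individually
n_nodes = 3

# Each participant sends one share to each compute node.
node_totals = [0] * n_nodes
for salary in salaries:
    for node, share in enumerate(split(salary, n_nodes)):
        node_totals[node] = (node_totals[node] + share) % PRIME

# Nodes publish only their totals; combining them yields the group sum.
group_sum = sum(node_totals) % PRIME
print("average salary:", group_sum / len(salaries))   # 79000.0
```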

In the context of healthcare, patients could share information regarding personal genomics linked to disease registries and clinical treatments aligned to healthcare outcomes, knowing that their raw data was never exposed.

OpenPDS2.0

OpenPDS introduces SafeAnswers, an innovative way to protect metadata (application, document, file or embedded) at an individual level. As the white paper explains, SafeAnswers “allows services to ask questions whose answers are calculated against the metadata instead of trying to anonymize individuals’ metadata.” SafeAnswers gives individuals the ability to share their personal metadata safely through a question-and-answer system. Previous mechanisms for storing personal metadata (cloud storage systems and personal data repositories) don’t offer data aggregation mechanisms. The challenge is that once access is enabled, the data is broadly accessible. SafeAnswers reduces the dimensionality of the metadata before it leaves the safe environment, thereby helping ensure the privacy of the data.

Healthcare metadata examples could include the patient account number, patient first and last name, and date of admission. Healthcare research could benefit from using aggregated metadata from patients without sharing the raw data. Research entities would send questions to an individual’s personal data store (PDS), and the PDS would respond with an answer. Today, if metadata were provided to researchers or accessed by a phone application, the patient could disable (uninstall) the app but wouldn’t know what information had been shared. With the SafeAnswers system, a patient would instead use their PDS URL to grant access to the health app. All of the patient’s metadata could therefore be tracked and recorded, visible to the patient. Later, the patient could review the metadata the application was using to draw inferences about them. The patient could also cut off the metadata the application was consuming by limiting or permanently restricting future access. No trusted third party. No entity to monitor access. Anonymously shared data.
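
A toy sketch of that question-and-answer pattern might look like the following; this is hypothetical code, not the openPDS or SafeAnswers API, and the question names and metadata fields are invented. The raw metadata stays inside the personal data store, and only a reduced answer leaves it.

```python
# Toy personal data store (PDS) in the spirit of SafeAnswers (hypothetical code,
# not the openPDS API): researchers submit a question, the PDS computes the
# answer locally, and only the reduced answer leaves the safe environment.
class PersonalDataStore:
    def __init__(self, metadata: list[dict]):
        self._metadata = metadata            # raw metadata never leaves this object

    def ask(self, question: str):
        if question == "admissions_last_year":
            return sum(1 for m in self._metadata if m["type"] == "admission")
        if question == "has_recent_admission":
            return any(m["type"] == "admission" for m in self._metadata)
        raise ValueError("question not permitted by the data owner")

pds = PersonalDataStore([
    {"type": "admission", "date": "2016-02-11"},
    {"type": "prescription", "date": "2016-03-02"},
])
print(pds.ask("admissions_last_year"))   # 1 -- an answer, not the records themselves
```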

Discoveries that transform society

The DataHub and Distributed User Managed Access Systems (DUMAS) projects offer additional pieces to solve the challenge of exchanging information while maintaining identity anonymity. Maybe they can apply to healthcare – if we’re creative.

Highly technical advances have shaped the social economy for centuries. The creation of the sickle, a handheld agricultural tool with a curved blade typically used for harvesting grain crops, had a profound impact on the Neolithic Revolution (the Agricultural Revolution). Who would have imagined, when it was invented (18000 to 8000 B.C.), that the sickle would form the basis for modern kitchen knives with serrated edges?

Small, seemingly insignificant discoveries transform societies. How blockchain technologies will affect people on a daily basis is waiting to be discovered. When blockchain applications enhance our lives, they may become as commonplace as highway safety grooving.

Universities must build innovators in new ways

The future doesn’t care how you became an expert. The experts of tomorrow will learn differently.

Whether you completed your undergraduate degree 15 years ago or hold a newly minted Ph.D., how you learn is changing. The experts of tomorrow will learn differently.

Massachusetts Institute of Technology, Stanford University, Harvard University, and the University of Washington are the world’s most innovative universities according to Times Higher Education. Reuters agrees: they all fall into the Reuters Top 100 World’s Most Innovative Universities. But there’s a secret. Technology is changing how all students learn and how every school inspires and grows students. Whether a student is just starting the undergrad journey or holds a Ph.D. and is going back for refresher classes, thinking creatively and driving disruptive ideas is at the heart of many university programs.

Let’s explore how experts are created and technology’s impact on learning.

OpenCourseWare

MIT OpenCourseWare (MIT OCW) is an initiative of the Massachusetts Institute of Technology (MIT) to put all of the educational materials from its undergraduate- and graduate-level courses online, freely and openly available to anyone, anywhere. 

edX hosts MITx. MITx courses (more commonly MIT MOOCs) embody the inventiveness, openness, rigor and quality that are hallmarks of MIT, and many use materials developed for MIT residential courses in the Institute’s five schools and 33 academic disciplines.

Open courses are a rapidly growing trend across academic institutions. What will become the differentiator when courses are free to access? Universities need to innovate.

Nanodegrees

The on-demand economy is affecting education with collaboration-based approaches to learning through communities. The Khan Academy offers the ability to learn anything. For free. Gibbon allows people to share knowledge by creating playlists with the best content from the web, and it recently joined Degreed. Degreed’s vision is that the future doesn’t care how you became an expert. Skillshare lets students learn a new skill each day and pick up creative skills in just 15 minutes with bite-sized lessons. And Udacity already has nanodegree programs in data science, machine learning, Android, and iOS.

Many nanodegree programs incorporate the idea of digital badges. Specialization becomes more important after a general framework of the broader subject has been established.

Digital badges

Specialization brings us to digital badges. A digital badge is a validated indicator of accomplishment, skill, quality, or interest that can be earned in many learning environments. Open digital badging makes it easy for anyone to issue, earn, and display badges across the web, through an infrastructure that uses shared and open technical standards. Community-based learning might not be profitable for everyone. Do digital badges threaten the higher-education monopoly on credentials?

The on-demand economy is uniting communities – communities that learn together. The Humanities, Arts, Science, and Technology Alliance and Collaboratory (HASTAC) is an alliance of more than 14,000 humanists, artists, social scientists, scientists, and technologists working together to transform the future of learning. They concentrate on three areas: scholarship programs, learning competitions, and publication projects. This global community shares ideas.

Open Badges enables students to get recognition for skills they have learned anywhere. Open Badges allows students to verify skills, interests and achievements through credible organizations. And because the system is based on an open standard, students can combine multiple badges from different issuers to tell the complete story of their achievements – both online and off.

Technology is changing not only how we learn, but how we process knowledge and the methods used to develop tomorrow’s innovators.

Creating experts

What is your profession today? Within what field do you consider yourself an expert?

Think for a minute about your personal path to becoming an expert. It could have been through traditional undergraduate and possibly graduate studies, augmented with work experience. That would work.

However, knowledge could also have come from mentoring or non-traditional avenues such as a co-op experience (diversifying experience with school), the Thiel Fellowship (Peter Thiel, co-founder of PayPal, created a two-year program to build now what a traditional education won’t teach), taking a gap year (a year off to accelerate learning, similar to the specialization of a nanodegree), enrolling in Massive Open Online Courses (MOOCs), or one of the many alternatives. Those approaches also work.

Navigating knowledge

Imagine you’re in the office and are responsible for establishing your company’s new strategic direction. A rampant business model threatens to disrupt your organization. This business model shift has the potential to weave through the organization’s entire operating model and hook into sophisticated technology ecosystems spanning dozens of geographic locations. The CEO introduces you to an expert in platform dynamics. This new leader is your peer in tackling this blue-chip, mega-program. This recent addition to the leadership team understands two-sided and multi-sided markets and has studied their effect on global competition from a macroeconomic view. After working with this person for three months, you’re impressed with their vision and knowledge of how to apply innovation to dynamic business models.

Ask yourself a question: Do you care how they became an expert? Most executives don’t care and, like me, will rate leaders on the merits of their knowledge, not on where they acquired it. Universities and higher-education institutions need to ideate on their role in the collaborative economy and how students become experts.

The experts of tomorrow will learn differently.

If I only had 5 minutes to explain blockchain

Are you struggling to understand how blockchain works, why blockchains are secure or why blockchain technologies will transform the world? Here’s a five-minute answer to all those questions.

As a leader, you have a duty to your organization to understand why blockchain technology will transform the economy.

Why was blockchain started?

The bursting of the U.S. housing bubble underpinned the global financial crisis of 2007-08 and caused the value of securities linked to U.S. real estate to nosedive. Easy access to subprime loans and the overvaluation of bundled subprime mortgages all leaned on the theory that housing prices would continue to climb.

Ultimately, the Financial Crisis Inquiry Commission determined the entire financial crisis was avoidable and caused by widespread failures in financial regulation and supervision. There were many reasons for the financial crisis, including subprime lending, the growth of the housing bubble and easy credit conditions. The world believed that “trusted third parties” such as banks and financial institutions were dependable. Unfortunately, the global financial crisis proved intermediaries are fallible. The crisis resulted in evictions, foreclosures and extended unemployment; it was considered the worst financial crisis since the Great Depression.

In response to this horrible global financial upheaval, in 2008, Satoshi Nakamoto wrote a paper titled, “Bitcoin: A Peer-to-Peer Electronic Cash System.” The paper suggested that “trusted third parties” could be eliminated from financial transactions.

What’s Bitcoin and how does it relate to blockchain?

Bitcoin is a peer-to-peer system for sending digitally signed payments across the Bitcoin network. When the “b” is capitalized, “Bitcoin” refers to that network, e.g., “I want to understand how the Bitcoin network operates.” When the “b” is not capitalized, the word “bitcoin” is used to describe a unit of account or currency, e.g., “I sent 1 bitcoin to a friend.” The digital signature is made from public keys (given to anyone for sending assets) and private keys (held by the asset owner).
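
Here is a short sketch of that signing idea, assuming the third-party Python ecdsa package is installed; the message and variable names are illustrative, and real Bitcoin transactions are serialized and signed quite differently.

```python
# Sketch of the public/private key idea above, using the third-party `ecdsa`
# package (pip install ecdsa). The private key signs a payment message; anyone
# holding the public key can verify it, which is how nodes check transactions.
from ecdsa import SigningKey, SECP256k1   # SECP256k1 is the curve Bitcoin uses

private_key = SigningKey.generate(curve=SECP256k1)   # held by the asset owner
public_key = private_key.get_verifying_key()          # shared with anyone

message = b"send 1 bitcoin to my friend"
signature = private_key.sign(message)

print(public_key.verify(signature, message))   # True: the signature checks out
```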

The public ledger of Bitcoin transactions is called the blockchain, and Bitcoin runs on top of this blockchain technology. Blockchains are permissionless distributed databases, or permissionless public records of transactions in chronological order. Blockchain technology creates a decentralized digital public record of transactions that is secure, anonymous, tamper-proof and unchangeable — a shared single source of truth. Blockchains apply to any industry where information is transferred and roughly fall into the following six classifications:

1. Currency (electronic cash systems without intermediaries).

2. Payment infrastructure (remittance; sending money in payment).

3. Digital assets (exchange of information).

4. Digital identity (IDs for digitally signing to reduce fraud).

5. Verifiable data (verify the authenticity of information or processes).

6. Smart contracts (software programs that execute without trusted third parties).

How blockchains work

For the first time in history, blockchain removes — or disintermediates — the middleman from business transactions and by doing so improves the value of existing products, services, and interactions in the following ways:

Preventing double spending: With blockchain, you can’t spend money more than once. Blockchain presents a solution by ensuring the authenticity of any asset and preventing duplicate expenditures (real estate, medical claim, insurance, medical device, voting ballots, music and government record or payments to program beneficiaries).

Establishing consensus: In this new model, crowds are networks of computers that work together to reach an agreement. Once 51% of the computers in the network agree, “consensus” has been reached and the transaction is recorded in a digital ledger called the blockchain. The blockchain contains an ever-growing, ordered list of transactions. Each computer contains a full copy of the entire blockchain ledger. Therefore, if one computer attempts to submit an invalid transaction, the computers in the network would not reach consensus (51% agreement) and the transaction would not be added to the blockchain.

There are four principles of blockchain networks.

  1. Distributed: Across all the peers participating in the network. Blockchain is decentralized, and every computer (full node) has a copy of the blockchain.
  2. Public: The actors in a blockchain transaction are hidden, but everyone can see all transactions.
  3. Time-stamped: The dates and times of all transactions are recorded in plain view.
  4. Persistent: Because of consensus and the digital record, blockchain transactions can’t catch fire, be misplaced or get damaged by water.

Steps to create a block (transaction)

Blocks are a record of transactions, and chains are a series of connected blocks. A simplified proof-of-work sketch follows the steps below.

  1. Create transaction: A miner (computer) creates a block.
  2. Solve the puzzle: A miner (computer) must do mathematical calculations and, if successful, produces a “proof of work.”
  3. Receive “proof of work:” If the puzzle is solved — the “proof of work” is a piece of data that is difficult (costly, time-consuming) to produce but easy for others to verify and which satisfies certain requirements. In short, it’s difficult to solve the puzzle but easy to verify it’s solved correctly.
  4. Broadcast “proof of work:” The miner broadcasts its successful proof of work to other miners.
  5. Verification: Other miners verify the “proof of work.”
  6. Publish block: If the miners reach consensus (51% agreement) that the proof the miner presented solved the puzzle, then that transaction is published to the blockchain.
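
The “difficult to produce but easy to verify” property in steps 2 through 5 can be illustrated with a simplified proof-of-work loop. This is a sketch with an arbitrary difficulty target, not Bitcoin’s actual mining algorithm or difficulty rules.

```python
# Simplified proof-of-work (illustrative, not Bitcoin's exact parameters):
# miners search for a nonce whose hash meets a difficulty target; verifying
# a claimed nonce takes a single hash.
import hashlib

DIFFICULTY = 4                       # required number of leading zero hex digits

def mine(block_data: str) -> int:
    nonce = 0
    while True:                      # costly: many hashes until the target is met
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * DIFFICULTY):
            return nonce
        nonce += 1

def verify(block_data: str, nonce: int) -> bool:
    # cheap: one hash is enough to check the claimed proof of work
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * DIFFICULTY)

nonce = mine("block 101: Alice pays Bob 5")
print(nonce, verify("block 101: Alice pays Bob 5", nonce))   # True
```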

Why are blockchains secure?

With blockchain technologies, truth can also be measured and consumers and producers can prove data is authentic and uncompromised.

To create a new block, block 101, some of the data from the previous block, block 100, is used (specifically a hash, created by an algorithm that turns an arbitrarily large amount of data into a fixed-length digest). Then, to create the new block 102, information from block 101 is used, and so on. Each block depends on the prior block, similar to a light string on a Christmas tree. If a light bulb were pulled from the string (a transaction were changed), a miner would have to recompute every block that came after it in the chain. Probabilistically, this is almost impossible, as the rest of the network would not reach consensus on the proposed change.
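
The Christmas-light analogy can be shown with a minimal hash chain. The block contents and helper names below are invented for illustration; real blockchains add proof of work and network consensus on top of this basic linkage.

```python
# Minimal hash chain (illustrative): each block commits to the hash of the
# previous block, so altering an old block breaks the link to every block
# that follows it.
import hashlib

def block_hash(data: str, prev_hash: str) -> str:
    return hashlib.sha256(f"{prev_hash}|{data}".encode()).hexdigest()

chain = [{"data": "genesis", "prev": "0" * 64}]
chain[0]["hash"] = block_hash(chain[0]["data"], chain[0]["prev"])

for data in ["block 100: Alice->Bob 5", "block 101: Bob->Carol 2"]:
    prev = chain[-1]["hash"]
    chain.append({"data": data, "prev": prev, "hash": block_hash(data, prev)})

def is_valid(chain) -> bool:
    # Every block must link to its predecessor and hash to its stored value.
    return all(
        b["prev"] == chain[i - 1]["hash"] and b["hash"] == block_hash(b["data"], b["prev"])
        for i, b in enumerate(chain) if i > 0
    )

print(is_valid(chain))            # True
chain[1]["data"] = "block 100: Alice->Bob 500"   # tamper with an old block
print(is_valid(chain))            # False: the chain no longer verifies
```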

The result is an immutable digital record for every agreed transaction: a single source of truth.

Why blockchain technologies will transform the world

Blockchain technologies will improve trust in industries where information (assets) is transferred, including these:

  1. Accounting (auditing and fraud prevention).
  2. Aerospace (location of parts and chain-of-custody).
  3. Energy (smart metering and decentralized energy grid).
  4. Healthcare (medical devices and health information interoperability).
  5. Finance (remittance and currency exchange).
  6. Real estate (deed transfers and faster buying or selling of property).
  7. Education (better manage assessments, credentials, transcripts).

Blockchain technologies will change everything — from the clothes you wear to the food you eat to the products you buy.


How the sharing economy is shaping the future of work in healthcare

How we work, how we earn and the skills required are taking on a shared purpose.

Shifting from hyper-consumption to collaborative consumption has given rise to a renewed belief in the value of reputation, community, and shared access.

Driving force multipliers

How are organizations going to learn? Organizations – both big and small – will need to adapt to new challenges to survive.

  1. Redefinition of social capital: personal, not corporate brands are determining business relationships
  2. Redistribution markets: unwanted or underused goods resold
  3. Collaborative lifestyles: non-product assets such as space, skills, and money are exchanged and traded in new ways
  4. Product service systems: pay to access a product or service without ownership

Redistribution markets, collaborative lifestyles, and product service systems have spurred the rise of collaborative consumption – sharing reinvented through technology. Traditional sharing, bartering, lending, trading, renting, gifting, and swapping, redefined through technology and peer communities, bloomed into the sharing economy.

This collaborative economy places value on a combination of reputation, community, and shared access. Underutilized assets and resources offer alternatives, making space for on-demand platforms that reach critical mass on the efficiency of crowds and the trust of communities.

The competing forces and evolving priorities have created a new world born on the back of business fragmentation and feeding off the influence of social environments. Here, technology breakthroughs are the norm and resource scarcity is a driver in the global shift of power.

Collaborative Economy

The billion-dollar club

Today every industry is getting involved in the sharing economy. Companies in hospitality (Feastly, LeftoverSwap), transportation (Lyft, Zipcar), consumer goods (Etsy, Poshmark), entertainment (SoundCloud, Pandora), healthcare (MedZed, Heal), logistics (Instacart, Uber Rush) and odd jobs (Fiverr, Upwork) all contribute to the sharing economy. Given the surge of sharing companies, it might not be surprising that Uber’s valuation of USD $62.5 billion is 2.43 times Southwest Airlines’ USD $25.7 billion, and that T. Rowe Price’s latest valuation lifts Airbnb to USD $25.5 billion, 1.48 times Marriott International’s USD $17.2 billion.

The collaborative economy is here to stay. HBR published a great piece titled What Customers Want from the Collaborative Economy, which stated, “We now have research to show that companies need to embrace the core innovations of the collaborative economy if they want to thrive in the era of Kickstarter, Uber and TaskRabbit.” Maybe we’ll all be relegated to three simple worlds: Orange (small is beautiful), Green (companies care), or Blue (corporate is king), as suggested by PwC in The Future of Work: A Journey to 2022 report. The future of work will involve companies that innovate around their core values.

The future of work involves learning (Udacity, Chegg), municipal services (Musketeer, MuniRent), money (bitcoin, CircleUp), goods (yerdle, shapeways), health and fitness (VINT, Medicast), space (HomeAway, ShareDesk), food (VixEat, Blue Apron), utilities (vandebron, fon), transportation (Ola Share, DriveNow), services (CloudPeeps, Fiverr), logistics (nimber, deliv), and corporate (warpit, TwoGo). The makers, co-creators, crowd funders, peers, and companies that are successful all empower people. The sharing economies are creating partnerships between traditional incumbents and bleeding-edge tech companies.

Healthcare plunges into the sharing economy

Healthcare companies are launching a flurry of interactive applications to capture the attention of patients and their loved ones.

  • Ease delivers medical marijuana in minutes. Think of it as a slick decision-support tool for medical marijuana. Ease offers a high-quality, lab-tested menu with fast, convenient delivery and technology that provides a safer experience for patients than the alternatives.
  • Helparound addresses the daily struggles chronic patients place on caregivers. Part of Helparound, Diabetes Helpers is a help network on mobile and desktop where people help each other navigate life with diabetes. People with type 1 diabetes, type 2 diabetes, or gestational diabetes and their caregivers answer each other’s questions about the symptoms of diabetes and how to lower their A1C, and learn more about diabetes diet and management.
  • Stat provides on-demand doctors, medical care, medical transport and companionship. Stat is healthcare on demand: with a simple push of a button, patients can reach a doctor, CNA, HHA, or medical transport in minutes for themselves or loved ones in their care.
  • Popexpert gives users an opportunity to learn life and work skills directly from top experts to be happier, healthier, and more productive. From getting fit to staying healthy, Popexpert has the latest in life, work, and play.
  • Medicast is helping hospitals and health systems bring back the house call. This new platform offers care delivery for the on-demand age. Medicast helps hospitals and health systems modernize their care delivery networks with sophisticated, easy-to-use technology designed in collaboration with patients and physicians.

The sharing economy is bringing people together. In our small and beautifully connected world – reputation, community, and shared access matters.

How tech is reshaping work values and goals

The fourth Industrial Revolution merges the physical, digital and biological worlds. The need for human connections has never been greater.

Companies can find talent. They just can’t attract it. Fifteen years ago, corporate was king, and bigger was better. Companies such as Bank of America, Qualcomm, Cisco Systems, Intel, Sun Microsystems and Merck were top companies desired for employment. That mindset has evaporated. Today Stryker, Baptist Health South Florida, Workday and Genentech are among the 100 best companies to work for, according to Fortune Magazine.

Small is beautiful

Worker preferences are changing. It’s no longer good enough to tout social responsibility and entice talent with the sparkle of high earnings. Employees want flexibility and inclusion in decisions affecting their future. Companies are fracturing, and disintermediation isn’t only an outside force pressing upon companies; it’s a force creating disruption from the inside out. Today’s workers are looking for a corporate family, a group of like-minded individuals that share beliefs and hold similar values. Business is getting personal. Large enterprises aren’t able to compete.

According to the MIT Sloan School of Management, the roles of startups and big business are shifting. In 2016, 15 to 20 percent of MIT graduates joined startups, according to Vladimir Bulović, associate dean for innovation and a professor at MIT. Interestingly, 10 years ago only about 1.5 percent joined startups. The primary reason for the swing is that innovation has shifted to smaller companies. These little companies house tight groups of committed individuals – the seduction of the modern startup. Low pay, horrible hours, with a slight chance of changing the world. Interested?

Three decades of progress

The influx of freelance and contract workers into corporate America is changing the landscape of work. Job polarization, although a relatively new economic term, started with investment in robotics, which removed middle-skill jobs and relocated many jobs overseas. The effect of job polarization has been a sharp reduction in middle-class jobs, which are classified as moderate-skill when compared to low-skill and high-skill jobs. When observing the index of computing over the last thirty years, there have been tremendous advances in computing power, cost per unit, labor cost per unit, cycles per second, and rapid memory.

Compute Power

 

Significant growth in computing power, performance, and productivity began in the mid-1940s. Moore’s Law, observed in 1965 by Gordon Moore, co-founder of Intel, identified a trend that would last for decades. This surge in productivity accelerated from 1969 to 2004, when the price index for computers fell by 23 percent relative to the GDP price index, as reported by the Bureau of Economic Analysis (BEA). The Future of Work in the Age of the Machine, by The Hamilton Project, illustrates the exponential gains in computer buying power between 1980 and 2010.

The American Economic Review published the paper The Growth of Low-Skill Service Jobs and the Polarization of the US Labor Market by David H. Autor and David Dorn. Their analysis found “a critical role for changes in labor specialization, spurred by automation of routine task activities, as a driver of rising employment and wage polarization in the United States and potentially in other countries.” The structure of jobs is changing. Jobs are moving away from middle-skill roles and branching toward low-skill and high-skill roles.

Fourth Industrial Revolution

The future of work is also broader than collaboration technologies. We are on the brink of a new industrial revolution. The first Industrial Revolution was driven by steam and mechanical production equipment in 1784. The second Industrial Revolution was the mass production of electricity and divisions of labor in 1870. The third Industrial Revolution automated production with electronics and information technology in 1969.

What will be the fourth Industrial Revolution? The Internet of Everything (IoE), robotics, sharing economy, cyber-physical systems, nanotechnology, biotechnology, materials science, energy storage and quantum computing all compete for the new title. Whatever it is eventually labeled, the fourth Industrial Revolution merges the physical, digital and biological worlds.

Professor Klaus Schwab, born in Ravensburg, Germany in 1938, is founder and executive chairman of the World Economic Forum, the International Organization for Public-Private Cooperation. In January 2016, he published The Fourth Industrial Revolution, which steps through the impact of the revolution:

  1. Economy: growth, employment, and the nature of work
  2. Business transformations: consumer expectations, data-enhanced products, collaborative innovation, new operating models
  3. National and global changes: governments, countries-regions-and-cities, international security
  4. Individual disruption: identity, morality, and ethics

The need for human connection and the necessity to manage public and private information have only grown stronger as this revolution changes society.

With virtually unlimited possibilities, business model shifts are occurring across every industry. How we work and communicate is undergoing a profound paradigm shift – asking every member of society to rethink their values and goals.

CIOs step inside the mind of a security hacker

Security is not a one-time event. Ensuring threat resilience is your duty as a CIO. Reevaluate your company’s security situation often – everyone’s future depends on it.

Does your board question the company’s approach to security? It should. Understanding your opponents can help break the cyber kill chain. Think like your opponent in this multi-player game. Begin with a framework that covers policies, standards, guidelines, and procedures to ensure consistency – earn trust.

Medical and healthcare breaches

The cost of a data breach is increasing. The Ponemon Institute partnered with IBM to produce the 2016 Cost of Data Breach Study: The Impact of Business Continuity Management (BCM) report. The report analyzed 383 companies in 12 different countries across 16 industries, including healthcare. The average cost of unauthorized data access is between USD $149 and USD $167 per record, with the total cost of a data breach ranging from USD $3.7 million to USD $4.29 million. In the medical and healthcare category, there have been 158 reported data breaches since January 2016, according to the June ITRC Breach Report. The Identity Theft Resource Center publishes the ITRC Breach database, which is updated daily and posted to the ITRC website every Tuesday.

Healthcare accounted for 33.4 percent of total breaches in 2016 (as of June 14, 2016), with a reported total of 4.3 million records breached. Healthcare organizations from California to Florida reported data breaches due to unauthorized access including Florida Hospital Medical Group, OptumRx, CVS Alabama Pharmacy, Kaiser Permanente – Inland Empire, MedStar Health, CareCentrix, Blue Shield of California, and Integrated Health Solutions / Bizmatics. If these well-known organizations can be breached, who’s safe? How do organizations protect themselves?

It’s worth noting that 36 of the medical and healthcare companies that experienced a data breach have not reported the exact count of records breached. CIOs must understand their opponents, break the cyber kill chain, and leverage proven frameworks to proactively address the threat of data breaches.

Understand your opponents

Understanding who your organization is playing against is everything. Sun Tzu, a military strategist, lived in ancient China and was active as a general and strategist, serving King Helü of Wu in the late sixth century BC, beginning around 512 BC. Clearly defining your opponents is essential. Sun Tzu stated in The Art of War, “If you know the enemy and know yourself, you need not fear the result of a hundred battles. If you know yourself but not the enemy, for every victory gained you will also suffer a defeat. If you know neither the enemy nor yourself, you will succumb in every battle.”

Security is not a single person game. It’s an aggressive team sport, with only one universal rule – don’t lose. Your team is playing a multi-player game that is designed to test your strategy, tactics, and resilience. This game isn’t local. The playing field is global, and the players plug in and out of active status seamlessly. Oh, and cheating is allowed.

Break the kill chain

Lockheed Martin adapted the concept of a kill chain, conventionally a military concept describing the structure of an attack, to information security.

The military kill chain model has four core phases: 1. target identification, 2. force dispatch to the target, 3. decision and order to attack the target, and 4. destruction of the target.

The adapted Lockheed Martin Cyber Kill Chain has seven core phases.

1. Reconnaissance

2. Weaponization

3. Delivery

4. Exploitation

5. Installation

6. Command and Control

7. Actions on Objective

Understanding how a cyber attacker penetrates your corporate security will help you defend against threats. Remember that during each step in the Cyber Kill Chain you have six courses of action:

1. Detect

2. Deny

3. Disrupt

4. Degrade

5. Deceive

6. Contain

The paper titled Intelligence-Driven Computer Network Defense Informed by Analysis of Adversary Campaigns and Intrusion Kill Chains, published by Lockheed Martin, effectively outlines the Lockheed Martin Cyber Kill Chain. The paper states that “a kill chain is a systematic process to target and engage an adversary to create desired effects.” Your optimal course of action will depend on the phase of the kill chain in play.

Think like your opponents

The first step in the Cyber Kill Chain is reconnaissance, where an attacker studies your company’s behavior (harvesting email addresses, social networking, passive searches, IPs, and port scanners). The attacker is selecting targets. The second phase is weaponization. At this phase, the objective is to create an exploit (developing an exploit with payload creation, malware, a delivery system, or decoys). Nothing has been deployed into the environment at this point, but the exploit is typically built with an automated tool (a weaponizer). The third phase is delivery (spear phishing, infected websites, service providers, USB). During this phase, the weaponized payload is delivered to the targeted environment. The fourth phase is exploitation (exposing a vulnerability to execute code on the target system, e.g., malware or ransomware). The fifth phase is installation (installing the malware on the asset, which typically allows the adversary to maintain persistence inside the environment). The sixth phase is command and control (establishing a channel that enables “hands on the keyboard” access inside the target environment). At this point, the target system can be remotely manipulated. The last and seventh phase of the Cyber Kill Chain is actions on objectives (taking action to achieve the desired objective, e.g., data exfiltration involving collecting, encrypting, and extracting information from the target environment).

The ability to categorize a threat into a phase of the Cyber Kill Chain is vital to ensure the correct course of action. All too often, policies and standards are established only to sit on a shelf and never be updated (we’ve all seen this). A perfect example is asking for a security policy and, when you finally do receive it, finding the last update was stamped two years ago.
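
One way to make that categorization actionable is a simple lookup from kill chain phase to candidate courses of action. The pairings below are assumptions made for illustration, not Lockheed Martin’s published courses-of-action matrix.

```python
# Illustrative only: a simple lookup pairing each Cyber Kill Chain phase with
# candidate courses of action. The specific pairings are assumptions for the
# example, not Lockheed Martin's published matrix.
KILL_CHAIN_PLAYBOOK = {
    "reconnaissance":       ["detect", "deny"],
    "weaponization":        ["detect"],
    "delivery":             ["detect", "deny", "disrupt"],
    "exploitation":         ["detect", "deny", "disrupt", "degrade"],
    "installation":         ["detect", "deny", "disrupt", "contain"],
    "command_and_control":  ["detect", "deny", "degrade", "deceive", "contain"],
    "actions_on_objective": ["detect", "deny", "deceive", "contain"],
}

def recommend(phase: str) -> list[str]:
    """Return candidate defensive actions for a categorized threat."""
    return KILL_CHAIN_PLAYBOOK.get(phase, ["detect"])

print(recommend("delivery"))   # ['detect', 'deny', 'disrupt']
```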

Begin with a framework

You’ve been inside the mind of an attacker. You’re armed with the knowledge of the Cyber Kill Chain and the courses of action you can leverage to protect your organization. It’s now time to start your security assessment. Where do you start?

Whether you’ve experienced a threat or breach, or are proactively anticipating disruptions, every approach begins by selecting a security framework. There are many security frameworks to choose from, including the ISO/IEC 27000 series, COBIT 5 for Information Security, and the NIST SP 800 series. The benefit of using a security framework is that it offers a common language to standardize the approach for addressing threat concerns.

Policies, standards, guidelines, and procedures

Now that we understand how an attacker thinks, we’ll explore the categories of threats and the broad approaches to address security in your organization.

According to the April 2016 Internet Security Threat Report by Symantec, there are six broad categories of threats, and your security approach should address them all: 1. mobile devices and the Internet of Things, 2. web threats, 3. social media, scams, and email threats, 4. targeted attacks, 5. data breaches and privacy, and 6. cloud and infrastructure.

Armed with the categories of threats, we can focus on the four steps that provide the foundation for your security program:

1. Policies – high-level governing documents

2. Standards – low-level mandatory controls

3. Guidelines – recommended, non-mandatory controls

4. Procedures – step-by-step instructions to assist actors in implementing the various policies, standards, and guidelines

Every good security program has these four primary components. The policy is the broad organizational governing document that addresses a facet of your security program. A simple example could be a password protection policy. This password protection policy would include an overview, purpose, scope, policy statements, policy compliance, related standards, policies or processes, and definitions of terms. The password standard would set broad rules for password complexity. Guidelines are collections of recommended, non-mandatory controls that help support standards or provide a reference when no applicable standard exists. The standard would, in turn, reference procedures; for example, the password protection procedures would include a password creation section covering user-level and system-level password conformance and would reference the password construction guidelines.
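
As a concrete, hypothetical illustration of turning a standard into something checkable, the sketch below encodes made-up password construction rules as a procedure; the thresholds and rule set are assumptions, not an actual corporate standard.

```python
# Hypothetical example: the rules below are illustrative, not a real corporate
# standard. It shows how a password construction standard's mandatory controls
# can be expressed as a checkable procedure.
import re

MIN_LENGTH = 12   # assumed threshold for the example

def meets_password_standard(password: str) -> list[str]:
    """Return the list of violated controls (an empty list means compliant)."""
    violations = []
    if len(password) < MIN_LENGTH:
        violations.append(f"shorter than {MIN_LENGTH} characters")
    if not re.search(r"[A-Z]", password):
        violations.append("missing an uppercase letter")
    if not re.search(r"[a-z]", password):
        violations.append("missing a lowercase letter")
    if not re.search(r"[0-9]", password):
        violations.append("missing a digit")
    if not re.search(r"[^A-Za-z0-9]", password):
        violations.append("missing a special character")
    return violations

print(meets_password_standard("correct horse"))        # missing uppercase and digit
print(meets_password_standard("Tr0ub4dor&3xample!"))   # [] (compliant)
```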

Be consistent

When communicating with the board, be consistent. When dealing with these challenging discussions, ground them in three sound approaches that form your security foundation: prevention, protection, and resilience.

You’ll be able to handle every discussion on security and guide that discussion into one of the three areas where your organization is already focusing. One effective security approach “is to prevent a threat from arising in the first place, especially by addressing its underlying causes. When the threat cannot be prevented, security as protection aims to defend against, if not eliminate, the threat. But if we cannot fully protect ourselves from the threat, security as resilience considers our ability to “bounce back” and alter the ways in which it affects our social systems — our ability to adapt to threats that strike us.” The Centre for Security Governance article Three Approaches to Security reminds us that a layered security approach is a time-proven method for protection.

A layered security approach will help maintain the trust of your leadership teams. Don’t build new processes; leverage existing processes proven to work.

As Sun Tzu said, “the greatest victory is that which requires no battle.”

 


Healthcare interoperability research propositions of the ONC blockchain challenge

One paper moves past theoretical results to practical research domains, exploring the autonomous monitoring of ubiquitous medical devices using blockchain technologies.

The National Institute of Standards and Technology (NIST), in partnership with the Office of the National Coordinator for Health Information Technology (ONC), requested that healthcare and technology leaders submit research papers exploring the “use of blockchain in Health IT and Health-related Research.”

One submission, titled Co-Creation of Trust for Healthcare: The Cryptocitizen Framework for Interoperability with Blockchain, in particular addresses future research propositions. The co-authors submit that there are three primary areas in need of further research, including the monitoring of medical devices.

The paper was authored by Peter Nichol, an expert in digital, innovation, and healthcare, and Jeff Brandt, an expert in mobility, security, and healthcare.

Autonomous monitoring of ubiquitous medical devices

The immutability of blockchain can improve access to medical information. How will care change over the next three to ten years? Will the definition of treatment change? Today, when we think of preventative care, thoughts of a physical doctor appointment rise to the top of mind. Tomorrow, robotics, nanobots, and nanomachines may be a common part of the new definition of preventative care. The co-authors present the concept of blockchain being leveraged for healthcare device maintenance, where nanomachines will autonomously communicate device-to-device.

“Device-to-device distributed sharing will create a new market for semi-autonomous devices. These devices – such as delivery robots providing medical goods throughout a hospital autonomously or disinfection robots that interact with people with known infectious diseases such as healthcare-associated infections or HAIs – will be reporting information not to a central authority but to other devices. Medical nanotechnology is expected to employ nanorobots that will be injected into the patient to perform work at a cellular level. Ingestibles and internables bring forward the introduction of broadband-enabled digital tools that are eaten and “smart” pills that use wireless technology to help monitor internal reactions to medications. Medical nanotechnology is just the beginning.” 

Monitoring Afib with the blockchain

The research proposition adds color to how medical device maintenance is possible with blockchain, using an example.

“The following is an example of how blockchain technologies could manage medical devices. A patient named John, with atrial fibrillation, is having an atrial defibrillation device implanted: commonly known as an Afib device. This implantable defibrillator allows quick restoration of the sinus rhythm by administering a low-energy shock. The Afib device was manufactured by company “X” with a serial number “Y.” During manufacturing, a blockchain was created to track this device. The US Food and Drug Administration (FDA) mandated that a hash of the unique device identifier (UDI) be stored in the blockchain along with other pertinent information. The hash of the device information is stored and verifiable in an immutable digital ledger. The implanted Afib device is assigned to John (patient), and the device’s blockchain is updated with information such as the hospital, doctor, emergency contacts, and advance directives around care for patient John. The Afib device is supported by a series of smart contracts that can autonomously notify John (patient) and providers when the device needs service, e.g., battery expiration, or when health irregularities are detected.

Today, device preventive maintenance is rudimentary at best. For example, when an Afib device requires maintenance, the device starts to audibly alarm John (patient), [in his chest] which can be disturbing. A smart contract could also send preventative maintenance information to the patient and provider, reducing the chance of a catastrophic failure.”
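
As a toy sketch of that idea (not the authors’ implementation), the code below shows how a hashed UDI, a device record, and an automated maintenance rule could fit together; the fields, threshold, and ledger structure are invented for illustration.

```python
# Toy sketch of the example above (not the authors' implementation): a device
# record keyed by a hash of its UDI, plus a "smart contract"-style rule that
# notifies the patient and provider when the battery runs low.
import hashlib

def hashed_udi(manufacturer: str, serial: str) -> str:
    # Only the hash of the unique device identifier would sit on the ledger.
    return hashlib.sha256(f"{manufacturer}:{serial}".encode()).hexdigest()

ledger = []   # stand-in for an append-only blockchain

def record(event: dict) -> None:
    ledger.append(event)

device = {
    "udi_hash": hashed_udi("Company X", "Serial Y"),
    "patient": "John",
    "provider": "Dr. Smith",        # hypothetical provider name
    "battery_pct": 14,
}

BATTERY_THRESHOLD = 15   # assumed maintenance trigger for the example

def maintenance_contract(dev: dict) -> None:
    """Fires automatically when device telemetry crosses the threshold."""
    if dev["battery_pct"] < BATTERY_THRESHOLD:
        record({"udi_hash": dev["udi_hash"], "event": "battery_low",
                "notify": [dev["patient"], dev["provider"]]})

maintenance_contract(device)
print(ledger)   # both John and his provider are flagged for service
```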

The two authors, Peter Nichol and Jeff Brandt, present compelling arguments and offer new exploratory propositions. Their research propositions identify three areas that require more in-depth research:

  • Proposition 1: Healthcare Device Maintenance Ranging from Medical Devices to Nanomachines Will Autonomously Communicate Device-To-Device.
  • Proposition 2: Personal and Public Self-sovereign Identity Will Place Identity Ownership in the Hands of the Patient.
  • Proposition 3: Electronic Health Information Exchanges (HIE) and All-Payer Claims Databases (APCD) Will Establish Trust Using Blockchain Technologies.

Blockchain technologies do offer answers to healthcare’s interoperability struggle. The solution to this national crisis won’t, however, be purely solved by adding yet another layer of technology. The underlying broken payer and provider processes need to be addressed in parallel for a complete patient-centric solution.

The patient ownership of data accelerates health data transparency. Blockchain technologies can observe what data is being accessed, who can access the data, and for what period. Self-sovereign identity provides sovereignty, security, and privacy to promote benefits for the patient and the organization or agency by reducing risk, strengthening security, improving accuracy, deepening permission control, and decreasing the time required for regulatory oversight.

The paper titled, Co-Creation of Trust for Healthcare: The Cryptocitizen Framework for Interoperability with Blockchain, can be downloaded from this location. The ONC Blockchain Challenge will publish the results of the competition on August 29, 2016. Blockchain technologies can spark the co-creation of trust in healthcare. 

Download the FULL paper Here: Co-Creation of Trust for Healthcare: The Cryptocitizen Framework for Interoperability with Blockchain.

Robotic process automation for healthcare

Autonomics and multi-agent systems will be applied in healthcare to definable, repeatable, and rule-based processes. Robotic process automation will be a competitive advantage, not replacing humans but enabling them.

Autonomics, a field started by IBM in 2001, ultimately aims to develop computer systems capable of self-management. These self-regulating autonomic components are driving research into multi-agent systems (MAS), computerized systems composed of multiple interacting intelligent agents within an environment. Robotic process automation (RPA) is capable of automating activities (by creating software agents) that once required human judgment. This is the evolution of automation: the automation of automation.
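
As a minimal sketch of what such a software agent can look like, the rule-based claim triage below automates a definable, repeatable decision; the rules, amounts, and claim codes are hypothetical.

```python
# Minimal sketch of a rule-based software agent (hypothetical rules chosen for
# illustration): it automates a definable, repeatable decision that previously
# required a person, which is the kind of task RPA targets.
def triage_claim(claim: dict) -> str:
    """Route a claim using fixed, auditable rules -- no human judgment needed."""
    if claim["amount"] <= 500 and claim["code"] in {"OFFICE_VISIT", "LAB"}:
        return "auto-approve"
    if claim["amount"] > 10_000:
        return "route-to-senior-adjuster"
    return "route-to-adjuster"

claims = [
    {"id": 1, "amount": 120, "code": "OFFICE_VISIT"},
    {"id": 2, "amount": 18_500, "code": "SURGERY"},
]
for claim in claims:
    print(claim["id"], triage_claim(claim))
```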

Transactional to analytical

In 1990, traditional onshore labor was the norm. By 2000, offshore labor was ripping through every industry, including healthcare. Huge cost savings were realized by shifting from the traditional onshore model to an offshore model. The next revolution of digital labor is called “no shore.” Robotic process automation is an autonomic, self-learning, and self-healing system.

The Institute for Robotic Process Automation (IRPA) published an excellent report highlighting the top ten benefits of robotic process automation that cross industries.

1. Decreased operational costs – no shore models (digital software agents)

2. Improved data analytics – tasks executed by robots allow for analysis

3. Increased regulatory compliance – steps are tracked, traceable, and documented

4. Increased efficiency – software robots never need time off

5. Higher employee productivity – software agents address repetitive activities, freeing workers to participate in more value-added activities

6. Improved accuracy – employees are human, and all humans make mistakes

7. Increased customer satisfaction – decreased errors build deeper customer relationships, improving retention and customer happiness

8. IT support and management – it’s easier to scale software than it is people

9. Logistical upside – minimize or eliminate complications with offshore labor

10. RPA and business processes – presentation-layer automation software, mimicking the steps of rules-based, non-subjective processes

Automation process cycle

When do labor efficiencies become labor elimination? To better understand how RPA can enable your organization, we first need to identify the five phases of the automation process cycle:

1. Manual execution – one-off, non-repeatable processes

2. Scripting – linear tasks, standard and repeatable

3. Orchestration – activities that are complex, standard, and multi-scripted

4. Autonomics – dynamic processes that are non-standard, contextual, and inference based

5. Cognitive – self-aware systems that are predictive, self-learning, and self-healing

If we want our employees engaged in activities that involve personal interactions, problem-solving, and decision-making, we first need to get them out of tedious and repetitive activities.

What if you were told that a new team member will be joining your team? You’re not sure where they are geographically located, but you managed to get some intel from a colleague. You are told they never complain, don’t want a desk, never need coaching, and love daily performance reviews. This is the resume of the modern robot, a leader in process automation. The competition just got stiffer.

Multi-agent systems

Robotic process automation begins with an understanding of agents. Typically, multi-agent systems refer to software agents, but these systems could equally be robots or hybrid robot and human teams.

There are three primary types of agents: passive agents (simple – agents without goals), active agents (advanced – agents with simple goals), and cognitive agents (complex – capable of complex calculations and activities). The environments where these agents reside can be divided into three types: the virtual environment, the discrete environment, and the continuous environment. Also, each agent environment has one or more associative properties (a simple sketch of the three agent types follows the list below):

1. Accessibility – when possible to gather complete information about the environment

2. Determinism – if an action performed in the environment causes a definite effect

3. Dynamics – how many entities influence the environment at the moment

4. Discreteness – whether the number of possible actions in the environment is finite

5. Episodicity – whether agent actions in certain time periods influence other periods

6. Dimensionality – whether spatial characteristics are important factors of the environment and the agent considers space in its decision-making
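
Here is the simple sketch of the three agent types mentioned above. The behaviors and class names are assumptions chosen for illustration, not a formal multi-agent framework.

```python
# Simplified illustration of the three agent types described above; the
# behaviors are assumptions made for the example, not a formal MAS framework.
class PassiveAgent:
    """No goals: simply reports the state it observes."""
    def act(self, observation: float) -> float:
        return observation

class ActiveAgent:
    """Simple goal: keep an observed value below a target."""
    def __init__(self, target: float):
        self.target = target
    def act(self, observation: float) -> str:
        return "reduce" if observation > self.target else "hold"

class CognitiveAgent:
    """Complex behavior: remembers history and adapts its own target."""
    def __init__(self, target: float):
        self.target = target
        self.history: list[float] = []
    def act(self, observation: float) -> str:
        self.history.append(observation)
        self.target = sum(self.history) / len(self.history)   # adapts over time
        return "reduce" if observation > self.target else "hold"

for agent in (PassiveAgent(), ActiveAgent(10.0), CognitiveAgent(10.0)):
    print(type(agent).__name__, agent.act(12.0))
```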

RPA applied to healthcare

Transparency Market Research predicts that the global IT robotic automation market will be worth USD $4.98 billion by 2020. Robotic automation is a powerful alternative to offshore outsourcing. It is curious how these processes managed to escape automation for so long. Regardless, there are many areas where RPA can be applied in healthcare, including account management, claims processing, underwriter support, customer support, billing, collections, reconciliation, and reporting and analytics consolidation.

The HfS Blueprint Report helps us identify precisely where RPA can be applied within the healthcare ecosystem.

1. Claims administration – claims adjudication and processing, payment integrity complaints, and appeals

2. Member management – account setup, eligibility, and enrollment, billing, benefit management, and customer service

3. Provider management – provider credentialing, provider data management, contracting audits, and network management

4. Health & care management – population health and wellness, utilization management, care coordination and case management, and remote monitoring

5. Administration – finance, accounting, and training

Intelligent automation is entering the business world, and CFOs are happy because RPA is delivering the promised cost savings. However, cost-only value propositions are no longer attractive to top executives. They are looking for cost-plus value propositions (transactional plus judgement-intensive plus analytics). Global labor arbitrage, the disintegration of barriers to international trade or moving to where costs of doing business are low, is no longer sufficient. In this quest for greater cost-plus value propositions, technology plays a critical role.

Start by understanding where repetitive tasks hurt your organization. First, identify the opportunity; second, validate the opportunity; third, design the model; and fourth, deploy a pilot. Health plans and providers are discovering software agents as a cost-effective alternative to enhancing or replacing platforms.

The conversation has expanded beyond cost reduction to quality, engagement, and innovation. This new phase of sourcing will engage and manage resources to shift workers from the mundane task to activities with deeper customer interactions.

Health innovators are using robotics process automation to drive the next stage of transformation – at affordable costs. Robotic process automation isn’t coming soon; it’s here.

Telehealth and the levers that will move the healthcare industry

Telehealth changes how care is provided at the state and national level. Telehealth policy is a determining success factor. Thinking of rolling out a telehealth program? Providing telehealth presents real challenges for providers.

There has been a lot of progress in telehealth over the last three years. In 2013, only 13 states were cleared for consultation and prescribing, and three states restricted consultation in the absence of a prior in-person relationship. According to American Well, a telemedicine technology solutions company, by January 2016 most states had been cleared to consult and prescribe, with various exceptions in Alaska, Louisiana, and Indiana. Inconsistent state definitions create challenges for national providers. Clinical permissibility, licensure, and reimbursement remain the flagship challenges.

Inconsistent policies

It’s often said that the only thing consistent about telehealth is that it’s inconsistent.

The Model Policy for the Appropriate Use of Telemedicine Technologies in the Practice of Medicine was drafted by the State Medical Boards’ Appropriate Regulation of Telemedicine (SMART) Workgroup and adopted as policy by the Federation of State Medical Boards (FSMB) in April 2014. The policy guides state medical boards in regulating the use of telemedicine technologies in the practice of medicine and educates licensees on the appropriate standards of care when delivering medical services directly to patients via telemedicine technologies. It blazed the path for telemedicine adoption by superseding the Model Guidelines for the Appropriate Use of the Internet in Medical Practices, previously adopted in April 2002.

Clinical permissibility

Clinical permissibility comes down to whether or not providers can deliver care via telehealth. This discussion mainly swirls around telehealth policy, though the medical boards have made significant strides to get the policies right. Informed consent is a primary consideration: evidence documenting appropriate patient informed consent for the use of telemedicine technologies must be obtained and maintained. The policy elements of informed consent include the following (a brief illustrative sketch follows the list):

1. Identification of the patient and physician (physical credentials),

2. Types of transmissions permitted using telemedicine technologies (prescription refills, appointment scheduling, patient education),

3. Patient agreement (that it’s the physician’s decision as to whether a telemedicine encounter is appropriate),

4. Adequate security measures (data, passwords, files, identification and authentication techniques),

5. Hold harmless clause (if due to technical failures information is lost), and

6. The requirement for express consent to forward patient-identifiable information to a third party (administration, billing, care).
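
One way to picture these six elements is as a simple record that must be complete before a telemedicine encounter proceeds. The sketch below is illustrative only; the field names are hypothetical and are not drawn from the FSMB model policy text.

# Minimal sketch of an informed-consent record for a telehealth encounter.
from dataclasses import dataclass, fields

@dataclass
class TelehealthConsent:
    patient_and_physician_identified: bool      # 1. identification / physical credentials
    permitted_transmission_types: bool          # 2. refills, scheduling, patient education
    patient_agreement_on_appropriateness: bool  # 3. physician decides if telemedicine fits
    security_measures_acknowledged: bool        # 4. data, passwords, authentication
    hold_harmless_acknowledged: bool            # 5. technical-failure clause
    third_party_disclosure_consent: bool        # 6. express consent to forward identifiable data

def consent_complete(consent: TelehealthConsent) -> bool:
    """All six elements must be documented before the encounter proceeds."""
    return all(getattr(consent, f.name) for f in fields(consent))

In practice each flag would be backed by stored evidence (signed forms, audit trails), not a bare checkbox, but the principle is the same: the record is either complete or the encounter does not move forward.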

These six factors ensure that informed consent is appropriate. However, even after informed consent is secured, other clinical issues surface. Can the provider establish a treatment relationship sufficient to prescribe using telehealth? Must a prior relationship have been established? Does the encounter hold to the same standard of care as a facility visit? Does it include a prescription for controlled substances, or does it trigger a limited formulary? Each of these questions needs to be addressed and communicated so the patient understands whether or not a prior examination is required before care is administered.

These questions also weigh on the minds of the state medical and pharmacy boards and national organizations such as the American Medical Association and Federation of State Medical Boards.

Reimbursement and licensure

State legislation defines the telemedicine reimbursement models for commercial and Medicaid payers. Understanding how providers expect to be paid for telehealth is essential.

Credentialing and privileging present the same challenges providers face with facility-based care models. Providers are pressured to participate in a large number of health plans across a diverse network, and they must select privileged practitioners who can provide credentialed care. It’s tough to keep up with multi-state licensure for clinicians who are decentralized.

Anticipating this licensure challenge, the Federation of State Medical Boards issued the Interstate Medical Licensure Compact legislation. According to the FSMB, the Interstate Medical Licensure Compact offers an expedited licensing process for physicians interested in practicing medicine in multiple states. The Compact is expected to expand access to health care, especially for those in rural and underserved areas of the country, and to facilitate the use of telemedicine technologies in the delivery of health care. The Compact legislation, which expands access to healthcare by expediting medical licensure, has been adopted by 16 states: Kansas, Mississippi, Alabama, Arizona, Idaho, Illinois, Iowa, Minnesota, Montana, Nevada, New Hampshire, South Dakota, Utah, West Virginia, Wisconsin, and Wyoming.

For states without compact legislation, provider complexity is magnified. While NCQA and URAC accreditation help to ensure provider quality, they don’t do much to ensure multi-state licensure interoperability.

Evidence of progress

As of mid-2016, 29 states, including Washington, D.C., had mandated commercial reimbursement for telehealth. Several states subscribe to parity mandates, a form of commercial mandate that requires telehealth services be paid to the same extent and at the same level as in-person services (NV, MT, MN, CO, MS, LA, ME, DE, and CT). More cautious states mandate coverage for commercially provided telehealth services but attach limitations and site restrictions.

As patients demand to be the CEO of their health, the healthcare ecosystem will need to work together to tackle clinical permissibility, licensure, and reimbursement before telehealth goes mainstream.

mHealth and telehealth fight for inclusion

Telehealth is one of the fastest-growing markets, and broad adoption is building, from modular telemedicine units to teleaudiology. Not all providers are finding the skills transition easy.

The global telemedicine market is projected to grow to $66.6 billion by 2021, at an estimated CAGR of 18.8 percent during the forecast period 2016 to 2021, according to Mordor Intelligence. While this seems significant, it’s a fraction of the overall market: U.S. health care expenditures were estimated at $3.24 trillion in 2015 and forecast to increase to $3.78 trillion by 2018, as reported by Forbes.
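
A quick back-of-the-envelope check puts those figures in perspective; note that the implied 2016 base is derived from the projection, not stated in the source.

# Back-of-the-envelope arithmetic on the figures above (illustration only).
projected_2021 = 66.6     # USD billions, global telemedicine market projection
cagr = 0.188
years = 5                 # 2016 through 2021
us_spend_2015 = 3_240.0   # USD billions (~$3.24 trillion)

# A $66.6B value in 2021 at an 18.8% CAGR implies roughly a $28B base in 2016.
implied_2016_base = projected_2021 / (1 + cagr) ** years

print(f"Implied 2016 base: ${implied_2016_base:.1f}B")
print(f"2021 projection as a share of 2015 US spend: {projected_2021 / us_spend_2015:.1%}")

Even at the end of the forecast period, the global telemedicine market would amount to only about two percent of what the United States alone spent on health care in 2015.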

The adoption curve

Gartner published an excellent overview in mid-2015 that covered the hype cycle for telemedicine and virtual care. The Gartner Hype Cycle has five phases: 1. Technology Trigger (a technology breakthrough), 2. Peak of Inflated Expectations (the media overhypes the technology), 3. Trough of Disillusionment (pilots fail to deliver and interest declines), 4. Slope of Enlightenment (value becomes apparent and new pilots are funded), and 5. Plateau of Productivity (mainstream adoption becomes a reality).

The definition of telemedicine appears straightforward. Telemedicine is the use of telecommunication and information technologies to provide clinical health care at a distance. It helps eliminate distance barriers and can improve access to medical services that would often not be consistently available in distant rural communities. What does that mean in practice? Healthcare institutions are interpreting clinical care in different ways. Executives do have options to better understand how other practitioners are approaching telehealth. The mHealth + Telehealth World conference, held in Boston on July 25-26, 2016, will tackle the role of technology in health and help executives understand the impact of the connected world on the future of healthcare.

Telehealth clinical applications

Telehealth is used in various areas to improve clinical care across the healthcare industry. The Gartner hype report presented evolving concepts to help CIOs define business strategies and prioritize their investments.

1. On the Rise – Digital Telepathology, Patient Decision Aids, Telepsychiatry, Telesurgery, EHR Support of Virtual Care, and Teleaudiology.

2. At the Peak – Modular Telemedicine Units, Healthcare-Assistive Robots, Medication Compliance Management, Quantified Self, Wearables, Real-Time Virtual Visits, and Teletrauma.

3. Sliding into the Trough – Continua, Personal Health Management Tools, Telepharmacy, Mobile Health Monitoring, Video Visits, and Teleretinal Imaging.

4. Climbing the Slope – Home Health Monitoring, Telestroke, Teledermatology, Remote Electrocardiogram Monitoring, and Patient Portals.

5. Entering the Plateau – E-Visits and Remote ICU.

Applied perspectives for better health

The best approach for telehealth will depend on the provider practice. However, in almost all cases it starts with a champion: a clinical practitioner who thoroughly understands the reimbursement cycle.

Once the champion is on board, start with a pilot that has depth. Two classic business cases where telehealth can be beneficial are chronic disease management (continuous care) and episodic care management. Chronic illness management can shift a significant portion of facility care to telehealth, but only after face-to-face visits have built a relationship between the physician and the patient; there is nothing more important than trust in healthcare, and it’s built best in person. Episodic care is less frequent and therefore makes a good case for telehealth replacement. MedStar Health operates more than 120 entities, including over ten hospitals, in the Baltimore area. MedStar started offering telehealth to employees through a pilot in 2015, expanded the pilot in 2016 to encompass those it insures, and plans to beta telehealth with the public as the last step.

The hill yet to climb

We will get past the technical issues with mobile coverage and poor connectivity. Let’s also assume that the current fee-for-service (FFS) models are replaced with value-based payment models (shared savings, bundles, shared risk, and global capitation).

MedStar experienced three main challenges: scheduling across states (given providers are state-licensed), equipment setup, and administrative policies (coding online visits).
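
The first of those challenges, scheduling across states, can be pictured as a simple licensure filter: a visit can only be matched to a clinician licensed where the patient is located at the time of the encounter. The sketch below is a hypothetical illustration with invented provider data, not MedStar’s scheduling logic.

# Hypothetical licensure filter for telehealth scheduling (invented data).
LICENSES = {
    "dr_adams": {"MD", "DC", "VA"},
    "dr_baker": {"MD"},
    "dr_chen":  {"VA", "WV"},
}

def eligible_providers(patient_state: str) -> list[str]:
    """Return providers holding a license in the patient's state."""
    return sorted(p for p, states in LICENSES.items() if patient_state in states)

if __name__ == "__main__":
    print(eligible_providers("VA"))  # ['dr_adams', 'dr_chen']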

We all want to reach the tipping point where societal perceptions of healthcare include telehealth. This collective benefit will impact the cost of care, quality of care, and access to care. This leads us to one question: are providers ready to transition their skills?

There is a significant difference between bedside manner and webside manner. Doctors must combine their medical knowledge with technical knowledge. How will doctors be trained for this new environment? Computer-based training (CBT) is out. Train-the-trainer is also ruled out. We’re left with a training approach that requires providers to train by doing: to provide telehealth, you must practice providing telehealth. This means bringing telemedicine into the academic setting to prepare residents for the challenges early and educate them on the huge societal benefit of remote care.

Compensation, administration (billing), and technology still have issues to work out. We’ll get there, but before patients can be placed in charge of their health, they need to be educated, and this education will require more provider time, not less. Telehealth offers a way to educate patients without a physical visit, allowing providers to explain medications, images, and diagnoses more thoroughly.

Telehealth has the potential to increase patient satisfaction and trust. The success of telehealth begins with everyone starting from the same step – getting educated.