Recharge your knowledge of the modern data warehouse

Data warehousing is evolving from centralized repositories to logical data warehouses leveraging data virtualization and distributed processing. Make sure you’re not using old terminology to explain new initiatives.

Are you comfortable describing source systems feeding ETL processes into operational data stores, master reference data flowing through an enterprise service bus, and product, supply chain and business operational reports dumped into a presentation layer of soft analytics, dashboards, alerts and scorecards? That was yesterday.

Don’t get caught explaining your new data warehouse initiative with old terminology.

Traditional vs. modern data warehouses

Data warehouses are not designed for transaction processing; modern data warehouses are structured for analysis. In data architecture Version 1.0, a traditional transactional database fed a second database that served sales. In data architecture Version 1.1, a dedicated analytical database, built on massively parallel processing and a shared-nothing architecture, was added before data went to sales; the tradeoff was slow writes in exchange for fast reads. In data architecture Version 2.0, the transactional database populated a second database, which flowed into a third analytical database connected to the presentation layer (business intelligence). In data architecture Version 2.1, multiple transactional databases fed the core database, which provided information downstream to data stores (sales, marketing, finance) connected to a business intelligence engine. At this point, traditional database structures end and modern structures begin: data architecture Version 3.0.

The two examples below highlight the difference between a traditional data warehouse and a modern data warehouse (using Hadoop for this example).

Traditional data warehouse:

  1. Operational systems: CRM, ERP, financial, billing.
  2. ETL: decision analysis model and data.
  3. Enterprise data warehouse: operational, customers and IT data marts.
  4. BI platform: KPI summary, monthly and quarterly reporting and daily summaries.
  5. Automatic customer value analysis: interactive data queries, static data analysis, and OLAP.
  6. BI collaboration portal: wholesale, OEM, sales, employees and external.

Modern data warehouse:

  1. HDFS: the Hadoop Distributed File System.
  2. HCatalog: a table, storage and metadata management layer.
  3. HBase: a key-value database with columnar storage.
  4. MapReduce: a flexible parallel data processing framework for large data sets.
  5. Oozie: a workflow scheduler for Hadoop jobs, including MapReduce.
  6. ZooKeeper: a distributed hierarchical key-value store enabling synchronization across a cluster.
  7. Hadoop: the open-source software framework that supports data-intensive distributed applications (storage and processing of big data sets).
  8. Hive: a SQL-like query layer built on top of MapReduce for analyzing large data sets (see the Python sketch after this list).
  9. Pig: enables the analysis of large data sets using Pig Latin, a high-level language compiled into MapReduce for parallel data processing.
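
To make the modern stack concrete, here is a minimal sketch of querying Hive from Python. It assumes the community PyHive package and a hypothetical Hive server and table name; the HiveQL itself is compiled into parallel jobs (MapReduce or Tez) behind the scenes.

```python
# Minimal sketch: querying Hive from Python with PyHive (pip install pyhive).
# The host, port, database, and table names below are hypothetical placeholders.
from pyhive import hive

conn = hive.Connection(host="hive.example.internal", port=10000,
                       username="analyst", database="warehouse")
cursor = conn.cursor()

# HiveQL is compiled into parallel jobs over data resident in HDFS.
cursor.execute("""
    SELECT region, SUM(order_total) AS revenue
    FROM sales_events
    GROUP BY region
    ORDER BY revenue DESC
    LIMIT 10
""")

for region, revenue in cursor.fetchall():
    print(region, revenue)

cursor.close()
conn.close()
```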

Most database designs cover four functions: 1) data sources, 2) infrastructure, 3) applications and 4) analytics. This design principle applies to both traditional data warehouses and modern architectures. The design thinking, however, is different. In a modern data warehouse, there are four core functions: 1) object storage, 2) table storage, 3) computation and processing, and 4) programming languages.

Re-establish the structure for success

The lack of data governance, inadequately trained staff, weak security and non-existent business cases each factor into why data warehouse or business intelligence initiatives fail to achieve the desired outcomes. Keep your data warehouse program on track.

Start by strengthening your framework for business intelligence. If it’s been more than six months since you looked at your end-to-end operational state, it’s a good idea to revisit the original thinking and revalidate assumptions.

The three-tier structure outlined here can help guide your discussions and the assessment.

Primary functions

  1. Program management: portfolio, process, quality, change and services management.
  2. Business requirements: business management, information services, communities, capabilities, service levels and value.
  3. Development: data warehouse and database, services, data integration, systems, monitoring systems and business analytics.
  4. Business value: culture; data quality, analytics and data utilization; data evolution; and value measurements.

Secondary functions

  1. Intelligence architecture: data, integration, information, technology and organizational.
  2. Data governance: accountabilities, roles, people, processes, resources and outcomes.
  3. Operations: data center operations (SaaS, DaaS, PaaS, IaaS), technology operations, application support and services delivery.
  4. Intelligence applications: strategic intelligence, customer intelligence, financial intelligence, risk intelligence, operations intelligence and workforce intelligence.

Tertiary functions

  1. Data integration: data consolidation, data quality and master data management.
  2. Informational resources: data management, informational access and metadata management.
  3. Informational delivery: query and reporting, monitoring and management, and business analytics.

Shrinking budgets, pressure to deliver and expanding data sources all encourage us as CIOs to accelerate progress. Much of this acceleration comes at the cost of not thinking. The modern data warehouse is being designed differently. This means we as leaders need a block of time to think. This time also allows us to upgrade our understanding of how modern data warehouses are planned, refresh the core elements of the progressive data ecosystem and upgrade our terminology.

Evaluating your current data warehouse initiative

Start by asking the following questions to determine if you’re running a modern data warehouse.

  1. Does our environment quickly handle diverse data sources and a variety of subject areas?
  2. Can we handle massive volumes of data (social, sensor, transactional, operational, analytical)?
  3. Are we using structures such as data lakes, Hadoop and NoSQL databases, or are we running relational data mart structures?
  4. Do we support a multiplatform architecture to maximize scalability and performance?
  5. Do we utilize Lambda architecture (more about data processing than data storage) for near real-time analysis of high-velocity data?
  6. Have we leveraged new capabilities like data virtualization (cloud services) in addition to data integration?
  7. Has the organization applied data warehouse automated orchestration for improved agility, consistency and speed through the release life cycle?
  8. Is our organization running a bimodal business intelligence environment?
  9. If we asked our primary business sponsors, would they know where the data catalog is located to document business terminology?
  10. Are the BI development tools decoupled from the agile deployment models?
  11. Have we clearly defined how we certify enterprise BI and analytical environments?

Modern data warehouses are composed of multiple platforms that are invisible to users. Polyglot persistence encourages choosing the most suitable data storage technology for each kind of data. This “best-fit engineering” aligns multi-structured data into data lakes and considers NoSQL solutions for XML or JSON formats. A polyglot persistence data strategy benefits from virtualization and takes advantage of the diverse infrastructure.

If you’re well into the modern data warehouse journey but have not seen the benefits initially forecast, don’t fear: there is still hope. Allow me to share a few tips to uncover the underlying challenges preventing successful adoption. First, define all the data storage and compression formats in use today. There are many options, and each one offers benefits depending on the type of applications your organization is running. Second, look at the degree of multi-tenancy supported in your BI environment. Using a single instance of software to serve multiple customers improves cost savings, eases upgrades and simplifies customizations. Third, review the schema or schema-less nature of your databases and the data you’re storing. Understanding how data is loaded, processed and analyzed can help determine how to optimize the schemas of objects stored in those systems. Fourth, remember metadata management: while often overlooked, it can be almost as important as the data itself.

Upgrading your team’s understanding of data warehouses will move your organization toward agile deliveries, measured in weeks not months. 

The quiet revolution: the internet of data structures with IPFS

Everything of value involves data. Where data lives and how it’s accessed is about to change. The internet of data structures (IoDS) is transforming the web from linking data by location to linking data with hashes. Is your organization prepared?

The internet of data structures (IoDS) is emerging as one of the most significant advancements in data within the last decade.

HTTP (the Hypertext Transfer Protocol) is the foundation of communication for the World Wide Web. Hypertext is structured text that enables us to access content throughout the web using logical links (hyperlinks) between nodes containing data. But what if HTTP were no longer needed? What if there were a better way to communicate and connect data? IPFS (the InterPlanetary File System) is a globally distributed storage system. Content is addressable and shared through a peer-to-peer hypermedia distribution protocol.

URLs are out; hashes are in.

The movement from HTTP to IPFS

Previously, I described the IPFS storage model and the benefits for healthcare. Today, we’ll step a layer deeper into how the structure is designed and I’ll offer an introduction to the IPFS stack.

Where does IPFS fit into our existing infrastructure? How do you communicate the value of this technology and the application of business transformation? Those questions are exactly what we’ll be tackling.

HTTP uses hyperlinks that translate into locations to connect discrete objects and data sets. IPFS is like HTTP, but instead of using locations provided by a group of servers, IPFS uses a peer-to-peer network to share content using hash values, or hashes. In IPFS, content is addressable by its hash: the hashed value of the content itself.

IPFS is a Merkle-addressed transport protocol for distributed data structures. The IPFS stack breaks down into three general buckets, each offering particular value.

  1. Using the data: applications (the IPFS stack)
  2. Defining the data: naming, Merkle-DAG (IPNS, IPLD)
  3. Moving the data: exchanges, routing, network (Libp2p)

These three primary buckets further divide into five broad categories that compose the infrastructure stack.

  1. Applications: Etherpad, VLC, Git, Ethereum, Whisper
  2. Naming: DNS, IPNS, EthNames, Namecoin or IPLD
  3. Exchange: BitTorrent, Bitswap, FTP, HTTP
  4. Routing: Gossip, Chord, Kad DHT, mDNS, Delegated, I2P, TOR
  5. Network: CJDNS, UDT, uTP, WebRTC, QUIC, TCP, WebSockets, I2P, TOR

Accessing files on IPFS

It’s easier to understand IPFS if we frame it next to concepts we’re already familiar with, like DNS.

The HTTP example below shows a typical website URL for a company logo, with the host name translated into an IP address using DNS. The IPFS example then offers a comparison using IPNS (the InterPlanetary Name System) and IPFS working together. IPNS allows the storage of a reference to an IPFS hash under the namespace of your peerID (the hash of your public key). That IPFS hash references an addressable object in IPFS: a hash value points to a hash object linked to another hash object, until your destination is found.

IPFS also achieves immutability by separating key management from file system security. Filenames contain public keys, making them self-certifying pathnames. Public key hashes resolve to pointers, signed with a private key, that lead to the content.

HTTP

  • http://peterbnichol.com/linktohash/logo.jpeg (domain name service)
  • http://10.11.12.13/linktohash/logo.jpeg (IP address)

IPFS

  • /ipns/ReE45fRer5LR3/linktohash/logo.jpeg (InterPlanetary name service – optional)
  • /ipfs/ReE78kGrd5KJ2/linktohash/logo.jpeg (hash address)

The process of linking objects is similar to how inodes operate, except that it uses hash values. An inode is a data structure on a file system that stores all the information about a file except its name and its actual data.
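
As a rough illustration of hash addressing, the sketch below fetches content through a local IPFS daemon using the community ipfshttpclient package; the content hash is a placeholder echoing the example paths above, not a real object.

```python
# Sketch: fetching hash-addressed content from IPFS via a local daemon
# (pip install ipfshttpclient; requires `ipfs daemon` running on port 5001).
# The content hash below is a placeholder, echoing the example paths above.
import ipfshttpclient

client = ipfshttpclient.connect("/ip4/127.0.0.1/tcp/5001")

# Content is addressed by the hash of its bytes, not by server location.
logo_bytes = client.cat("QmExampleHashPlaceholder/linktohash/logo.jpeg")

with open("logo.jpeg", "wb") as f:
    f.write(logo_bytes)

client.close()
```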

Posting content with IPFS

IPFS offers a unique approach to addressing and moving content within a network. If other peers are uninterested in your content, standard paid backup solutions (AWS, Azure, Swarm) can be leveraged. Also, unlike other peer-to-peer distributed networks, IPFS only downloads explicitly requested data; it does not pull full copies of data.

Publishing content to IPFS is similar to publishing content through a private blockchain. It’s also possible to distribute content on IPFS and then remove yourself as a host serving that content (removing the need for infrastructure?). Here is an example of posting data in an IPFS world:

  1. Create content
  2. Generate key names
  3. Sign content
  4. Distribute to peer-to-peer network
  5. Register key name and point to hash of public key

In theory, this process removes the need for locally owned and managed infrastructure. In practice, standard paid backup services, such as those listed above, may be required.
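
Under the same assumptions (a local daemon and the ipfshttpclient package), a minimal sketch of the five-step publish flow might look like this; the file name is hypothetical, and IPNS key handling is simplified to the node’s default key.

```python
# Sketch: adding content to IPFS and pointing an IPNS name at it.
# Assumes a local IPFS daemon; uses the node's default key for IPNS.
import ipfshttpclient

client = ipfshttpclient.connect("/ip4/127.0.0.1/tcp/5001")

# Steps 1-4: add (and thereby announce) the content to the peer network.
result = client.add("report.pdf")  # hypothetical local file
content_hash = result["Hash"]

# Step 5: register the name -- publish the hash under our peerID namespace.
client.name.publish(f"/ipfs/{content_hash}")

print(f"Content reachable at /ipfs/{content_hash} and via our /ipns/ peerID")
client.close()
```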

What will be impacted?

Any products, services or interactions that leverage storage or save data have the potential to be affected. This foundational technology layer will affect where data is stored (traditional databases to IPFS) and how data is accessed (URLs to hashes). Every platform that requires linked and encrypted communications has the potential to benefit from IPFS. Dapps and mobile applications will quietly shift to the internet of data structures as scale and interoperability become increasingly critical.

Distributed denial-of-service (DDoS) attacks would be harder to execute on platforms running IPFS. HTTP routes traffic to a central server, which concentrates the attack surface and makes targeted attacks more efficient. IPFS, by contrast, is a distributed storage system. Distributing the attack surface across peers makes conducting DDoS attacks significantly more difficult, because content can still be accessed through the rest of the distributed storage network.

Innovative leaders are learning about IPFS: what it is, where it impacts the organization and how it can be used to create a strategy to leverage this foundational technology.


A look at India’s biometrics identification system: digital APIs for a connected world

India’s digital infrastructure transformation is enabling new services and APIs. Explore how India’s digital citizen app works and how India is securing data and authenticating users.

Consistent government-issued identification ensures connection to government benefits and financial services while limiting fraud, waste and abuse. The lack of a national identification system had restricted access to public sector goods and services. Indians struggled to obtain a driver’s license or even activate a mobile phone when they didn’t have identification. That has changed. India has emerged as a global leader in digital identification and has established the largest database of biometrics in the world.

Aadhaar: a national identification system

The classic debate between open free markets and imposed central control by government is the foundational quandary of democracy. A question that comes to the forefront is: have open markets offered a road to growth and prosperity or is a central operator a crucial element of the economic balance?

Regardless of the side of this debate upon which you fall, in both scenarios, a method for building an implied architecture for trust is required. A key tenet of setting the foundation for economic growth in a free and open market is to ensure citizen identity. Social healthcare, government entitlement programs, immigration, school admissions, and electoral voting all demand an identification system to minimize abuse of economic systems.

1.1 billion enrolled and growing

The Unique Identification Authority of India (UIDAI) runs the world’s largest voluntary national identification number project, intended to cover 1.34 billion residents at project completion. Aadhaar is a voluntary, unique identification document program similar to the smart card program of France. Unlike the United States’ Social Security Number (SSN) program, the Chinese ID card program and Chile’s ID card (RUN) program, UIDAI is not mandatory; it is a voluntary government program.

The Aadhaar program has made impressive strides toward its vision of providing all residents of India an identity. As of February 2017, 1.1 billion Aadhaar cards had been issued under an initiative that began in 2009. An effort that started with the lofty goal of issuing 100 million cards within the first year has now built a national system that can process 1.5 million applications a day.

Aadhaar is industrial grade and handles 15 million transactions daily.

The race to a unified payment system

The Unique Identification Authority of India was created with the objective of issuing Unique Identification numbers (UIDs), named “Aadhaar.” Even before the first Aadhaar card was issued, in 2010 the UIDAI launched the Aadhaar Auth API. By 2011 the National Payments Corporation of India (NPCI) had launched the Aadhaar Payments Bridge and the Aadhaar Enabled Payment System, which use the Aadhaar number as a central key for electronically channeling government benefits and subsidies. In 2012 the UIDAI introduced eKYC, a paperless Know Your Customer (KYC) process that allows businesses to perform customer verification digitally using biometrics or a mobile OTP. Within the eKYC process, the identity and address of the subscriber are verified electronically through Aadhaar authentication. Aadhaar eKYC was initiated to reduce delays with attestations and other verifications, resulting in faster processing and issuance of digital certificates.

In 2015 the Government of India’s Controller of Certifying Authorities launched eSign as an open API. This new API enables an Aadhaar card holder to digitally sign a document. The Ministry of Electronics and Information Technology (MeitY) also launched DigiLocker, a platform for issuing and verifying documents and certificates digitally, thus eliminating the use of physical documents.

For reference, accessing a document available on DigiLocker takes five simple steps:

  1. Sign up: register with mobile
  2. Sync: use your Aadhaar (card) to sync
  3. Request: get documents from issuers
  4. Share: exchange documents with requesters
  5. View: get documents verified by requesters

Today, DigiLocker has 4.5 million registered users, over 6.6 million uploaded documents and 1.6 billion issued digital documents.

Lastly, in 2016 the National Payments Corporation of India enhanced their Unified Payments Interface, an advanced public payments system. This interface may be the backbone that catapults India into a cashless society by 2020.

India’s emerging API integrations

IndiaStack is a set of APIs that allows governments, businesses, startups and developers to utilize India’s digital infrastructure. IndiaStack provides four technology layers: (1) presence-less, (2) paperless, (3) cashless and (4) consent. Already, some interesting APIs and products have emerged that leverage the IndiaStack suite of APIs.

There are several sandbox servers available for testing functionality across the Aadhaar network:

  • Aadhaar Bridge: a developer kit that enables developers to build apps with Aadhaar integration using one seamless platform.
  • eMudhra eServices: comprises eSign and eAuth services that enable electronic, paperless signing and authentication. The service allows residents to sign any document electronically without the hassle of a physical signature or a dongle-based digital signature.
  • AuthBridge: offers an array of services such as employee background checks, risk assessment and talent solutions.
  • OnGrid: a consent-based trust platform that modernizes verification and background checks in India by linking an individual’s data, documents, verification reports, testimonials and references to the resident’s 12-digit Aadhaar number, for faster and cleaner access to true identity and background.
  • Aadhaar API: provides authentication (verify anyone instantly using their Aadhaar number plus biometric, OTP or demographic data), eKYC (onboard anyone quickly by retrieving their proof of address and proof of identity through Aadhaar-based KYC) and eSign (get contracts and documents digitally signed by customers or business associates using Aadhaar).
  • Digio: another document-signing app that allows residents to sign agreements, IDs (driving licenses, PAN cards, passports), approvals, letters and self-attestations.

The Aadhaar API

How does India manage the vast amounts of data flowing through an identification system supporting 1.1 billion residents? There are plenty of articles addressing in general terms how the Indian identification system works. However, I was hard-pressed to find specifics about how authentication, data security and privacy are managed.

That was until I stumbled upon the Aadhaar Authentication API Specification. This specification outlines how software professionals and vendors can incorporate Aadhaar authentication into their applications. This is an API that will soon reach across India. In the following two sections, I’ll explain how authentication and data security are managed by the Indian identification system.

Authentication and data security

Data blocks are encrypted using the AES-256 symmetric algorithm (AES/ECB/PKCS7Padding), and the session key is encrypted with the 2048-bit UIDAI public key using an asymmetric algorithm (RSA/ECB/PKCS1Padding). Session keys are used only once and are never reused across transactions. (There is a rather technical exception, but for the article’s continuity, we’ll omit that description. You’re welcome to explore it if interested.)

The encryption flow also accepts multi-factor authentication using a one-time pin (OTP) and can be used in conjunction with biometrics.
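
To ground those terms, here is a simplified Python sketch of the client-side flow using the pyca/cryptography library. The key file name, the sample data block and the plain SHA-256 digest standing in for the spec’s HMAC step are illustrative assumptions, not the exact specification.

```python
# Simplified sketch of the client-side Aadhaar-style encryption flow.
# Uses pyca/cryptography (pip install cryptography). The key file, data
# block, and integrity-digest details are illustrative, not the exact spec.
import base64
import os
from cryptography.hazmat.primitives import hashes, padding, serialization
from cryptography.hazmat.primitives.asymmetric import padding as rsa_padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

data_block = b"<Auth>...demographic and biometric XML...</Auth>"

# One-time 256-bit session key (never reused across transactions).
session_key = os.urandom(32)

# AES-256/ECB/PKCS7: pad the XML block, then encrypt with the session key.
padder = padding.PKCS7(128).padder()
padded = padder.update(data_block) + padder.finalize()
encryptor = Cipher(algorithms.AES(session_key), modes.ECB()).encryptor()
encrypted_data = base64.b64encode(encryptor.update(padded) + encryptor.finalize())

# RSA/PKCS1: encrypt the session key with the 2048-bit UIDAI public key.
with open("uidai_public.pem", "rb") as f:  # hypothetical key file
    uidai_public_key = serialization.load_pem_public_key(f.read())
encrypted_session_key = uidai_public_key.encrypt(session_key,
                                                 rsa_padding.PKCS1v15())

# Integrity value sent alongside the payload (simplified here to a digest).
digest = hashes.Hash(hashes.SHA256())
digest.update(data_block)
integrity_value = base64.b64encode(digest.finalize())
```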

Access to the Aadhaar Authentication API is categorized into two broad workflows: (1) registered devices and (2) non-registered devices. For this discussion, we’ll focus on the most likely access point, a registered device.

The encryption flow has nine steps, which I’ll summarize for brevity.

  1. Request key: the Aadhaar number and demographic and biometric details are entered into the application, and the Aadhaar authentication server sends a one-time pin to the resident’s registered phone.
  2. Generate key: the application generates a one-time session key.
  3. Encode data: an authentication “data” XML block is encrypted using the one-time session key and then encoded (Base64).
  4. Encrypt data: the session key is encrypted with the UIDAI public key.
  5. Send data: the encrypted block is sent along with a keyed-hash message authentication code (HMAC) to the authentication server.
  6. Construct XML: the authentication server composes the XML for API input, including a license key.
  7. Decrypt data: the authentication server decrypts the session key with the UIDAI private key, then decrypts the data block using the session key.
  8. Biometric factored: the resident’s biometric and demographic information, plus the optional one-time pin, are factored into the match.
  9. Response confirmation: the Aadhaar authentication server responds with a yes or no as part of the digitally signed response XML.

The introduction of India’s national identity system establishes the foundation for an integrated government. In my most recent paper, An e-Government Interoperability Framework to Reduce Waste, Fraud, and Abuse, I present a practical approach for how separate entities within the United States can share information through an e-government interoperability framework. Interoperability is the key to a connected government, and India has taken the first step. A connected government is one where residents can access services such as healthcare, school admission, voting and government programs seamlessly across government entities. With 1.1 billion Aadhaar cards issued and the count racing toward 1.3 billion, it’s evident that the Aadhaar vision will be realized: a program for the betterment of all Indian residents. Other countries, including the United States, would be well advised to pick up a lesson or two from the largest identification and biometrics program in the world.

Microservice ecosystems for healthcare

Patients will experience profound positive impacts as the advantages of microservices become better understood by healthcare technology leaders.

Google, Amazon and Soundcloud all have successfully deployed microservices. Let’s modernize healthcare applications with microservices.

Microservice architecture moves healthcare providers and payers from one large application to many smaller applications. These little applications, or “micro” applications, provide specialization using service-oriented architecture (SOA) by building independent, flexible components. These micro pieces are not simple CRUD (create, read, update, delete) services — they have responsibilities.

Microservices combine lightweight mechanisms that offer scalability (Netflix supporting 800 different devices and 1 billion calls a day) and can support a range of platforms and interactions (the web, mobile, IoT, wearables).

The world of microservices

There are many reasons why microservices are valuable for healthcare. Before we jump into those reasons, let’s define the ecosystem that makes up the world of microservices.

A dynamic response to changing business conditions

Microservices provide agility and align well with changing business needs that require automation and the ability for functionality to be recomposed. The benefit of intrinsic interoperability with industrywide standards (HTTP and JSON) ensures that your technology is enabling your business to solidify your competitive advantage.

Microservices work off a three-layer system: system APIs (core business capabilities), process APIs (orchestration and choreography of components) and experience APIs (adaptable processes and configurable options). As patient engagement, sustainability, and outcomes prove ever more critical, the ability to micronize your healthcare environment will become a best practice in healthcare. The speed of delivery, accelerating innovation capabilities and new models of care today are prerequisites for a functional and efficient business operation.

Microservices for healthcare enable this vision.

Avoiding the snowball

Monolithic applications, like the large electronic health record systems we know and love, eventually snowball into unreasonably large systems. Problems quickly snowball out of control: simple changes need to be made in multiple locations. Various systems across a healthcare ecosystem run on different versions or serve patients using entirely different and unconnected systems. Value is siloed.

What’s our solution to this problem? We build the functionality over and over again. We try to “reuse” components, but for the most part, they are constructed initially by vendors and then modernized — and that means another bill for similar work. Moving away from limited-reuse applications enables organizations to move toward the edge of innovation — where the most value occurs.

Microservice providers acknowledge there are tradeoffs when leading initiatives that require scale (multiple location installations), including the following:

  • Service discovery and documentation
  • Fault tolerance
  • Quality of service
  • Security
  • Request traceability
  • Failure triage

Start exploring the value of microservices

It’s always difficult when exploring new areas you’re unfamiliar with. Here are a few steps to help jump-start the journey of incorporating microservices into your healthcare environment.

  1. Identify potential microservices categories where you may find value.
  2. Define the scope of responsibility for the identified microservices.
  3. Consider the type of information that will be transmitted.
  4. Associate business processes with the technical functionality defined.
  5. Link technical processes to the business processes.
  6. Research capabilities that are ahead on the business road map — capabilities that are not offered today but are desired. The following steps are typically done with tools, not manually.
  7. Design the microservice, starting with the API definition, and elaborate how the service will be consumed (REST or event-driven); a minimal REST sketch follows this list.
  8. Develop a service mock or simulation. This step is also known as isolation, simulation or virtualization. In essence, you’re building something that stands in for something else.
  9. Deploy the microservice. This is where we transition from deploying to multi-tenant environments like JBoss AS or Tomcat and leverage IaaS automation frameworks such as HashiCorp or Chef and virtualization technology such as Xen or VMware.
  10. Manage container systems. Conflicts and container integration must be proactively managed. Pivotal Cloud Foundry and Mesosphere DCOS have recognized this gap and are evolving to address the need.
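
As a reference point for steps 7 through 9, here is a minimal sketch of a single-responsibility service, assuming Flask and an invented claims-status endpoint; a production service would add authentication, persistence and monitoring.

```python
# Minimal sketch of a single-responsibility healthcare microservice (Flask).
# The endpoint and in-memory store are hypothetical placeholders; a real
# service would sit behind auth, logging, and a claims data store.
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for a claims database owned exclusively by this service.
CLAIMS = {
    "C-1001": {"patient": "Jane Doe", "status": "adjudicated", "balance": 0},
    "C-1002": {"patient": "John Roe", "status": "pending", "balance": 145.50},
}

@app.route("/claims/<claim_id>", methods=["GET"])
def get_claim(claim_id):
    claim = CLAIMS.get(claim_id)
    if claim is None:
        abort(404)
    # The service owns one responsibility: claim status, nothing more.
    return jsonify({"id": claim_id, **claim})

if __name__ == "__main__":
    app.run(port=8080)
```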

Best-of-breed healthcare solutions

This year, your team will identify new technical capabilities. They will assess how these skills will align to the predefined needs of the business. As a healthcare leader, what do you expect out of this analysis? What have we always expected? We expect a recommendation — a single recommendation.

When was the last time your team identified, assessed and presented options and the result was a set of five to eight products that worked together and provided a unified best-of-breed solution? I’d say it probably hasn’t happened in the past 30 days and likely not even within the past year. Whether you’re assessing a healthcare medical record solution or a pure desktop product used by clinicians, everyone wants simplicity.

Unfortunately, in today’s knowledge-rich world, one solution rarely provides all the answers. As a result, we “fit.” We fit our solution into whatever problem hole we find. The solution rarely fits the need perfectly, yet we cram it into the problem space anyway. The result correspondingly leaves a ton of white space where the solution didn’t solve the intended problem (business or technical).

Delivering the value of simple

The most logical microservices uses are attributable to business processes or transactions. Microservice responsibility goes beyond pushing data. Each service is discrete and encapsulates a set of responsibilities. These responsibilities may relate to a business domain such as claims or billing. However, they also could relate to technical domains such as operating systems or network performance.

The benefit of deploying microservices is the micro scale of functionality that is agnostic to a particular domain or subdomain such as claims reconciliation. The patient name, account number and balance may also be applicable across other business areas such as patient entry, patient discharge or utilization. Microservices begin with business-oriented designs, commonly in the form of APIs (business interactions to access information).

Adaptability, loose coupling, autonomy, fault tolerance, composability and discoverability each offer the advantage of reuse — a core principle supporting the value of microservices design. Define the problem first. Savvy healthcare pioneers have already discovered that microservices help solve problems by getting back to simple.

Darwinian insights on innovation and competition

Variation, selection and competition are the challenges of navigating today’s digital ecosystem of value. Identify the struggle between individuals and competitors to discover tomorrow’s game-changers. Innovation is the modern struggle for existence. Will your organization survive?

Consumers buy your products, services and interactions for reliability. Partners seek your alignment for greater stability. Employees join your company for predictable results. Disruptive innovation can be identified when best practices no longer produce predictable results. Our modern knowledge-intensive economy depends on organizational capabilities. Is your organization having trouble identifying why margin is eroding? Disruption in disguise may be the answer.

Struggle for existence

Charles Darwin’s The Origin of Species is unquestionably one of the greatest works in human intellectual history. In this seminal book, Darwin develops the argument for why the theory of special creation is incorrect and why the theory of natural selection is more favorable. Eventually, reputable scientists came to acknowledge that evolution, the transformation of species over time, had in fact occurred. Darwin elaborated that variance is not an anomaly but rather an inevitable result of orchestrated processes. The causes of variability, and the difficulty of distinguishing between varieties and species, were not only challenges for Darwin. Today, a complex ecosystem of offerings makes value-based innovations difficult to delineate in markets with multiple offerings.

Buried under the struggle for existence, many innovators incorrectly assume that natural selection requires competition among individuals. Darwin defines this struggle not between individuals as competitors but in a metaphorical sense where predation, parasitism or environmental conditions dictate a new struggle. Natural selection eliminates competition. All modern innovation organizations should pay attention to lessons of selection in the struggle for existence — a modern struggle for variability through innovation and predictable results. Industries are looking less to their neighbors and more toward unrelated industries for innovation insights. You’re not competing with your business neighbor.

Natural selection redefining the rules

Industry leaders are searching to discover tomorrow’s game-changers. Will a new technology improve efficiency? Is the current business model changing? How do we compete tomorrow in this explosive sharing economy? There are multiple methods to ensure corporate survival. The accepted method favors players that evolve and adapt. The winners define new rules and establish new games.

This year will unlock opportunities — ones that were not afforded last year. The dawning of the new year also will bring challenges previously unseen. Start with these questions before you set your organizational agenda.

  1. Is your organization creating and capturing value?
  2. Does your organization not only find the right strategies but make good decisions when selecting future strategies?
  3. Is your organization in competition or cooperation? For example, is your organization building walls for the competition or establishing relationships with unlikely allies?
  4. Are you playing an old game, or are you redefining a new game?
  5. Has your organization clearly identified complementors (situations in which customers and suppliers play symmetric roles)?

Natural selection may preserve favorable variations and reject injurious variations. Like the natural selection of animals, all inferior businesses are not immediately destroyed; they evolve out of existence. Darwin suggested that natural selection is “the daily and hourly scrutinizing, throughout the world, [of] every variation, even the slightest, rejecting that which is bad, preserving and adding up all that is good.” Isn’t this happening in business — every hour of every day? The change we experience in business is natural selection. Consider the value your organization adds, as environmental conditions change. Is your organization evolving out of existence?

The evolution of disruption

Several mistletoe plants growing on the same branch of a host tree may struggle for existence. It might be truer that the struggle for existence is not against the thousands of seeds of the same kind, or against other fruit-bearing plants, but against any attempt to devour the seeds and thus prevent dissemination. Disruption is not an event; it’s evolution, a transformation of convenience. Aspects of your business are transforming as did cloud computing, consumerism and mobile — focus beyond the seeds of your company and observe the broader struggle for existence.

The innovation duel: game theory and product launch timing

Game theory has consumed innovation. The duel of innovation depends on timing, and timing depends on how you play the game. Take control of your game and play with a new strategy in the new year.

Old west cowboys personified the concept of honor. In a refined society, the art of politeness demanded withdrawal from overt acts of violence. The solution? The western duel. Where do you stand? When do you start? What are the rules? The code of honor of a duel and the code of innovation have remarkable similarities. Let’s explore a few.

Game theory

Game theory is the study of strategic decision-making. Understanding the other player’s perspective is central to game theory. Dueling swordsmen, dueling gunfighters and dueling innovators all have common ground. A strategic decision faces them all: to lead or not to lead?

Bonanza (1959-1973), The Rifleman (1958-1963) and Gunsmoke (1955-1975) rank right up there with the best TV westerns of all time.  We followed the adventures of Ben Cartwright on the ranch in Bonanza. We felt the struggle of Lucas McCain as he raised a son while battling the wild west of New Mexico in The Rifleman. And of course, Marshal Matt Dillon kept his eye on lawlessness in Dodge City with the help of saloon proprietor Miss Kitty Russell and Doc Adams in Gunsmoke.

Each of those great westerns had one element in common with modern innovation — the duel.

The swords of the 17th and 18th centuries were quickly replaced by pistols throughout the late 18th century and into the 19th century. Mainly practiced in early modern Europe, duels did make an appearance in the United States. Duels were not necessarily fought to eliminate the other person, but rather to restore honor by demonstrating a willingness to risk one’s life for something. Between 1798 and the American Civil War, the U.S. Navy lost two-thirds as many officers in duels as it did in combat at sea. Dueling was a big problem.

Fortunately, duels are no longer required to become a leader of the country, as was the case for Andrew Jackson in his bid to become the seventh president of the United States. Surprisingly, the game theory of a duel and a product launch are similar in many ways. When Andrew Jackson had to duel against Charles Dickinson, what was his strategy? How did Jackson determine when to fire? Dickinson was a famous duelist and a known marksman. Jackson determined to let Dickinson fire first, hoping that his aim might be spoiled by his quickness. Dickinson did fire first, hitting Jackson. Jackson carefully took aim and hit Dickinson in the chest, inflicting wounds that later caused Dickinson’s death. In 1832, the distance of a duel was 35 to 45 feet and both contenders were stationary. However, stagnation is not a characteristic of innovation — innovation is in motion. Playing the game of innovation revolves around timing.

The duel of innovation

In The Art of Strategy, Avinash Dixit and Barry Nalebuff explain the game theory of “The Safer Duel.” They explore how to plan a strategy for a duel.

Let’s reframe the topic of this discussion from dueling pistols to dueling innovators, because you probably don’t need a strategy for planning a duel, but it would be useful to understand when to launch a product, a service or an interaction. Launch too early and you’ll miss the market. Launch too late and the competition will eat you. When do you pull the trigger on your launch?

Think of two innovative companies that are miles apart but slowly walking toward each other — the duel of innovation. In this example, each company is launching a product (though it could be a service or an interaction), and both companies are capable of launching it. Each company would wait to launch until its probability of launching successfully equals the other company’s chance of a failed launch. The only factor that matters in determining a strategy is the ultimate chance of success. At the ideal inflection point, the probability of a successful launch is one half for the company launching and one half for the company holding back. Logical deduction tells us that survival is best achieved at the distance (or timing) where the launching company has a one-half chance of success. Interesting, isn’t it? As your new year begins, the survival of your company’s new product offering hangs on the timing of the launch. Too early, and you’ll miss the market. Too late, and you’ll be beaten.
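
To make the timing rule concrete, here is a toy Python calculation; the linear accuracy curves are invented purely for illustration and are not drawn from Dixit and Nalebuff.

```python
# Toy model of the "safer duel" launch rule. The accuracy curves are
# invented for illustration: each side's launch-success probability
# rises as time passes and the "distance" closes.
def p_a(t):  # company A's chance of a successful launch at time t
    return min(1.0, 0.10 + 0.090 * t)

def p_b(t):  # company B's chance of a successful launch at time t
    return min(1.0, 0.05 + 0.095 * t)

# Rule: launch at the first moment your success probability equals the
# other side's failure probability, i.e. when p_a(t) + p_b(t) >= 1.
t = 0.0
while p_a(t) + p_b(t) < 1.0:
    t += 0.01

print(f"Launch at t = {t:.2f}: p_a = {p_a(t):.2f}, p_b = {p_b(t):.2f}")
# With these curves, both probabilities sit near one half at the
# crossover, matching the 50/50 inflection point described above.
```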

Playing to win

While you may not subscribe to the theory of “playing to win,” it’s doubtful you support the theory of “playing to lose.” Game theory can help balance the risks of launching a new product after your competition. Whether you’re playing tic tac toe or planning the next launch of your company’s flagship product — deep down everyone wants to win. Acknowledge that you’re in a duel of innovation.

 

Innovation and emergent strategies for competitive advantage

It’s easy to overestimate your ability to rapidly build value. Building commercialization strategies helps to ensure sustainable competitive advantages with economic benefits.

Strategies broadly fall into two categories: Corporate strategy, or “where to compete,” and business strategy, or “how to compete.” Often companies trip and fall awkwardly somewhere between where to compete and how. Today, we’ll adjust that trip to more of a fashionable skip.

Innovation strategies are emergent, not planned.

Finding a strategy

Articles that highlight concepts of strategy are plentiful, but there are few that define the boundary between having a strategy and not having one. Donald C. Hambrick and James W. Fredrickson help us identify what a strategy is not. Outsourcing is not a strategy. Being a low-cost provider is not a strategy. Chasing a global footprint is not a strategy. Surprisingly, operational effectiveness is also not a strategy.

Then what is a strategy?

Strategy (n): the art of sustainable value creation that builds unique competitive advantages and shapes the perimeter of an organization.

Business strategy, operational strategy, marketing strategy and financial strategy all slide into the dirty fishbowl known as strategy. A strategy is about how people throughout an organization should make decisions and allocate resources in order to accomplish key objectives. Hambrick and Fredrickson’s five major elements of a strategy, when applied to innovation, make strategy determination straightforward. Ask yourself these questions to determine if you have an innovation strategy:

  1. How is the innovation adding new value to the core ecosystem interaction?
  2. Are we building value rapidly? Are we creating a new value chain, or tapping an existing one?
  3. Is the product, service or interaction unique, or can it be imitated?
  4. How does our innovation provide a sustainable competitive advantage?
  5. Have we established a viable economic model for scale?

While peering into the fishbowl of strategy, it’s worth the time to assess whether your organization has a strategy. Innovation doesn’t have to be planned — it can emerge under a strategy umbrella.

Innovation from an emergent strategy

Henry Mintzberg was the first to articulate the concept of “emergent strategy.” Emergent strategy, also called realized strategy, is not intentional. In an article published in the Stanford Social Innovation Review, John Kania, Mark Kramer and Patty Russell say that emergent strategy gives rise to constantly evolving solutions that are uniquely suited to the time, place and participants involved. Emergent strategies can bend, break off and reconnect. They are not linear, and they can recombine to form unique value-creating economic benefits.

Leaders track the external and internal factors driving shifts in customer behavior. Monitor these we should: they can impact strategy, business models and operational approaches. Emergent strategies acknowledge that oversimplification kills innovation. For example, defining a product aligned to a single market segment, or attempting to create scale on a foundation of underskilled, overworked and unmotivated resources, are oversimplified strategies.

Curious if your organization is leveraging emergent innovation strategies? Answer these questions for a quick pulse:

  1. Do you have a mechanism in place to “sense” environmental business model changes?
  2. Are there processes in place for interventions for exogenous events?
  3. Have you established a co-creation approach for the next evolution of the innovation or your interaction experience?
  4. Is innovation predictable, or is the organization adapting and sensing as circumstances change over time — e.g., watching the competitors?
  5. Is the company dependent on decision-making frameworks, or does the decision system dynamic appreciate the human dynamics that can accelerate change?

Looking over your shoulder at competitors and copying innovation approaches will never result in sustainable competitive advantages. New directions are required.

Moving to an emergent model for innovation may be the answer.

First principle thinking for innovation

We could pigeonhole each company into one of eight types of strategies: planned, entrepreneurial, ideological, umbrella, process, unconnected, consensus and imposed. Mintzberg and co-author James Waters did a great job of articulating how strategic choice drives a strategy in a 1985 Strategic Management Journal article titled “Of Strategies, Deliberate and Emergent.” Rarely does an organization purely adopt a single type of strategy. More often, strategy development is a combination or hybrid of multiple types that together, in orchestration, establish the organization’s direction and intent.

Have you ever followed someone in a car to an unfamiliar location? It can be difficult to follow the car. If they bank hard and make a right, you have to whip your car to the right while jamming on the brakes to maintain a reasonable distance. What happens if you lose sight of the other vehicle? You become lost in an area you don’t know, with no idea which side street they went down or how they are getting to the destination. After all, if you knew how to get there, you wouldn’t have been following them. Then you pause and realize you have other options. You can call them and reset their position. They mention they’re waiting at the small café on Gordon Street, and you race over and continue to follow them until you reach the destination.

Following a friend in a car is similar to how businesses imitate the competition. What if you were unable to locate the car and then received a call that said the car was in Massachusetts. You’d likely quickly head to Massachusetts. But then you receive another call and learn that the car is in Florida. You’d then race down to Florida. However, if more and more reports started coming in saying the car was in South Carolina, Tennessee, Texas and Illinois, you’d realize that you can’t get to all those places. Let’s assume that each spotting of the car in this example was your awareness of a competitor’s strategy. For example, if they launched a new product, service or interaction, your organization could imitate it, but you have no idea of their “real” strategy. Chasing innovation doesn’t work.

Sun Tzu, the famous Chinese military strategist, observed some 2,500 years ago: “All men can see the tactics whereby I conquer. But what none can see is the strategy out of which great victory is evolved.” When we look at companies, what we observe is where they have been or the level of success of their latest tactic, not the future strategy.

Original ideas are not born from assumptions of the past. They arise from the strategic and calculated alignment of ideas into seemingly abnormal combinations, creating new causes. Leading organizations use first principle thinking to maintain a competitive advantage. First principles are foundational propositions or assumptions that can’t be deduced from any other proposition or assumption. In short, you can’t just read and connect the lines. There are no lines to connect.

Whether you’re building the corporate innovation capability or driving results for a department, ask these questions to validate you’re using first principles and not imitating:

  1. Does most of your innovation come from inside the organization, and not from merely asking customers what they want?
  2. When a problem arises, is the first thought to apply approaches the organization has already leveraged? Or to deconstruct the problem to determine new causes?
  3. Are new ideas suggested based on successful strategies of the past? Or alternatively, are new ideas deconstructed into the most fundamental components and evaluated for new value potential?
  4. When you ask your leadership team what new strategies should be explored, do you first hear about what your competitors are doing, or do you hear new original ideas?
  5. Can your competitors link together your current tactics to determine your real strategy?

Solving complex problems? Emergent strategies address complex problems. If you first hear about what the competitors are doing, it’s probable that you’re following that car, waiting for a call that will never come. Build a team that can create new ideas that are not necessarily linked to what was done in the past. Emergent innovation isn’t linear.

Explaining the 21st Century Cures Act: tackling the challenge of healthcare interoperability

New advances in medicine happen daily. But despite spending billions of dollars, we haven’t been able to connect the network of healthcare providers in the United States. The 21st Century Cures Act may help.

The 21st Century Cures Act, or H.R. 6, reauthorizes the National Institutes of Health (NIH) and provides other funding to the agency through FY 2020. We are walking away from merely treating disease and actively searching for cures — methods that resolve the root problem. New efforts rip off the band-aids of temporary relief to unmask the underlying problems. Lifesaving and life-improving therapies have the potential to transform our quest for faster cures.

Genesis of the bill

To accelerate the discovery, development and delivery of 21st-century cures is the bill’s goal. Said another way, the goal of this bill is to save lives.

The 21st Century Cures Act, introduced by Rep. Fred Upton (R-Michigan), was overwhelmingly passed by the House. Here’s a timeline of its journey to the president’s desk:

  • May 19, 2015: Rep. Fred Upton introduces bill.
  • Nov. 30, 2016: The House passes the bill.
  • Dec. 7, 2016: The Senate passes the bill.
  • Dec. 13, 2016: President Barack Obama signs it into law.

The act has three pillars: discovery, development and the delivery of cures.

1. Discovery ensures that the NIH is provided with a total of $4.8 billion in new funding. This monetary injection includes $1.5 billion to advance the Precision Medicine Initiative to drive research into genetic, lifestyle and environment variations of disease plus $1.8 billion to fuel Vice President Joe Biden’s Cancer Moonshot initiative and support the BRAIN (Brain Research Through Advancing Innovative Neurotechnologies) initiative to improve our understanding of diseases like Alzheimer’s.

2. Development addresses the fact that while advancements in human genome mapping have been impressive, translating them into FDA-approved treatments has proved difficult. The act responds by modernizing clinical trials (analyzing safety and efficacy data), encouraging the use of biomarkers (to assess how a therapy is working) and empowering the FDA to apply flexible approaches to reviewing medical devices. This section also provides the FDA with $500 million for regulatory modernization to retain the best and brightest scientists, doctors and engineers.

3. Delivery helps to put newly tested and approved drugs into the hands of patients, at the right time. Interoperability of electronic health records systems for a seamless patient experience is the essence of delivery. Patients need access to their complete health profile (longitudinal medical record) to fully realize the benefits of a learning healthcare system. Rounding out the bill is healthcare provider education to empower seniors with the latest medical technology.

Interoperability

The bill is lengthy: It has 25 sections and runs to 996 pages. The Energy and Commerce Committee of the House of Representatives published a helpful section summary that carves out the intent while omitting the exact letter of the bill. Our discussion will concentrate on interoperability, covered under “Title IV: Delivery,” where it is addressed in multiple sections:

  • Sec. 4001. Assisting Doctors and Hospitals in Improving Quality of Care for Patients: Encourages certification of health information technology for specialty providers.
  • Sec. 4002. Transparent Reporting on Usability, Security, and Functionality: Calls for the creation of a reporting system to gather information about electronic health record (EHR) usability and interoperability.
  • Sec. 4003. Interoperability: Supports the creation of a digital healthcare directory (a voluntary model framework and common agreement) to facilitate exchange and requires the HHS to defer to the private sector on health IT standards development.
  • Sec. 4004. Information Blocking: Grants the HHS Office of the Inspector General authority to assign penalties for practices that block the sharing of electronic health records.
  • Sec. 4005. Leveraging Electronic Health Records to Improve Patient Care: Encourages the exchange of health information between registries and EHR systems.
  • Sec. 4006. Empowering Patients and Improving Patient Access to their Electronic Health Information: Certification of patient-centered electronic medical records and promotion of health information exchanges to promote patient access.
  • Sec. 4007. GAO Study on Patient Matching: GAO study of methods for securely matching patient records to the correct patient.
  • Sec. 4008. GAO Study on Patient Access to Health Information: Authorizes the GAO to review barriers to access, healthcare complications and patient methods for requesting their health information.
  • Sec. 4009. Streamlining Transfers Used for Educational Purposes: Physicians are now exempt from reporting income received for the purpose of speaking or preparing materials for educational presentations on medical topics.
  • Sec. 4010. Improving Medicare Local Coverage Determinations: Decisions by a Medicare administrative contractor (MAC) about whether to cover a particular service must be made public on its website.
  • Sec. 4011. Medicare Pharmaceutical and Technology Ombudsman: Establishment of a new role to address problems relating to coverage of new and life-saving technologies.
  • Sec. 4012. Medicare Site-of-Service Price Transparency: Requires that estimates of Medicare items and services, along with beneficiary liability (cost to the beneficiary), be publicly available on a website.
  • Sec. 4013. Telehealth Services in Medicare: Establishes a bipartisan working group focusing on telemedicine to explore improvements for dually eligible conditions (Medicare, Medicaid and chronic conditions) that might improve with telehealth.

A future connected

The Energy and Commerce Committee published a useful fact sheet reviewing the bill’s goals of helping patients through biomedical innovation. The progressive pace of scientific advancements must get better at translating discoveries into cures for patients. H.R. 6 champions the quest for faster cures by doing the following:

  1. Removing barriers to increased research collaboration.
  2. Incorporating the patient perspective into the drug development and regulatory review process.
  3. Measuring success and identifying diseases earlier through personalized medicine.
  4. Modernizing clinical trials.
  5. Removing regulatory uncertainty for the development of new medical apps.
  6. Providing new incentives for the development of drugs for rare diseases.
  7. Helping the entire biomedical ecosystem coordinate more efficiently to find faster cures.
  8. Investing in 21st-century science and next-generation investigators.
  9. Helping to keep and create jobs in the United States.
  10. Reducing the deficit by over $500 million.

Feb. 17, 2017, will mark eight years since the $35 billion HITECH Act (Health Information Technology for Economic and Clinical Health Act) was passed. What has HITECH accomplished? It’s generally accepted that adoption of electronic health records is more widespread and that market competition dynamics have improved. Sharing of patient lab results, radiology reports and summary-of-care records improved, on average, 48.5 percent after HITECH passed.

The 21st Century Cures Act amends the HITECH Act by requiring providers of healthcare services to establish goals, strategies and recommendations by Dec. 13, 2017. Acknowledgment that patient-focused drug development is required is a positive start to finding better cures in 2017.

How to create a blue ocean strategy for healthcare with blockchain

Increased commoditization drives leaders to search for uncontested markets with the desire to make the competition irrelevant. Blockchain is creating blue oceans: industries that do not exist today.

Blockchain can be used for organizational differentiation, by creating new market spaces with unique organizational positions leading to strategic advantages over time. We’ll explore how blockchain can unlock new blue ocean strategies for healthcare.

First, briefly: What are blockchains?

In 2008, in response to the global financial crisis of 2007-08, Satoshi Nakamoto wrote a paper titled “Bitcoin: A Peer-to-Peer Electronic Cash System.” The paper suggested that “trusted third parties” could be eliminated from financial transactions using blockchain technologies. Blockchain makes it possible, for the first time in history, to remove — or disintermediate — the middleman from business transactions, and by doing so improves the value of existing products, services and interactions.
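
At its core, a blockchain is an append-only sequence of blocks, each cryptographically committed to the block before it. The Python sketch below is a minimal illustration, not Bitcoin’s actual implementation (which adds proof-of-work, a peer-to-peer network and transaction scripts); it shows why tampering with any block breaks every later link in the chain.

    import hashlib
    import json
    import time

    def hash_block(block: dict) -> str:
        """Compute a SHA-256 digest over the block's canonical JSON form."""
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def make_block(data: str, prev_hash: str) -> dict:
        """Create a block that commits to its payload and its predecessor."""
        return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

    # Build a tiny three-block chain.
    genesis = make_block("genesis", prev_hash="0" * 64)
    block2 = make_block("payment: A -> B, 10 units", prev_hash=hash_block(genesis))
    block3 = make_block("payment: B -> C, 4 units", prev_hash=hash_block(block2))

    # Tampering with any earlier block changes its hash and breaks every link
    # that follows, which is what lets participants skip the trusted middleman.
    assert block3["prev_hash"] == hash_block(block2)
    genesis["data"] = "tampered"
    assert block2["prev_hash"] != hash_block(genesis)  # the chain no longer verifies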

Value innovation for healthcare

The book Blue Ocean Strategy, by W. Chan Kim and Renée Mauborgne, introduces the term value innovation, focused on making the competition irrelevant through the creation of new leaps in value. Value innovation is uncovered when companies align innovation with utility, price and cost positions. The result has the potential to create uncontested market spaces, or “blue oceans.”

The authors of Blue Ocean Strategy evaluated business launches at 108 companies and found that 86 percent were product line extensions, which accounted for 62 percent of total revenues and 39 percent of total profits. In contrast, the other 14 percent of launches, focused on blue ocean strategies, accounted for 38 percent of total revenues and 61 percent of total profits. Blue ocean strategies pay off, and they can pay off for healthcare, too. Blue Ocean Strategy was first published in 2005; this discussion also draws on the expanded 2015 edition.

Creating blue oceans

Using the framework for building a compelling blue ocean strategy, we will apply the Blue Ocean Strategy Canvas to healthcare across three areas:

  1. Where the competition is investing.
  2. Factors on which the industry currently competes across products, services and interactions.
  3. What patients receive for a standard market offering.

Through this visual diagnostic, I’ll present the strategy canvas for the as-is state of primary care and of the broader healthcare industry, and then, with a twist, the future state of healthcare. Based on the as-is state, we’ll extend the strategy canvas to show the future state in a value-based, blockchain-enabled world.

Strategy canvas for healthcare

The actions framework first helps define the as-is foundation for building a blue ocean strategy. It also provides clarity on where competitors are investing today and how patients receive (or don’t receive) value from healthcare providers. Let’s brainstorm the factors upon which the healthcare industry competes and invests. Using these levers may also increase efficiency and tune performance.

Hospital perspective

  1. Cost per procedure: Is the procedure costing more or less compared to other similar procedures?
  2. Clinical labor costs: What is the clinical cost per discharge?
  3. Administrative labor costs: What is the overall nonclinical cost per discharge?
  4. A/R days due to coding: How long does it take to code and submit a claim?
  5. Claims denial rate: Are claims rejected for inaccurate information?
  6. Readmission rate: Does the patient come back for the same reason as the initial visit?
  7. Inventory loss due to expiration: Is waste impacting operational margins?
  8. Outlier average length of stay: Is quality impacting reimbursement?
  9. Patient satisfaction: Is treatment quality impacting brand?
  10. Brand: Is the brand of the provider known?
  11. Compliance risk: Is the documentation trail complete and audit-ready?
  12. Appeals: Is clinical and nonclinical information accessible for legal appeals?

Patient perspective

  1. Price transparency: Does the patient understand the value for the cost?
  2. Price flexibility: Does the patient have price alternatives?
  3. Out-of-pocket costs: How much will this cost the patient?
  4. Appointment availability: Is the doctor available?
  5. Time with physicians: Does the patient have time to ask questions?
  6. Patient satisfaction: Is treatment quality impacting brand?
  7. Waiting time at the provider: Is the patient waiting?
  8. Most insurance accepted: Is the patient covered?

The 2016 Healthcare Primary Care As-Is Strategy Canvas provides a revealing view. The chart graphically illustrates where primary care is leading and where it’s lagging, from the perspectives of both the patient and the hospital.

[Figure: Healthcare Primary Care As-Is Strategy Canvas]
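
Since the original chart can’t be reproduced here, the sketch below shows how such a strategy canvas can be drawn with matplotlib. The factors come from the lists above, but the 1-to-5 offering-level scores are hypothetical placeholders, not the article’s actual values.

    import matplotlib.pyplot as plt

    # Hypothetical offering-level scores (1 = low, 5 = high), for illustration only.
    factors = ["Price transparency", "Out-of-pocket costs", "Appointment availability",
               "Time with physicians", "Patient satisfaction", "Waiting time",
               "Claims denial rate", "Compliance risk"]
    primary_care = [2, 2, 3, 2, 3, 4, 3, 4]

    plt.figure(figsize=(9, 4))
    plt.plot(factors, primary_care, marker="o", label="Primary care (as-is)")
    plt.ylabel("Offering level (1 = low, 5 = high)")
    plt.xticks(rotation=45, ha="right")
    plt.title("Strategy canvas sketch: primary care as-is")
    plt.legend()
    plt.tight_layout()
    plt.show()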

Broadening the scope to include specialty care, nursing care, and primary care offers new insights into how patients and hospitals define value.

[Figure: Healthcare Strategy Canvas]

The four actions framework     

Value innovation is not the same as technology innovation. Ultimately, for a technology to attract the masses, it must make consumers’ (or, in our case, patients’) lives more convenient, more straightforward or more fun.

The four actions framework works to break the trade-off between differentiation and low cost. Red ocean strategies focus on either differentiation or low cost, whereas blue ocean strategies concentrate on both. Let’s review the four actions briefly.

  1. Eliminate: Remove factors from industry standards.
  2. Reduce: Move factors below industry standards.
  3. Create: Generate factors new to the industry.
  4. Raise: Move factors above industry standards.

This exercise helps to identify where to remove waste, reduce redundant processes, discover new value and amplify embryonic value. Focusing only on primary care, where could we refocus our energy?

Eliminate

  • None

Reduce

  • Waiting time at the provider
  • Administrative labor costs
  • A/R days due to coding
  • Claims denial rate
  • Compliance risk

Create

  • Truth, not trust
  • Appointment flexibility

Raise

  • Price flexibility

Creating a blue ocean strategy for healthcare is less about what patients need and more about what patients don’t need. It’s about eliminating waste and removing what is not necessary or required for a healthcare interaction. 

By filtering our chart to only include primary care, we can model where value can be extended and renewed — exploring blue oceans.

[Figure: Healthcare Future State]

Blockchain and blue ocean strategies

Healthcare is riddled with challenges and non-value-add processes. How can blockchain add value? The goal is value innovation, not technology innovation; we saw how that worked out for Motorola. Apple appears to be doing fine with the iPhone, which sold more than 230 million units and accounted for 66 percent of its $233.7 billion in FY 2015 revenues.

While value innovation is not the same as technology innovation, technology innovation does deliver value that can enable the discovery of blue ocean strategies. Using our healthcare strategy canvas and the four actions framework, we can identify areas ripe for blue ocean strategies that can benefit from blockchain technologies.

  1. Waiting time at the provider (reduce): Complex, repetitive registration processes create patient and provider friction. Blockchain-based distributed patient registries (ledgers) can reduce patient intake time (see the ledger sketch following this list).
  2. Administrative labor costs (reduce): Less paperwork to fill out at the provider location means fewer administrative staff are needed to process it, reducing overhead costs.
  3. A/R days due to coding (reduce): Providers would update health events in a blockchain-type ledger, enabling other healthcare delivery entities (payers, third-party healthcare providers, labs) to access the same information and reducing the cycle time for coding and for recoding due to errors.
  4. Claims denial rate (reduce): A distributed healthcare system registry (ledger) could house claims, decreasing errors due to manual entry or rekeying.
  5. Compliance risk (reduce): Transparency of claim activity in a universal health event registry (ledger) would allow auditors to trace claims from beginning to end, decreasing compliance risk and cost.
  6. Truth, not trust (create): Today, we trust third parties, such as hospitals and urgent care facilities, to protect patient records. Tomorrow, a distributed patient registry (ledger) may house patient and claim information, improving transparency for claim processing and payment.
  7. Appointment availability (create): When an appointment is needed, a patient calls a provider and attempts to schedule one. There are typically several back-and-forth interactions before a date and time are set. Why? Often multiple services are required; for example, an annual physical that also requires lab work. A universal health appointment registry (ledger) could help align hundreds of schedules without requiring any single healthcare delivery system to own the registry.
  8. Price flexibility (raise): With blockchain-based distributed patient registries (ledgers), claims information would be available from providers through a distributed network. Do you wonder what the average price for a procedure is in your town, for your age, or at a few specific provider locations? That question could be answered.
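
To make the registry idea concrete, here is a minimal sketch of an append-only, hash-chained health event ledger. It is illustrative only; a real distributed registry would add consensus across nodes, access control and encryption of the payloads.

    import hashlib
    import json
    import time

    class PatientRegistryLedger:
        """A toy append-only registry; each entry commits to the one before it."""

        def __init__(self):
            self.entries = []

        def _entry_hash(self, entry: dict) -> str:
            return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

        def append(self, patient_id: str, event: str) -> None:
            prev = self._entry_hash(self.entries[-1]) if self.entries else "0" * 64
            self.entries.append({"ts": time.time(), "patient": patient_id,
                                 "event": event, "prev_hash": prev})

        def verify(self) -> bool:
            """Recompute every link; any edited entry breaks the chain."""
            return all(self.entries[i]["prev_hash"] == self._entry_hash(self.entries[i - 1])
                       for i in range(1, len(self.entries)))

    # Registration, coding and claims events share one verifiable history.
    ledger = PatientRegistryLedger()
    ledger.append("patient-123", "registration completed at clinic A")
    ledger.append("patient-123", "claim submitted: annual physical")
    assert ledger.verify()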

Repetitive registration processes and the requirement for multiple signatures on duplicate forms are a waste of time for patients and providers. We need blue ocean strategies in healthcare: discovering uncontested market spaces from a blockchain mindset.

 


Healthcare colored with blockchain’s open-source foundation

Self-sovereignty and identity anonymity hold the key to unlocking blockchain’s potential to change patient health. The architecture of the internet has changed forever.

Technological change drives economic growth. Technology extends the science of discovery and produces artifacts used in everyday life. It’s the small technical discoveries that make larger scientific endeavors possible. And it’s these seemingly unrelated breakthroughs that make their way into our daily lives.

Apparently insignificant discoveries become significant

In the 1960s, NASA conducted an extensive test program to investigate the effects of pavement texture on wet runways. The goal was to better understand braking performance and reduce the incidence of aircraft hydroplaning. The result of years of technical study was that, in 1967, pavement grooving became an accepted technique for improving the safety of commercial runways.

One of the first runways to feature safety grooving was the Kennedy Space Center’s landing strip. However, the applications of this technique extended well beyond NASA facilities. According to the Space Foundation, safety grooving was later included on such potentially hazardous surfaces as interstate highway curves and overpasses; pedestrian walkways, ramps and steps; playgrounds; railroad station platforms; swimming pool decks; cattle holding pens; and slick working areas in industrial sites such as refineries, meat-packing plants and food processing facilities.

If you asked a cattle rancher in 1970 if his work would be affected by NASA’s research on braking patterns exploring ground vertical load, instantaneous tire ground friction coefficient or free-rolling wheel angular velocity, the answer would probably have been an emphatic “not a chance.” Likewise, if you had told the workers on a road crew in 1970 that they’d be spending many years of their lives adding grooves to the surfaces of existing highways, bridges, and exit ramps, their response would have been less than welcoming. It would have been impossible to convince these professionals of the coming changes.

The impact of technology on daily life starts with scientific and technological discoveries that initially appear isolated or narrow in context. But we know better.

5 MIT projects to watch

The MIT Internet Trust Consortium, established in 2007, focuses on developing interoperable technologies around identity, trust and data. Its mission is to develop open-source components for the internet’s emerging personal data ecosystem, in which people, organizations and computers can manage access to their data more efficiently and equitably. That ideological goal fits nicely with the growth of blockchain technologies.

Currently, there are five cutting-edge MIT projects that could change the future of the internet: MIT ChainAnchor (permissioned blockchains), Project Enigma (autonomous control of personal data), OpenPDS2.0 (a personal metadata management framework), DataHub (a platform for collaboratively analyzing data) and Distributed User Managed Access Systems (DUMAS) (a protocol for authorizing and accessing online personal data).

The white papers for each project make for interesting reading, and when thinking with a healthcare mindset, it’s easy to imagine their applications to health and wellness. Let’s review the projects.

MIT ChainAnchor

The proposed permissioned blockchain system stands in contrast to the permissionless, public blockchain used by Bitcoin. The system addresses identity and access control within shared permissioned blockchains, providing anonymous but verifiable identities for entities on the blockchain.

When applied to healthcare, ChainAnchor could, for example, enable participants in a medical study to maintain their privacy by allowing them to use a verifiable anonymous identity when contributing (executing transactions on the blockchain).
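
ChainAnchor itself relies on more sophisticated cryptography (zero-knowledge membership proofs); the toy Python sketch below substitutes a simple keyed-hash credential purely to convey the idea of an identity that is anonymous yet verifiable. Every name and parameter here is hypothetical.

    import hashlib
    import hmac
    import secrets

    # The registrar's key lets verifiers confirm group membership without
    # ever learning who is behind a pseudonym.
    REGISTRAR_KEY = secrets.token_bytes(32)

    def enroll(real_identity: str) -> tuple[str, str]:
        """Derive a random pseudonym and have the registrar attest to it.

        The registrar screens real_identity off-chain during enrollment,
        but nothing in the returned pseudonym links back to it.
        """
        pseudonym = secrets.token_hex(16)
        credential = hmac.new(REGISTRAR_KEY, pseudonym.encode(), hashlib.sha256).hexdigest()
        return pseudonym, credential

    def verify_member(pseudonym: str, credential: str) -> bool:
        """Check the attestation without learning the member's identity."""
        expected = hmac.new(REGISTRAR_KEY, pseudonym.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, credential)

    pseudonym, credential = enroll("study participant #7")
    assert verify_member(pseudonym, credential)  # verifiable, yet anonymous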

Project Enigma

Enigma is a peer-to-peer network that allows users to share their data with cryptographic privacy guarantees. The decentralized computational platform enables “privacy by design.” The white paper says that, for example, “a group of people can provide access to their salary, and together compute the average wage of the group. Each participant learns their relative position in the group, but learns nothing about other members’ salaries.” Sharing information today is irreversible; once shared, a user is unable to take that data back. With Enigma, data access is reversible and controllable. Only the original data owners have access to raw data.
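
The salary example can be approximated with additive secret sharing, one building block of secure multiparty computation; the sketch below is a deliberate simplification of Enigma’s actual protocol. Each salary is split into random shares, each party sums only the shares it holds, and only the group total (and hence the average) is ever revealed.

    import random

    MODULUS = 2**61 - 1  # arithmetic is done modulo a large prime

    def share(value: int, n_parties: int) -> list[int]:
        """Split a value into n additive shares; any n-1 shares reveal nothing."""
        shares = [random.randrange(MODULUS) for _ in range(n_parties - 1)]
        shares.append((value - sum(shares)) % MODULUS)
        return shares

    # Three people secret-share their salaries across the same three parties.
    salaries = [95_000, 62_000, 71_000]
    all_shares = [share(s, 3) for s in salaries]

    # Each party locally sums the one share it holds from every participant...
    party_sums = [sum(col) % MODULUS for col in zip(*all_shares)]

    # ...and only the combined result is opened: the total, hence the average.
    total = sum(party_sums) % MODULUS
    print(total / len(salaries))  # 76000.0, with no individual salary disclosed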

In the context of healthcare, patients could share information regarding personal genomics linked to disease registries and clinical treatments aligned to healthcare outcomes, knowing that their original data was not shared.

OpenPDS2.0

OpenPDS introduces SafeAnswers, an innovative way to protect metadata (application, document, file or embedded) at an individual level. As the white paper explains, SafeAnswers “allows services to ask questions whose answers are calculated against the metadata instead of trying to anonymize individuals’ metadata.” SafeAnswers gives individuals the ability to share their personal metadata safely through a question-and-answer system. Previous mechanisms for storing personal metadata (cloud storage systems and personal data repositories) don’t offer data aggregation mechanisms, and once access is enabled, the data is broadly accessible. SafeAnswers reduces the dimensionality of the metadata before it leaves the safe environment, thereby ensuring the privacy of the data.

Healthcare metadata examples could include patient account number, patient first and last name, and date of admission. Healthcare research could benefit from using aggregated metadata from patients without sharing the raw data. Research entities would send questions to an individual’s personal data store (PDS). This PDS would respond with an answer. Today, if metadata information was provided to researchers or accessed from a phone application, the patient could disable (uninstall) the app but wouldn’t know what information was shared. With the SafeAnswers system, a patient potentially would use their PDS URL to provide access to the health app. All of the patient’s metadata, therefore, could be tracked and recorded — visible to the patient. Later the patient could access metadata the application was using to create patient inferences. Also, the patient could permanently remove the metadata the application was consuming by either limiting or permanently restricting future access. No trusted third party. No entity to monitor access. Anonymously shared data.
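
A toy model of that question-and-answer pattern might look like the following; the record fields are hypothetical examples drawn from the paragraph above, and the point is that raw metadata never leaves the store.

    class PersonalDataStore:
        """A SafeAnswers-style sketch: services submit approved questions and
        receive low-dimensional answers; raw metadata stays inside the PDS."""

        def __init__(self, records: list[dict]):
            self._records = records  # private; never returned to callers

        def answer(self, question: str) -> int:
            # Only questions the data owner has approved are computed.
            if question == "admissions_in_2016":
                return sum(1 for r in self._records
                           if r["date_of_admission"].startswith("2016"))
            raise PermissionError("question not approved by the data owner")

    pds = PersonalDataStore([
        {"patient_account": "A-1001", "name": "J. Smith", "date_of_admission": "2016-03-02"},
        {"patient_account": "A-1001", "name": "J. Smith", "date_of_admission": "2016-11-19"},
    ])
    print(pds.answer("admissions_in_2016"))  # 2: an answer, never the records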

Discoveries that transform society

The DataHub and Distributed User Managed Access Systems (DUMAS) projects offer additional pieces for solving the challenge of exchanging information while maintaining identity anonymity. Maybe they, too, can apply to healthcare, if we’re creative.

Highly technical advances have shaped the social economy for centuries. The creation of the sickle, a handheld agricultural tool with a curved blade typically used for harvesting grain crops, had a profound impact on the Neolithic Revolution (the Agricultural Revolution). Who would have imagined when it was invented (18000 to 8000 B.C.) that the sickle would form the basis for modern kitchen knives with serrated edges?

Small, seemingly insignificant discoveries transform societies. How blockchain technologies will affect people’s daily lives remains to be discovered. When blockchain applications enhance our lives, they may become as commonplace as highway safety grooving.