Cloud Lowering Barriers Use Cases

Introduction

This page captures some example use cases for the adoption of cloud-computing technology and approaches.

Use Case - Adverse Events and Cloud Computing

Presenter: Karen Carr-Gearing (on behalf of Adrian Cottrell) from GSK

Patient Safety is the highest priority for every pharmaceutical company. Robust processes must exist to ensure the proper identification, collection, analysis and reporting of Adverse Events. Timely management of Adverse Events arising from marketed products or clinical trials supports regulatory compliance and protects patient safety. General business process challenges include:

  1. Managing Adverse Event case volumes efficiently and effectively
  2. Meeting short cycle times, especially for Serious Adverse Events
  3. Responding to ever-changing global regulatory requirements
  4. Protecting highly confidential data

In response to the above challenges, pharmaceutical companies must have robust IT systems supporting this mission-critical business process. Systems must have high availability and high quality, and be adaptable to changing regulatory requirements. Systems must also comply with evolving data privacy requirements. Many companies are moving to COTS products, yet costs remain high due to licensing and internal hosting/infrastructure. Each pharmaceutical company works to the same set of regulations and requirements, so sharing of infrastructure (cloud) and/or IT systems (SaaS) would offer synergies and savings.

Discussion Points

  • Use of Cloud Computing:
Given the importance and confidentiality of this data, pharmaceutical companies are reluctant to place un-anonymized data outside of company firewalls. Any experience with this data on the Cloud?
  • Experience:
SaaS services for Adverse Event management are increasing and have been considered by some companies. The maturity of the SaaS services and risk are key factors in the decision to host systems internally.
  • Barriers that have been overcome:
There is a range of opportunities to reduce risks and barriers such as Private Cloud, Single-Tenant solutions, Encryption, etc. Would these effectively mitigate the risk of Adverse Event data on the cloud?
  • Barriers Anticipated:
Internal Risk Managers, Privacy Officers and Business Owners of the Adverse Event Management process would require significant justification for placing AE data on the cloud.

Summary

Adverse Event Management is critical to patient safety and compliance. Each pharmaceutical company must meet the same regulatory requirements, yet each maintains expensive in-house IT infrastructure and systems. We remain reluctant to embrace Cloud or SaaS solutions due to the confidentiality of Adverse Event data. It would be beneficial to have an agreed industry position on the use of Cloud technology for Adverse Event data.

Use Case - Batch Processing of Data Using Cloud Computing

The technology behind cloud computing opens a number of avenues for data processing that defy traditional paradigms. Processing large quantities of data can place extraordinary demands on a traditional IT infrastructure. The amount of data to be processed needs to be projected and planned for, from the perspective of both storage and processing. This can be a nightmare for administration and capacity planning - what is needed today can be planned for, but the further the projections reach into the future, the harder they are to get right. As our datasets get larger, this problem is not going to get smaller.

The cloud provides a computing resource that exists only when it is needed. Instances are entirely disposable, and a company is only charged for actual use of the system, rather than having a set of hardware depreciating away within its data centre. Cloud providers have innovated around their platforms, creating technologies such as AWS auto-scaling - an innovation that works well for e-commerce vendors because it automatically scales capacity up and down with demand. Can we use this scaling up and down to drive return on investment - paying only for what we need, when we need it? There are multiple dimensions to the computing resource as well: we can increase the number of processing nodes, or increase the throughput of individual nodes by changing the CPU/memory/persistent storage type. Requirements can be tweaked without much deliberation: jobs running too slowly - add more nodes; nodes constantly overloaded - use larger instances. Not having to buy the hardware gives an organisation much more flexibility in what it adopts. A sketch of this elasticity follows below.
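
As a minimal sketch of what such elasticity can look like in practice - assuming an AWS account with credentials configured and an existing Auto Scaling group, where the group name, size limits and CPU target below are all hypothetical - a target-tracking policy holds average CPU utilisation near a chosen level, adding nodes under load and removing them when idle:

  import boto3  # AWS SDK for Python; assumes credentials are configured

  autoscaling = boto3.client("autoscaling")

  # Hypothetical group: scale between 0 nodes when idle and 20 under load.
  autoscaling.update_auto_scaling_group(
      AutoScalingGroupName="batch-workers",
      MinSize=0,
      MaxSize=20,
  )

  # Target-tracking policy: add or remove instances to keep average CPU
  # utilisation around 60%, so you pay only for capacity actually in use.
  autoscaling.put_scaling_policy(
      AutoScalingGroupName="batch-workers",
      PolicyName="keep-cpu-near-60",
      PolicyType="TargetTrackingScaling",
      TargetTrackingConfiguration={
          "PredefinedMetricSpecification": {
              "PredefinedMetricType": "ASGAverageCPUUtilization"
          },
          "TargetValue": 60.0,
      },
  )

The same two calls, with different limits or a different instance type in the group's launch configuration, cover the "more nodes" versus "larger instances" trade-off described above.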

Hadoop is an open-source implementation of the Map-Reduce paradigm originally described by Google. Map-Reduce provides an ideal environment for parallel processing of datasets, and Hadoop handles all the coordination (sending jobs to nodes, collecting responses/data, handling node failure). This can be a complicated system to set up inside a corporate network, and it often requires specialist IT skills to initiate and maintain. Ideally, computing could be a black box - data goes in, programs manipulate it, and results come out. The cloud makes this possible by making such environments accessible with minimal setup overhead - with a few clicks you can have a fully featured Hadoop instance ready to process your data, as sketched below. This instance includes all the monitoring and coordination you could need, without requiring any specific expertise, and can be updated as needed.
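
To make the paradigm concrete, here is a minimal Map-Reduce sketch using Hadoop Streaming, which lets the map and reduce steps be written as plain Python scripts that read stdin and write stdout. The input layout assumed here (one record per line, with the term of interest in the last tab-separated field) is hypothetical:

  #!/usr/bin/env python
  # mapper.py - emit (term, 1) for each input record.
  # Assumes one record per line; the record layout is hypothetical.
  import sys

  for line in sys.stdin:
      fields = line.rstrip("\n").split("\t")
      term = fields[-1].strip().lower()
      if term:
          print("%s\t1" % term)

  #!/usr/bin/env python
  # reducer.py - sum the counts for each term. Hadoop sorts by key,
  # so all lines for a given term arrive contiguously.
  import sys

  current, count = None, 0
  for line in sys.stdin:
      term, value = line.rstrip("\n").split("\t")
      if term != current:
          if current is not None:
              print("%s\t%d" % (current, count))
          current, count = term, 0
      count += int(value)
  if current is not None:
      print("%s\t%d" % (current, count))

Hadoop runs many copies of the mapper in parallel across the nodes, shuffles and sorts the intermediate pairs, and feeds them to the reducers - all of the coordination described above, with no coordination code in the scripts themselves.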

There are existing solutions for automated deployment of cloud-computing environments (using platforms such as Chef or Puppet) that can be incorporated into a validated build, giving you a standardised development environment. APIs are a central tenet of the cloud, and teams using these technologies can easily build frameworks around a cloud provider's offerings to integrate with internal processes and systems, as sketched below. Many integration providers already supply the 'plumbing' for moving data transparently and securely between private networks and the cloud.
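
As an illustration of this API-centric approach - a sketch assuming AWS Elastic MapReduce, where the cluster name, release label, bucket paths and instance types are all hypothetical - a single API call provisions a complete Hadoop cluster, runs the streaming job above, and disposes of itself when finished:

  import boto3

  emr = boto3.client("emr")

  # One call: provision a Hadoop cluster, run the job, then terminate
  # automatically so no hardware sits idle afterwards.
  response = emr.run_job_flow(
      Name="batch-processing-example",
      ReleaseLabel="emr-6.15.0",  # assumed EMR release
      Instances={
          "MasterInstanceType": "m5.xlarge",
          "SlaveInstanceType": "m5.xlarge",
          "InstanceCount": 4,
          "KeepJobFlowAliveWhenNoSteps": False,  # disposable cluster
      },
      Steps=[{
          "Name": "streaming-job",
          "ActionOnFailure": "TERMINATE_CLUSTER",
          "HadoopJarStep": {
              "Jar": "command-runner.jar",
              "Args": [
                  "hadoop-streaming",
                  "-files", "s3://example-bucket/scripts/mapper.py,"
                            "s3://example-bucket/scripts/reducer.py",
                  "-mapper", "mapper.py",
                  "-reducer", "reducer.py",
                  "-input", "s3://example-bucket/input/",
                  "-output", "s3://example-bucket/output/",
              ],
          },
      }],
      JobFlowRole="EMR_EC2_DefaultRole",  # assumed default EMR roles
      ServiceRole="EMR_DefaultRole",
  )
  print("Cluster id:", response["JobFlowId"])

Because the whole environment is declared in one request, the same call can be wrapped in an internal framework and included in a validated, repeatable build.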

The essential transience of the cloud means that data is only there long enough to be processed; when the job is done, the environment and the input data are deleted, as in the sketch below. If the data needs to persist, there are data warehousing solutions available in the cloud as well, often well integrated with other cloud-based components. As more companies adopt the cloud, further solutions appear, such as integrated enterprise reporting environments.
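
A hedged sketch of that clean-up step, again assuming AWS and using hypothetical identifiers: once the results have been copied out, the cluster is terminated and the staged input removed, leaving nothing behind in the cloud:

  import boto3

  emr = boto3.client("emr")
  s3 = boto3.client("s3")

  # Dispose of the processing environment (hypothetical cluster id)...
  emr.terminate_job_flows(JobFlowIds=["j-EXAMPLE"])

  # ...and delete the staged input, so the data's stay in the cloud
  # lasts only as long as the job itself. Bucket/prefix are hypothetical.
  paginator = s3.get_paginator("list_objects_v2")
  for page in paginator.paginate(Bucket="example-bucket", Prefix="input/"):
      for obj in page.get("Contents", []):
          s3.delete_object(Bucket="example-bucket", Key=obj["Key"])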

Other industries have been able to leverage the cloud for processing large datasets - we could surely benefit from this as well.

Discussion Points

  1. What platforms are currently being used (within and outside the Pharma industry)?
  2. What issues are there with using this type of Data Processing?
    1. Logistical
    2. Practical