Information Faster Blog

Addressing incompatibilities in healthcare IT with money from the stimulus bill

By Richard Shelby Dunlap

$787 billion.  Unless you’ve avoided television, radio, and newspapers for the last week, you already know this is the amount of the latest US economic stimulus bill, signed into law a couple of days ago.  Ten billion dollars of this stimulus will go to the National Institutes of Health (NIH), with $8.2 billion left to the NIH director’s own discretion.  A further $17 billion in incentives is included for health care providers to adopt electronic health records (EHR). 

That’s a fairly large amount of money earmarked for a generic term, and somewhat confusing, as I expect a sizable portion of the health care providers in America are already using, or planning to use, an electronic medical record (EMR) system to track patients.  The Certification Commission for Healthcare Information Technology (CCHIT) has in fact certified dozens of products in the EHR category.  The industry is not in need of EHR per se; rather, it is in need of compatibility and interoperability.  Hospitals have an inordinate number of systems, ranging from patient admittance to x-rays to billing.  It is a wonder sometimes how the entire system manages to function, efficiently or not. 

Americans live with incompatibility in our daily lives.  For example, this Tuesday, Feb 17, was the originally proposed deadline to switch from analog to digital television (Congress recently moved this date to June 12), a change that will prevent some 5.8 million US households, or 5.1% (according to Nielsen Co.), from receiving television over-the-air (the old TV and rabbit ears).  5.8 million households.  That’s a fairly big incompatibility problem, and to be fair, the analog-to-digital conversion has been in the works since 2005, so those US households have had a fair amount of opportunity to find any number of possible solutions.  Nevertheless, I’d hate to see that many households without access to news and entertainment.

Another example of incompatibility we suffer from is in the cellular technology arena.  US consumers have four major competitors in the cellular market.  The four generally use differing technologies for the transmission of cellular data (voice or otherwise), and differing frequencies within those technologies.  To make matters worse, those frequencies differ from region to region in the world.  Buy one of those new 3G GSM-based phones in the US, travel to Japan, and find yourself unable to get a signal.  Take that same 3G phone to Europe and, depending on the country, find you’re surfing at EDGE or GPRS speed (a much slower legacy technology).  Finally, just try to switch carriers and take your new whiz-bang phone with you.  Only two of those carriers use similar basic networks (but not the same hi-speed frequencies!).

In the end, I’d much rather deal with those kinds of incompatibilities than those in the healthcare system.  The ones in healthcare ultimately cost me dramatically more than having to buy a new digital converter box or a new cell phone.  I only wish it were that simple.

In 2005, the United Kingdom began its journey toward a centralized EMR by 2010, recognizing that the EMR was the best path to further progress and a better patient experience.  Perhaps now is the time for the US to use this stimulus to incentivize health care providers, and the application vendors behind the systems they use, to settle on universal standards that will rid us of some of these incompatibilities, yielding greater mobility and, just hopefully, a more efficient and productive system for patients. 

In unrelated news, I just spent the weekend in San Diego, CA, taking in the 2009 USA Sevens Rugby tournament, the latest round in the IRB Sevens World Series 2008/09.   Congratulations go to the team from Argentina, who took the crown today (Sunday).  The USA Rugby team also deserves praise for reaching the Cup semi-finals for the first time, where they were edged out by the day’s ultimate winners.  I personally would like to thank all 16 teams that competed at this year’s event and gave the entire crowd a fantastic show!  I can hardly wait for next year’s event.


RSNA 2008: 5 questions healthcare IT should ask exhibitors

By Lisa Dali 

  1. How can I guarantee that PACS images stored in a long-term storage infrastructure are secure?

  2. Can a single vendor’s solution be used to build an enterprise image and storage management environment?

  3. How does a PACS storage system help me improve patient care in my organization?

  4. What are the pros and cons of depending on an HSM scheme for image management?

  5. How does your solution help me reduce recovery time and recovery point objectives?

Visit HP's booth #6622 at RSNA 2008 to learn how the HP Medical Archive solution can help you improve patient care, facilitate compliance, reduce RPO and RTO, and enable better clinician collaboration!

RSNA 2008: 4 tips to maximize time with storage software exhibitors

Tip #1: Determine how the vendor can help you create an enterprise image and storage management environment.  This is an environment that allows you to virtualize connected devices and the storage underneath so your clinicians can collaborate better and faster.

Tip #2: Determine how the vendor’s solution can help you distinguish between transactional data and medical fixed content.  This distinction will help you begin to align business and clinical value with storage media so you can grow PACS storage tiers appropriately.  This is an essential component of building an enterprise image management environment.

Tip #3: Determine how the vendor can help you consolidate long-term storage of data from multiple PACS and imaging applications so you can reduce storage management costs across your enterprise.  To be part of an enterprise image and storage management environment, it’s imperative that the consolidated environment be able to neutralize disparate data formats for true collaboration in a heterogeneous environment.

Tip #4: Visit HP Information Management Software at booth #6622 to learn how the HP Medical Archive solution can help you build an enterprise image storage management environment.

Parallel Fetch: What it is, and how it might help your workflow...

By Shelby Dunlap

The HP Medical Archive Solution (MAS) integrates with dozens of PACS systems, big and small.  For those PACS systems that store their studies via the Common Internet File System (CIFS) and/or Network File System (NFS) standards, studies will be stored either as a single container (containerized) or as the individual images that comprise the study (non-containerized).

HP MAS incorporates a Gateway Node that virtualizes the grid-based storage residing behind it, and in doing so it stores cached copies of recent images locally for faster retrieval.  If a PACS system attempts to retrieve an image that is not stored in this local cache, the Gateway Node must retrieve the image from permanent storage.  This “cache miss” of course carries some associated overhead, let’s say 100 ms, which would be tolerable for a single image or a handful of images.  If, however, the PACS’s normal operation is to retrieve the entire study, and let’s say it is a 125 MB 64-slice CT scan containing 250 files, then the total overhead from cache misses could approach 25,000 ms. 
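The arithmetic above is simple enough to sketch.  A minimal Python illustration, using the assumed (not measured) 100 ms cache-miss figure from the example:

```python
# Assumed figures from the example above; neither is a measured value.
CACHE_MISS_MS = 100       # per-image cache-miss penalty on the Gateway Node
FILES_IN_CT_STUDY = 250   # images in the example 64-slice CT study

def total_miss_overhead_ms(num_files, miss_ms=CACHE_MISS_MS):
    """Worst case: every image in the study misses the local cache."""
    return num_files * miss_ms

# 250 files x 100 ms = 25,000 ms (25 seconds) of added retrieval latency
overhead = total_miss_overhead_ms(FILES_IN_CT_STUDY)
```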

Now it’s important to recognize that when a PACS system retrieves an image or a study from the long-term archive for review, it is common practice for those images to be stored again in first-tier storage, or some other higher-speed storage device.  That process, and all the related database activity that might go along with it, adds its own per-image overhead on the PACS side. 

For this example, let’s assume that overhead is 10 ms per image, and that during this time the HP MAS is idle.  If we know that the PACS system will retrieve the rest of the images from this study, we can use a feature in HP MAS called Parallel Fetch.  Parallel Fetch streamlines retrieval of the additional images within a given study as soon as the first image is retrieved, speeding up exam viewing for some PACS applications.  This is advantageous for PACS that do not containerize all images in a study prior to archival and where there is per-image overhead in the PACS system during retrievals. 

Quite simply, when the first image in a given store path is retrieved, and Parallel Fetch is turned on for that store path, HP MAS automatically begins retrieving the rest of the images and building local cached copies on the Gateway Node.  Since each image in this sample study carries 10 ms of overhead on the PACS side, for every 10 images the PACS processes, at least one additional image has been preemptively cached before it is requested.  In reality, networking and CIFS/NFS overhead can dramatically slow the transfer of each image, adding further idle time for the MAS and compounding the benefit of Parallel Fetch in this environment. 
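To see why the overlap matters, here is a toy timing model of the behavior described above.  It is a sketch under stated assumptions, not HP's implementation: a cache miss costs `fetch_ms`, the PACS side spends `pacs_ms` per image, and with Parallel Fetch the Gateway Node is assumed to prefetch images sequentially starting when the first request arrives.

```python
def study_retrieval_ms(num_images, fetch_ms=100, pacs_ms=10, prefetch=False):
    """Total time to deliver a study under a simplified timing model."""
    clock = 0
    for i in range(1, num_images + 1):
        # With Parallel Fetch, image i is assumed cached by i * fetch_ms;
        # without it, every request pays the full miss penalty serially.
        ready = i * fetch_ms if prefetch else clock + fetch_ms
        clock = max(clock, ready) + pacs_ms
    return clock

serial = study_retrieval_ms(250)                    # 250 * (100 + 10) = 27,500 ms
overlapped = study_retrieval_ms(250, prefetch=True)  # PACS work hides behind fetches
```

With the blog’s numbers the overlap only hides the 10 ms of PACS-side work per image, but as networking and CIFS/NFS overhead inflate the PACS-side time (try `pacs_ms=120`), the archive’s fetches hide almost entirely behind PACS activity and the gap between the two figures widens considerably.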

So, in summary, if your PACS system retrieves entire non-containerized studies, you may be a candidate for the HP MAS Parallel Fetch feature.

Learn best practices to improve your medical archive environment

On November 6, HP Information Management Software is providing healthcare IT managers, directors, CIOs, PACS administrators, and clinical department heads with an educational opportunity: learn how an enterprise image storage environment can help you meet the four new accountabilities created by HIPAA that we blogged about recently.

REGISTER NOW!  Enterprise Image Storage: Moving Beyond the Department Silo

THURSDAY, November 6: 2 PM EST; 1 PM CST; 11 AM PST

Listen and learn from an in-depth presentation by healthcare IT industry expert John Koller, as he discusses the future of image and data storage in the healthcare provider world.  Mr. Koller will address today’s mounting pressures for image management, archiving, and distribution.  He will also discuss options for consolidation across the healthcare enterprise.  The deep dive into the Clinical Information Lifecycle Management strategy will show you how enterprise image management will enable healthcare IT to facilitate compliance, reduce costs, and improve patient care.

Key points to be discussed:

  • Digital healthcare enterprise—Overview

  • Effects of imaging department growth on the output of image retrieval

  • Data Lifecycle Management vs. Information Lifecycle Management

  • Enterprise storage consolidation—A proactive approach

The silent killer of healthcare IT data centers

For more than two decades, the month of October has been known as a “cancer awareness month” for one of the deadliest cancers that strikes women (and men).  This blog won't allow me to say it by name, but public displays of the color pink, from ribbons to cookware, symbolize the fight against this “silent killer” and underscore the decades of healthcare research that continues to generate patient data requiring long-term storage.  Some data is highly valuable from the beginning and remains so for long periods of time, whereas other data types may not be of clinical value until much later on. 

In the U.S., retention requirements vary by state and assorted factors, such as procedure, demographics, and diagnosis.  Oncology (cancer) studies, especially mammogram imaging studies, frequently fall into a special category that can require longer retention, enabling clinicians and researchers to review past studies in case something is found later on.  In Europe, where there are no HIPAA-like laws yet, this "pink ribbon" cancer is setting a standard for long retention durations.  A few years ago Finland, Sweden, The Netherlands, Iceland, and the United Kingdom implemented nationwide mammography screening and retention programs.  Finland paved the way, requiring that its more than 2.6 million women between the ages of 50 and 70+ have a mammogram every two years, with a retention period of 50 years.  So healthcare IT must ensure that these studies, which can be upwards of 160 MB each, are retained, preserved, secure, intact, and, most importantly, accessible for 50 years.

Situations like this underscore healthcare IT's “silent killer”: hardware obsolescence.  Regardless of the data's value on day one or day 18,250 (the 50-year mark), data has to be readable any time, any place.  At the enterprise level, healthcare provider organizations must architect long-term imaging and archival storage environments that enable them to provide clinicians with 24x7 access to data wherever needed.  The following five elements are what a long-term medical information management environment should be in order to provide data accessibility and protection from hardware obsolescence:

1)      Integrated and unified: The environment needs to be forwards/backwards compatible to enable continuous access to information as software versions and hardware change.  New software versions need to be validated by the software vendor to run on older hardware with support from that software vendor.  This will help ensure that data remains readable as the devices that created it and the media on which it was originally stored are replaced.

2)      Open: Healthcare data-generating environments are heterogeneous by natural selection.  But by implementing software technologies that can both communicate with multiple types of medical devices simultaneously and neutralize disparate data formats, you will simplify data migration from older to newer technology and enable collaboration between departments, facilities, and locations. 

3)      Scalable on-demand: The environment should be modular so you can start with what you need and integrate additional modules/components into the unified system over time.  Combined with the open standards nature of the medical information management environment, this will reduce the pain of migration to new devices and enable you to capitalize on Moore’s Law.

4)      Performance-centric: The underlying archival storage environment must support the top layer of medical devices (e.g., PACS).  As the top layer changes, it is imperative that the archival storage (or support) layer remains compatible with it to ensure that performance of the hospital information systems (HIS) is maintained and that access SLAs to clinicians are met.

5)      Data center-efficient: Last, but not least, it’s imperative that you can upgrade and streamline the medical information management environment in ways that allow you to reduce power, cooling, and floor space where possible.   Having an environment that is unified, open, and scalable on-demand is an essential enabler of data center efficiency.

In 2008, the 19th Annual HIMSS Leadership Survey showed that high-quality patient care and patient safety continue to be top-of-mind business objectives that influence technology investments.  Given that, an imaging and archival storage technology environment with the elements described above will support these objectives.  It will propel research efforts to cure diseases by ensuring that data, regardless of format, age, or original storage media, can be read and evaluated at any time as part of a study going on today or well into the future.

In healthcare we trust?

I just spent five days in two hospitals.  Not doing marketing research as HP’s Medical Archive solution (MAS) product marketing manager, but as the daughter of someone who needed emergency cardiac surgery a few days ago.  These past five days changed both my family and my perspective as a healthcare marketeer.  After spending years in the healthcare provider industry marketing software solutions for medical image archival storage, I saw first-hand how hospitals can struggle in emergency situations to manage the mix of challenges including patient care and safety, compliance (e.g., HIPAA), and remote doctor collaboration.

In the U.S., HIPAA’s Security Rule (2005) mandates that healthcare organizations have a contingency plan for emergency situations pertaining to paper and electronic personal health information (PHI) records.  If access is not available, then risks to patient safety can rise quickly, along with risks of non-compliance.  Given the somewhat vague nature of laws like HIPAA, contingency plans can consist of any mechanism that resolves the problem.  But piecemeal solutions for emergency patient data access can still put patient care at risk with longer wait times, and ultimately increase medical image storage management costs.  Here’s how I saw that from the other (i.e., non-marketing) side this week.

Last Friday afternoon the local 45-bed Veterans Administration (VA) hospital that originally admitted my father performed an echocardiogram (echo).  This imaging study was part of the acute treatment plan, which included surgery the next day; as such, its clinical value was very high.  A few hours later we were transferred to our city’s leading referral hospital, a 577-bed health system in Northern California and the region’s only level one trauma center.  But guess what wasn’t transferred?  The echo imaging study.  Only the written report arrived at the new hospital.  This trauma center health system covers 33 counties and more than 65,000 square miles for 6 million people.  It’s only 10 miles away from the VA hospital and works hard to achieve a main objective of keeping the region’s preventable death rate at or below 1% (half the national average).  Yet this large hospital didn’t have the infrastructure to access critical (high value) images that were less than 10 hours old and 10 miles away.  Because of this, it was forced to deploy a resource-consuming contingency plan: a second echocardiogram the same night.

This contingency plan is within the realm of the acceptable per HIPAA, and it solved the problem of incomplete patient data.  But it demonstrated the need for hospitals to streamline healthcare IT with a remote image sharing/collaboration environment.  As I discussed in a blog entry on best practices to lower healthcare storage TCO a few weeks ago, development of an image management and sharing environment in which disparate data formats across locations are neutralized is key to improving patient care, speeding data access, enabling online collaborative treatment, and improving compliance contingency plans.

HP Information Management Software has been working with image management layer (IML) software vendors to enable HP MAS to be the foundation for unified medical fixed content archival storage across disparate facilities and sites.  With this integration, HP MAS gives healthcare providers enterprise-wide access to patient information from a common repository (i.e., not piecemeal) regardless of the spectrum of imaging applications in the environment.  When clinicians and researchers within or across hospitals can quickly access highly valuable data in emergency situations, they are able to reduce wait times, collaborate faster, improve diagnoses and treatment plans, and meet objectives to keep preventable death rates as low as possible.   For more information on how HP MAS and IML integration can unify your healthcare IT environment, improve SLAs, and facilitate compliance visit

My name is Lisa Dali and I approve this message.

Secure your medical images now--Learn how!

Hurricane Ike and flash drives have more in common than you might think.  If you are a healthcare IT manager, CIO, PACS administrator, or clinical department head, then these data loss mechanisms can spell disaster should your hospital or imaging center be unprepared.  Data loss in healthcare happens more frequently than organizations would like to admit (see recent news headlines).  All too often healthcare organizations across the world gain unwanted PR when a data loss or security breach turns them into an overnight headline.

Whether the loss of patient records stems from a natural or a man-made disaster, healthcare providers need to implement data security measures that safeguard them from patient safety risks and bad publicity while helping them meet governance regulations.  But understanding how to translate and apply data security and compliance regulations to your organization, so you can ensure the confidentiality, integrity, and accessibility of patient data, is tough.

That’s why HP and our partner Iron Mountain joined forces to provide you with an educational webcast that will teach you how to: mitigate risks to patient safety, maintain high availability to patient data, and ensure fast recovery when disasters of nature or man strike.  Register for the webcast today and download a complimentary Frost & Sullivan healthcare article on disaster preparation so you won’t become an undesirable headline.


Educate yourself today!  “Leading Strategies: Keeping Your Medical Image Data Secure”

 Learn best practices for:

  • Data security

  • Governance

  • Disaster recovery

  • Managed services

3 ways to reduce medical image storage TCO

Keeping storage capacity ahead of demand is nirvana to healthcare IT managers, CIOs, and PACS administrators.  With the average hospital running ~150 applications, generating between 60,000 and 500,000 new imaging studies per year, and requiring ~60 TB of storage, long-term costs for managing patient data are rising.  Frost & Sullivan reports that storage hardware is only 25 percent of the total cost of managing medical information.  Myriad factors contribute to the bulk of storage TCO; a big factor increasing non-hardware costs is the full-time equivalent (FTE) resources spent managing manual processes, disparate data silos, and piecemeal storage solutions.

Here are three ways you can reduce storage TCO to improve care, compliance, and collaboration.

1)      Consolidate.  Reduce storage silos by consolidating data from multiple imaging applications (e.g., PACS) into a long-term archive.  The archive must be integrated, not a piecemeal solution that you will have to build and individually manage.  Consolidation across single- or multi-site organizations also helps you meet SLAs to your doctors by giving them faster access to patient data.

One step further: Ensure the archive communicates with applications via open standards and does not modify data formats in a proprietary way.  This is key to ensure that future applications can be cost-efficiently integrated. 

2)      Tier cost-effectively.  Ensure that multiple storage tiers can be integrated into the consolidated archive.  Employ data discovery and classification techniques to understand the clinical value of the data so you can select the right tiers as that value changes.  If the tiers can be integrated by third-party software but not managed by the archive, storage TCO will go up as FTEs manage the pieces.

One step further: Develop business policies that the archive can automatically mediate to move data between integrated storage tiers as its value to your organization changes.  This will enable FTE resources previously spent on data migration to be reallocated, reducing storage TCO.

3)      Streamline IT.  A centralized management console will streamline IT operations regardless of the archive configuration.  The built-in console should give your IT staff a centralized view, from anywhere on the network, of storage utilization per imaging application, site, and integrated storage resource, and full insight into the hardware and software status of each archive component.

One step further: Integrate your long-term archive with image management software to neutralize disparate data formats.  This will give your clinicians access to all imaging data across the enterprise, enabling them to improve collaboration and patient care.
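The value-to-tier policy described in step 2 could be sketched as follows.  This is purely illustrative: the tier names, thresholds, and the "clinically active" flag are invented for the example, not drawn from any product.

```python
def select_tier(study_age_days, clinically_active=False):
    """Map a study's current clinical/business value to a storage tier."""
    if clinically_active or study_age_days <= 90:
        return "fast-disk"      # high value: recent or under active treatment
    if study_age_days <= 730:
        return "capacity-disk"  # medium value: likely comparative reads
    return "tape"               # low value: long-term retention only

def migration_plan(studies):
    """Return the (study_id, tier) moves a policy engine could automate."""
    return [(sid, select_tier(age, active)) for sid, age, active in studies]

plan = migration_plan([("CT-001", 30, False),
                       ("MR-002", 400, False),
                       ("XR-003", 4000, True)])   # active case stays fast
```

Automating this kind of mapping, rather than having FTEs migrate data by hand, is exactly where the TCO savings described above come from.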

Healthcare Report Card: “3 D’s” for medical archiving

By Lisa Dali 

One billion served per year.  What’s that?  No, not the latest regional number of McDonald’s patrons.  It’s the number of new medical imaging studies expected to be produced each year in the U.S. alone by 2014.  The volume of medical image data accumulating in healthcare provider data centers is growing exponentially, largely driven by the three “D’s” of healthcare: density, demand, and duplication.

Density of medical imaging studies has increased significantly over the past few years.  Three years ago the average diagnostic X-ray was about 40 megabytes per study.  Today mid-size hospitals generate upwards of 300,000 imaging studies per year, maintain roughly 150 different applications, and require about 60,000 GB or 60 TB of storage.  The healthcare technology “Big Bang” has produced state-of-the-art imaging modalities that generate images up to 500 megabytes per study.  While the density “Big Bang” benefits hospitals and imaging centers with efficiency and care delivery improvements, it leaves IT managers with a new average imaging study size to battle: 100 MB per study.
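A quick back-of-envelope check of those figures, treating the study counts and the new 100 MB average as givens (decimal units, 1 TB = 1,000,000 MB):

```python
def annual_storage_tb(studies_per_year, avg_study_mb, copies=1):
    """Rough yearly capacity estimate; 'copies' covers DR replicas."""
    return studies_per_year * avg_study_mb * copies / 1_000_000

# A mid-size hospital at 300,000 studies/year and a ~100 MB average
single_copy = annual_storage_tb(300_000, 100)             # 30.0 TB per year
with_dr_copy = annual_storage_tb(300_000, 100, copies=2)  # 60.0 TB per year
```

So a mid-size hospital at the new average adds roughly 30 TB of new imagery per year per copy, and every extra replica multiplies that figure, which is where the duplication "D" comes in.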

Demand for access to large and small imaging studies continues to mount.  Doctors require immediate, fast access to patient information because patient care is in jeopardy without it.  Governance regulations, such as HIPAA in the U.S., exacerbate the demand pressures by requiring that healthcare providers keep patient information for very long periods of time.  The fact that data value changes, and can fluctuate among high, medium, and low based on several variables, leads many healthcare providers, including every one I’ve ever spoken to worldwide, to institute their own retention period: forever.

Duplication of medical image data is, indeed, a pandemic.  Frost & Sullivan projects that U.S. imaging centers alone will require 100 million GB, or 100 PB, of storage by 2014 for just a single copy of all their imaging studies.  While storing two physically separate copies of patient data is a best practice when it comes to disaster recovery, the word “duplicate” is quickly being replaced by the word “triplicate” when it comes to patient data.  The quarantined data pools that populate the healthcare provider landscape make it very tough to determine how many replicas exist.  Even tougher is the hunt to find unnecessary replicas and either move them to lower-cost media more appropriate for long-term archival storage or, dare I say, delete them.

Getting even one “D” in school was bad enough, so what’s the treatment for “3 D” syndrome?  HP has it with our Medical Archive solution (MAS).  HP MAS is a multi-tier archiving appliance that helps healthcare providers eliminate the pains of the “3 D’s” by providing them with:

  • Rapid access to medical images

  • On-demand scalability to keep storage capacity ahead of demand

  • Governance facilitation with regulations around disaster recovery, business continuity and privacy

  • De-quarantining of silos to reduce storage TCO

So how about getting an A+ from the CIO vs. fighting the "3 D's" above?


Hospital IT Survives Powerful Gustav

By Lisa Dali

We just saw the resilient Southern United States quickly react to thwart a potentially major natural disaster.  Gustav’s might was blunted, in part, by Mother Nature and, in part, by business continuity practices employed in key life-and-death places.  Naturally the media focuses on the threat to life and limb, which is completely understandable, yet ironic considering how many reporters are sent into the storm’s path.  Of course, the major focus of everyone directly and indirectly involved in a situation like this is on saving lives, as it should be.  But in order to save lives out in the field, we first need to think about how to save the life of healthcare IT and maintain its heartbeat so that injured residents (and reporters!) can get the emergency medical care they need.

Today I read an interesting story in Healthcare IT News about a hospital in Baton Rouge, LA that was able to retain access to patient data and maintain 24x7 operations to treat injured residents.  While most of Louisiana was without power due to Gustav’s category 2 muscle, Ochsner Medical Center was able to keep the lifeline of the hospital up and running with backup power.  They had 100 inpatients undergoing treatment and were able to quickly respond to the influx of Gustav victims because of the disaster recovery mechanisms they put in place in the aftermath of Hurricane Katrina in 2005.  They were able to access electronic medical records because they maintained that lifeline.  This was critical not only for treating the inpatients, but also for retrieving patient history for the acute patients.

Compliance (e.g., HIPAA) is a major challenge that healthcare organizations must deal with when developing their IT infrastructures.  Disaster recovery is a major component of such governance requirements, and it was great to see this story in the news because it highlights the essential disaster preparedness that any life-and-death facility must have, regardless of location.  In essence, it’s the media focusing on the lifeline responsible for saving human lives.

The iPhone of Medical Archiving

By Lisa Dali

The iPhone has revolutionized access to information.  We can get on the Internet, rock out to some old-school Foreigner, share pictures, tune into YouTube, and text and talk, all from one device.  In fact, I’d bet there are a few of you sitting in a Starbucks reading this from your iPhone right now.  Why did it have such an impact on the cell phone market, besides the coolness factor?  It’s simple.  It’s got everything you need in one fast device.  

Simplicity is a great concept, one that HP has been applying in the health and life sciences market with the HP Medical Archive solution (MAS) since 2005.  In fact, MAS is really like the iPhone for medical archiving.  No wait, it’s better than that.  For one thing, we don’t have battery life issues!

HP MAS gives healthcare organizations fast access, high availability, and obsolescence protection in one fast archive.  MAS lets you simplify management of medical imaging data from numerous applications and sites, and consolidate long-term archival storage into one system.   See where I’m going here?  As imaging technology advances, the capabilities of HP MAS to manage access to information from a single repository are key to lowering costs, especially in the data center.  Clearly, floor space is at a premium, and power and cooling costs -- skyrocketing in today’s economy -- must be controlled and reduced.

Like the iPhone 3G, HP MAS was streamlined this summer.  MAS 3.5, launched in July, has been optimized to improve the efficiency of your healthcare data center.  We have integrated very compact storage servers into HP MAS, enabling us to double the storage density in a rack.  That means storing up to 190 TB in a single rack now.  For you, this translates to fewer racks on the floor and lower power and cooling costs in your data center.  Another way we help reduce storage costs and improve data center efficiency is by giving you the choice to integrate four storage tiers into HP MAS and manage them as one system.  The long retention durations in healthcare today make multi-tier integration in HP MAS a key enabler to reduce storage TCO and grow storage tiers appropriately.

So while you can’t rotate HP MAS on its side and expect it to display a scientific calculator, it is very simple and I can sum MAS 3.5 up in eight words: Twice as dense.  Half the data center drain.

"Information Democracy" Meets HP's MAS 3.5 Information Management Rules Editor

By Lisa Dali

Long-term archival storage of medical imaging data presents some big challenges.  Historically, healthcare IT departments used the average size of an X-ray (around 40 MB per study) as a metric in determining their storage needs.  However, in today’s world of advanced imaging technology, this metric is approaching 100 MB per study.  In addition, the fact that storage media is only about 25% of the total cost of managing information for the long-term is increasing the pressure on healthcare organizations.

Along with exploding study size comes a massive rise in the number of imaging studies performed annually.  One of the constants in healthcare is that the value of data changes over time, sometimes very rapidly.  Yet an "information democracy" persists that treats all data the same way, driving up total storage management costs.

The solution:  Build an archival storage infrastructure that allows you to place data on the right storage tier for the appropriate amount of time and better utilize resources currently spent managing data.

With HP's Medical Archive solution (HP MAS) you can integrate and centrally manage four storage tiers (SAN, SAS, SATA, and tape).  With the new configurable Information Management (IM) Rules Editor built into HP MAS, you can develop automated policies that migrate data between integrated tiers, reducing the resource-draining manual processes employed today.  The IM Rules Editor, which includes several other configurable capabilities, is fully automated by HP MAS and lets you grow storage tiers appropriately, aligning storage costs and retention policies with the business value of images.
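To make the idea of an automated tiering policy concrete, here is a minimal sketch of an age-based rule set. This is a hypothetical illustration in the spirit of the IM Rules Editor, not HP MAS configuration syntax: the tier thresholds, rule structure, and function names are all invented for the example.

```python
from dataclasses import dataclass
from datetime import date

# Tiers from fastest/most expensive to slowest/cheapest, per the four
# tiers named above (SAN, SAS, SATA, tape).

@dataclass
class TierRule:
    min_age_days: int  # studies at least this old...
    tier: str          # ...belong on this tier

# Hypothetical thresholds; a real policy would reflect local retention rules.
RULES = [
    TierRule(0, "SAN"),        # fresh studies stay on the fastest tier
    TierRule(90, "SAS"),       # after ~3 months, move to mid-tier disk
    TierRule(365, "SATA"),     # after a year, move to capacity disk
    TierRule(7 * 365, "tape"), # long-tail retention goes to tape
]

def target_tier(study_date: date, today: date) -> str:
    """Pick the tier whose age threshold the study has most recently crossed."""
    age = (today - study_date).days
    eligible = [r for r in RULES if age >= r.min_age_days]
    return max(eligible, key=lambda r: r.min_age_days).tier

today = date(2008, 7, 14)
print(target_tier(date(2008, 6, 1), today))  # SAN  (a few weeks old)
print(target_tier(date(2006, 1, 1), today))  # SATA (over a year old)
```

A periodic migration job would compare each study's current tier against `target_tier` and move any that have crossed a threshold, which is exactly the manual process the Rules Editor is meant to eliminate.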

Tune in next week to hear best practices for improving data center efficiency.

Access Medical Information Faster...


By Lisa Dali 

Healthcare is one of the fastest growing vertical industries worldwide (>13% CAGR).  It’s also one of the few vertical industries where insufficient and slow access to data can literally mean life or death.  Let’s face it, we’re all patients at one time or another, and, as the hospital gown-wearing crew, we demand the highest quality patient care. 

If a patient comes into an emergency room for acute treatment and prior data for comparative diagnosis isn’t available, patient safety is at risk.  Ultimately, our demands for quality of care place strong demands on our doctors and clinicians.  For them to meet our high standards, they tighten the tourniquet on IT and imaging departments by requiring the essential workflow enabler: Fast access.

Slow responses and long wait times are unacceptable.  These are daunting technical challenges to mitigate, especially with the significant growth in size and volume of annual medical imaging studies and retention requirements enforced at federal (e.g., HIPAA), state, and/or local levels.  In today’s dynamic healthcare world, how can organizations cost-effectively mitigate risks to patient safety and facilitate fast access to patient data?

HP has the solution.

The HP Medical Archive solution (HP MAS) is a specialized long-term archival storage appliance built to archive and rapidly retrieve diagnostic imagery data.  On July 14, 2008, HP released MAS 3.5.  One of the key new features is an enhancement to the existing fast cache capability, helping users further mitigate the risks associated with slow access.  With this release, HP has significantly increased the cache size (up to 2 terabytes) and optimized caching of even larger images/studies (up to 60 GB in size).  As before, you can select which data requires caching based on file type and data source, and this standard capability gives you more control over which images clinicians can access fastest.  The significant cache size increase means many more images/studies will be rapidly available for patient care than ever before.
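The selection logic described above can be sketched in a few lines. This is a hypothetical illustration, not an HP MAS API: the type and source names, the admission policy, and the function names are invented for the example; only the 2 TB cache size and 60 GB study limit come from the release notes above.

```python
# Hypothetical cache-eligibility policy: admit studies by file type and
# data source, within the per-study and total cache limits from MAS 3.5.
CACHE_LIMIT_BYTES = 2 * 1024**4   # 2 TB fast cache
MAX_STUDY_BYTES = 60 * 1024**3    # studies up to 60 GB may be cached

CACHEABLE_TYPES = {"DICOM"}                    # e.g., diagnostic imagery only
CACHEABLE_SOURCES = {"ER-PACS", "radiology"}   # e.g., prioritize acute care

def is_cacheable(file_type: str, source: str, size_bytes: int) -> bool:
    """Check a study against the type, source, and size criteria."""
    return (file_type in CACHEABLE_TYPES
            and source in CACHEABLE_SOURCES
            and size_bytes <= MAX_STUDY_BYTES)

def fill_cache(studies):
    """Greedily admit eligible studies until the cache is full.

    Assumes `studies` is already ordered by clinical priority.
    """
    used, admitted = 0, []
    for s in studies:
        if (is_cacheable(s["type"], s["source"], s["size"])
                and used + s["size"] <= CACHE_LIMIT_BYTES):
            admitted.append(s["id"])
            used += s["size"]
    return admitted
```

The point of the sketch is the shape of the control it gives IT departments: the fast tier is spent on the studies most likely to matter in an acute setting, rather than on whatever arrived last.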

Large prioritized cache management is just one of the key standard features of HP MAS that enable IT and imaging departments to meet clinicians' demands for fast access, allowing doctors and clinicians to remain focused on patient care.

In my next blog I'll share additional information about the MAS Rules Editor and how this feature facilitates, among other things, automatic migration of data between tiers.
