Information Faster Blog

Save $100 when you register now for HP's Information Management sessions at Software Universe

By Patrick Eitenbichler


HP Software and Solutions’ Information Management suite will be featured at the upcoming HP Software Universe 2010 in Washington, DC, June 15–18, 2010.


The IM suite, including HP Data Protector, HP Email, Database and Medical Archiving IAP, and HP TRIM records management software, will be represented in two tracks:



  • Data Protection

  • Information Management for Governance and E-Discovery


Customer case studies and presentations from product experts will highlight how HP’s Information Management solutions provide outcomes that matter. For more information about this event, or to register, please use code INSIDER at www.hpsoftwareuniverse2010.com and get $100 off the conference rate.

How three very different companies are managing rapid database growth

By Patrick Eitenbichler



I wanted to share three great customer success stories. The companies are very different from each other, but they’re all grappling with business challenges posed by surging data growth: meeting compliance obligations, controlling storage costs, and optimizing performance. All three turned to HP Database Archiving software to solve these problems, and more.



Tektronix, a U.S.-based provider of test and measurement solutions to the electronics industry, improved application and database performance by more than 47%, and aced compliance tests in 29 countries, despite data growth of 1.25 GB per month.



Tong Yang Group, a Taiwanese automotive parts manufacturer, experienced data growth averaging 30-40 GB per month, impacting database performance and causing user-related issues. Tong Yang saw an immediate 10% increase in order-handling efficiency, and it gained the ability to support 7% business growth in 2009 despite the economic recession.



Turkey’s Central Registry Agency is both a private financial services company and the country’s central depository for dematerialized securities. The agency’s database grew 1,000-fold in a one-year period, due in part to industry regulations requiring financial services firms to store more data for longer periods of time. With HP Database Archiving software, the agency met its growing data archiving needs while reducing storage costs by 50%.



To learn more about how these companies overcame their database growth challenges, click on their corresponding names above.

EMC announcement: More like "PromiseOne"

On April 2, 2009, EMC finally announced the long-awaited replacement of EmailXtender. No surprise. Actually, it looks like they tried to announce it on April 1 and then pulled all the links—perhaps fearing it would be seen as an April Fools’ joke. What isn’t a joke is that this product, called SourceOne Email Management, is not actually a one-source archive solution—yet. Like its predecessor, it still archives only one overall type of content: messaging. EMC says that later this year they will release file, XML, and SharePoint archiving. So, that’s when it will be “one source”? Not exactly. Why? Because the SourceOne product family is not integrated. Give them twelve to eighteen months—hey, they promised after all.

Bottom line: EMC’s announcement does not compare to the breadth and range of HP’s current offerings, and EMC is more than six months late to market with a product that does not even fulfill the key archiving needs it previously communicated to customers. Furthermore, SourceOne Email Management is merely a replacement for EmailXtender, and what EMC is delivering with this release is a promise of what the product could become over the next twelve to eighteen months. In these economic times, we need more than promises; we need ROI like what HP IAP customers have been achieving for more than four years:

  • Improving staff productivity by up to 80%, and email- and file-based productivity by over 34%

  • Lowering email and document processing, review, and production costs by up to 90%

  • Reducing the time needed to analyze email and documents from weeks to minutes

  • Achieving control of corporate data and improving information governance

To Stub Or Not To Stub, That Is The Question…

By André Franklin


Whether 'tis nobler in the mind to suffer
The slings and arrows of convenient mailbox message stubs,
Or accept the clean simplicity of stub-less archiving…


… well… that is the question…


Ok…enough Shakespeare. What are you talking about?


When email archiving is performed to enable better email management, HP calls it selective archiving. In short, selective archiving removes mail messages from mailservers. There are two popular selective archiving methods for managing mailservers:



  • with stubs

  • without stubs

So… what is a stub?


A stub is a substitute for a mailbox message that has been removed from a mailserver and placed into a dedicated archive. A stub contains a link to the original message and attachments that reside in the archive. The stub allows the original message and attachments to be retrieved from the archive through a user’s mailbox.


Only after messages are safely archived are they removed from the mailserver. Users remain within their mailbox quota limits as mailserver messages are deleted. This whole process improves email performance and reduces mailserver backup headaches. Assuming archive storage is lower cost per MB than Tier 1 mailserver storage, there are clear capital expenditure benefits for selective archiving.
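To make the mechanics concrete, here is a minimal Python sketch of the archive-then-stub sequence described above. The Message, Stub, and Archive classes are toy illustrations under my own assumptions, not HP’s actual APIs:

```python
from dataclasses import dataclass

@dataclass
class Message:
    msg_id: str
    subject: str
    body: str

@dataclass
class Stub:
    msg_id: str
    subject: str
    archive_url: str              # link back to the full message in the archive

class Archive:
    """Toy stand-in for a dedicated archive."""
    def __init__(self):
        self._store = {}

    def store(self, msg: Message) -> str:
        self._store[msg.msg_id] = msg
        return f"archive://demo/{msg.msg_id}"

    def verify(self, msg_id: str) -> bool:
        return msg_id in self._store

def archive_and_stub(mailbox: list, archive: Archive) -> list:
    """Replace each message with a small stub -- but only after the message
    is safely copied into the archive, mirroring the order described above."""
    result = []
    for msg in mailbox:
        url = archive.store(msg)                              # 1. copy to the archive first
        if archive.verify(msg.msg_id):                        # 2. confirm the copy exists
            result.append(Stub(msg.msg_id, msg.subject, url)) # 3. only then replace it
        else:
            result.append(msg)                                # never delete an unarchived message
    return result
```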


(Note: archiving strictly for compliance purposes never uses stubs, but compliance archiving can be performed in addition to a selective archiving strategy).


What do I gain with each approach?


Stubbing:


A stub appears in the same place in a user’s mailbox as the message it represents, allowing a single integrated list of both mailbox and archived messages. Stub messages are very small in comparison to the messages they replace. When used with policies that automatically remove, archive, and “stub” messages (often based on message age), users experience a sense of an “infinite mailbox” without the massive mailbox capacity that would give some mail administrators a heart attack.



Not-Stubbing:


There is no possibility of “stubbing” software causing problems with the mail client. Mail messages and message classes are not modified. Archived messages that have been removed from mailserver mailboxes are presented to users in a special “archive folder” (separate from the mailbox message view).


We’ll look at more of the benefits and “gotchas” of each approach in my next post.


 

Go Green; Retire Those Old Energy-Hogging Apps!

by Mary Caplice



I was reading an article today from Forrester (‘Q&A: The Economics Of Green IT’) about how companies can not only save money by going green but may also benefit from government incentives and utility company programs, and may even incur penalties in some regions for not going green in certain areas. The article suggests there are very compelling reasons for IT leaders to educate themselves on local incentives and penalties. Some green projects require an upfront cost that pays off later; some require none. For example, GE expects to save millions just by turning on Windows features like standby and hibernate!

IT can save capital and operating expenses, cooling costs, DBA time, facility square footage, and hardware and software license fees by retiring applications that are being kept alive in case they’re needed down the road for regulatory and compliance reasons. There’s even a secondary market for the retired hardware! One way to go about application retirement is to invest in HP Database Archiving software (excellent ROI potential is discussed in my recent blog ‘Death and Taxes - Maybe one can be avoided’).


 

Ignore it but it won’t go away!

By Mary Caplice



Although we’re all experiencing the effects of a worldwide economic downturn, organizations are finding certain things they can’t simply ignore until the economy improves. They have no choice but to invest time and technology in reducing the costs and risks associated with increasing data retention regulations and with the need to answer legal discovery requests quickly and efficiently. This problem is, of course, most concentrated in highly regulated industries such as insurance, financial services, and pharmaceuticals.



HP Database Archiving customers are finding that investing in our technology in these areas can really pay off!

IAP Retention Management – Future Ideas

By Ronald Brown


Today, the HP Integrated Archive Platform (IAP) manages retention at the “archive level” – meaning the archive itself is not only responsible for executing the retention management functionality, but is also the place where the retention settings are configured. This means the “archive” administrator has the responsibility to maintain the system in accordance with a company’s record retention strategies. This is certainly one approach; however, there are others that may give more flexibility.


As the number of applications that move data into an archive grows, it becomes more important to actually understand the business value of that data and to provide more flexible retention policies.   Perhaps the owners of the application data itself should be able to communicate their requirements to the IT personnel responsible for their data.  In this case the application itself should drive the retention policies.  This will help ensure that the retention policies are specific to the application and maintained by the application experts.  The archive itself will be the executor of these policies.  While this affords more flexibility, the downside is that it requires more attention in order to define these policies and maintain them – so sometimes a blanket policy works better, especially for customers who are reluctant to commit the time and effort involved in defining their corporate retention strategy in a granular manner.


Another interesting use case is where the archive only retains data that is under active investigation or discovery.  Here, the archive is loaded with, for instance, 3 years of corporate data.  Then, specific queries are performed and the resultant data sets are placed on hold.  After this process is completed, all data not on hold is released and removed from the archive.  This use case serves a specific customer base very well – even though it seems to defeat the intended purpose of the archive.


One can never “predict” what is best for a customer and how they will utilize their technology investments.  The key is to give enough flexibility so that all use cases can be explored.


 

Email Archiving: Choose Carefully…Very Carefully (Part 4)

 


By André Franklin


In part 3, we discussed seven principles. If these principles are observed, you are unlikely ever to need to migrate to a different archiving platform.


The seven principles are:




  1. Thoroughly understand your email environment


  2. Set clear archiving goals that will still make sense in 5 years or more


  3. Examine scalability in all dimensions


  4. Don’t treat email archiving as a silo. Consider other applications that need (or will need) data archiving


  5. Favor solutions built on standard interfaces for investment protection


  6. Backup and/or replication is more important than any other single feature


  7. Seek references of companies that have similar needs

We examined principles 1 through 3 in detail in part 3. Let’s examine a couple more principles in this post…


Don’t treat email archiving as a silo


We have heard from many users that email is the biggest pain with regard to implementing archiving. This applies to email archiving for compliance purposes, or simply to lighten the load on mailservers. As such, email is often the first archiving problem to be tackled. It’s a noble deed to take on the toughest problem first, but not a wise one if future archiving needs are not taken into consideration.


What will you need to archive in the future? Most environments have files. Many use Microsoft SharePoint to share departmental and corporate information and content. Then there are instant messaging systems, text messages, voicemail, and so on. There is also database data that can be selectively archived for improved database performance. To complicate matters, information management systems want to control what is stored, how long it is stored, and who has access to the stored information. All of this must be taken into account when implementing an archive.


In an ideal world, one can perform a single search across a massively scalable archive to retrieve data of various types from email to media files to financial records, etc.


All future archiving needs should be considered at the time the first archiving problem is tackled. If an archiving solution does not address the breadth of application data that you want or will need to archive…you run the risk of having to migrate your archive data to a new and scalable archiving platform in the future. As we have discussed in previous posts…”it ain’t gonna be pretty”…so make the right choices upfront.


Favor solutions built on standard interfaces for investment protection


Solutions built around standard interfaces mitigate certain risks with regard to data interchange -- if a migration ever becomes necessary. In addition to standard interfaces, solutions that expose well-documented APIs also mitigate risk. They allow you to roll your own solution and/or interface with other solutions and add-ons. You never really know everything you will want or need in the future, nor can you know of future products that will add value to your existing archiving investment. Standards and APIs help put the odds in your favor.


We’ll examine the remaining two of the seven principles in part 5 of this series.

Using Replication with the HP Integrated Archive Platform

By Linda Zhou


The HP Integrated Archive Platform (IAP) supports local and remote replication. There are two replication methods for copying data between two IAP systems: one-way replication and cross replication. These techniques are in addition to the disk-level mirroring built into IAP.


To illustrate how the replication works, let's look at two examples. Consider two IAP systems: one IAP system, IAP-USA, is in New York City, and the other IAP system, IAP-UK, is in London, UK.  We designate IAP-USA as the master and IAP-UK as the slave. User permissions are maintained at the master and replicated to the slave, for both one-way and cross replication.


One-way Replication


In this scenario, IAP-USA archives emails, while IAP-UK is dedicated solely to holding the data replicated from IAP-USA.


IAP-USA first archives emails into its Smartcells. These Smartcells are grouped in primary and secondary pairs. IAP-USA then sends its Smartcell data to IAP-UK to replicate the archived emails. IAP-UK has two options to store the replication data: one Smartcell or a pair of mirrored Smartcells. Email owners and compliance users can search emails in both IAP-USA and IAP-UK. This replication method is also called active-passive replication because IAP-USA is actively ingesting new emails and IAP-UK is passively replicating IAP-USA.


Cross Replication


In this scenario, both systems are archiving new emails, each replicating to the other system. For example, IAP-USA might archive the emails of North American users, while IAP-UK is responsible for archiving European users’ emails. The archived emails are stored in the primary and secondary Smartcells in IAP-USA and IAP-UK. Both email owners and compliance users can search their emails in IAP-USA and IAP-UK. This replication method is also called active-active replication because both IAP-USA and IAP-UK are actively ingesting new emails.


Replication Rates


The effective rate of replication is dependent on the rate of new emails being archived and the available network bandwidth between the two IAP systems. Because the peak time of new emails is during business hours, and less bandwidth may be available during that time, replication may fall behind. Generally, this should not be cause for alarm, as IAP will catch up during periods of reduced network traffic and email volume (e.g. overnight). However, if the replication backlog is consistently growing, the administrator should consider increasing the network bandwidth available for replication.
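As a back-of-envelope illustration of why a daytime backlog is usually harmless, here is a small calculation. All of the figures (ingest rates, bandwidth, business hours) are made-up assumptions for illustration, not IAP specifications:

```python
# Hypothetical figures for illustration only.
business_hours = 10                # hours/day of peak ingest
peak_ingest_gb_per_hr = 8.0        # new archive data during business hours
offpeak_ingest_gb_per_hr = 1.0     # new archive data overnight
replication_gb_per_hr = 5.0        # effective WAN throughput for replication

# Backlog accumulated during the business day:
daytime_backlog = business_hours * max(0, peak_ingest_gb_per_hr - replication_gb_per_hr)

# Spare replication capacity available overnight:
overnight_spare = (24 - business_hours) * (replication_gb_per_hr - offpeak_ingest_gb_per_hr)

print(f"Daytime backlog: {daytime_backlog:.0f} GB")            # 30 GB
print(f"Overnight catch-up capacity: {overnight_spare:.0f} GB")  # 56 GB
# 30 GB of backlog vs 56 GB of overnight capacity: the slave catches up.
# If the daytime backlog exceeded the overnight spare day after day, the
# backlog would grow without bound -- the signal to add bandwidth.
```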


 

Email Archiving: Choose Carefully… Very Carefully (Part 3)

By André Franklin


In part 2 of this topic, we raised the question, “How DOES one choose carefully?” We listed several basic principles to consider when choosing. If these principles are observed, you are unlikely ever to need to migrate to a different archiving platform.


The principles we listed were:




  1. Thoroughly understand your email environment.


  2. Set clear archiving goals that will still make sense in 5 years or more.


  3. Examine scalability in all dimensions.


  4. Don’t treat email archiving as a silo. Consider other applications that need (or will need) data archiving.


  5. Favor solutions built on standard interfaces for investment protection.


  6. Backup and/or replication is more important than any other single feature


  7. Seek references of companies that have similar needs.

Let’s examine a few of these principles in this post…


Thoroughly understand your email environment



Without a good understanding of your mail environment, you’ll be playing roulette when it’s time to purchase an archiving solution. You must know certain basics: storage capacity in terabytes, messages per second, average message size, maximum message size, and so on. You need to make sure the proposed archiving solution can keep up with traffic in your mail environment, and that it is of sufficient size to accommodate all of your terabytes of messages and attachments… with room for growth over the next several years.
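A quick sizing sketch shows why these basics matter. The traffic figures below are hypothetical placeholders, not recommendations; substitute your own measurements:

```python
# Hypothetical mail environment -- substitute your own measurements.
messages_per_second = 10     # sustained archive-bound traffic
avg_message_kb = 50          # average message size including attachments
retention_years = 5
growth_per_year = 0.20       # assume 20% annual growth in volume

seconds_per_year = 365 * 24 * 3600
total_tb = 0.0
rate = messages_per_second
for _ in range(retention_years):
    total_tb += rate * seconds_per_year * avg_message_kb / 1024**3  # KB -> TB
    rate *= 1 + growth_per_year

print(f"Raw capacity needed after {retention_years} years: ~{total_tb:.0f} TB")
# Capacity alone is not enough: the archive must also ingest 10+ messages
# per second at peak, or it falls behind your mail environment.
```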


Set clear archiving goals that will still make sense in 5 years or more



Are your goals strictly compliance or e-discovery oriented, such that you will remove data from the archive after “x” number of years? Are you trying to offload data from mailservers…and expecting archived data to grow and grow over the years? Or do you require all of the above?



If the solution you buy today is already designed to address the picture you have in 5 years… you have reduced much of the risk associated with the purchase of an archiving solution.


Examine scalability in all dimensions



One of the big mistakes many people make is assuming that capacity equates to scale. Just because a storage or archiving device can store the amount of data you want does NOT mean you’ll be able to retrieve it easily when needed. Archives must be designed to ingest data quickly (keeping up with traffic in your mail environment)…while allowing rapid search access to archived data. Message retrieval using search can take hours or days with some solutions…which may be quite unacceptable if a judge is waiting for you to produce information based on a number of search parameters.


Before you buy… examine scalability in ALL dimensions:




  • Search performance (retrieval time) for the environment size you envision in 5 years


  • Archiving performance (the number of “typical” messages per second that can be archived)


  • Capacity (the number of terabytes of messages the archive can hold)

Stay tuned…more on these seven principles in part 4 of this series…

Archiving Databases: Throwing the Baby Out with the Bathwater

By Mary Caplice


When archiving data from relational databases for either compliance or performance reasons, it is standard practice to archive at the business-transaction level of granularity rather than at the table, block, or partition level. In most cases it’s very straightforward to model a transaction for archiving so that the transaction is moved intact without leaving parts of it behind, except where many-to-many relationships cause transactions to become ‘chained’ to other transactions.


For example, you could have an application containing customers, invoices, and payment information. At first it may look like all invoices with a status of ‘CLOSED’ older than one year can be archived. However, payments can be linked across invoices, and in some cases partial payments are keeping some of these invoices ‘OPEN’. All invoices across the chain of payments must be considered, and any open part of a chain should disqualify the entire chain. Without support for chaining, the integrity of the application can be compromised.


Unless your archiving solution can chain these transactions together correctly, you could be left with dangling transactions! 
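Here is a minimal sketch of the chaining logic, using a union-find over a hypothetical invoice/payment schema (the table layout and data are illustrative, not how HP’s product models transactions). An entire chain qualifies for archiving only if every invoice in it is CLOSED:

```python
from collections import defaultdict

# Hypothetical data: invoice -> status, payment -> invoices it spans.
invoices = {"INV1": "CLOSED", "INV2": "CLOSED", "INV3": "OPEN", "INV4": "CLOSED"}
payments = {"PAY1": ["INV1", "INV2"],   # one payment across two invoices
            "PAY2": ["INV2", "INV3"],   # chains INV2 to an OPEN invoice
            "PAY3": ["INV4"]}

# Union-find to group invoices chained together by shared payments.
parent = {inv: inv for inv in invoices}

def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # path compression
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

for linked in payments.values():
    for inv in linked[1:]:
        union(linked[0], inv)

# A chain is archivable only if *every* invoice in it is CLOSED.
chains = defaultdict(list)
for inv in invoices:
    chains[find(inv)].append(inv)

for members in chains.values():
    archivable = all(invoices[i] == "CLOSED" for i in members)
    print(members, "-> archive" if archivable else "-> keep (chain has an OPEN invoice)")
# ['INV1', 'INV2', 'INV3'] -> keep (chain has an OPEN invoice)
# ['INV4'] -> archive
```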


 

Archiving for the Clouds!

by Janani Mahalingam


What does cloud computing mean? And what does it mean for the archiving world?


Cloud computing is the concept of getting IT resources by just turning a knob - just as you get hot water or cooking gas whenever you want. The resources are always available, and you are charged only for what you use. Hmm... isn’t that an interesting concept for IT? Whether it’s storage, servers, databases, software, disks, or CPUs - anything IT has to deal with - it is now handled in the cloud, and maintaining it is the cloud provider’s headache. A few providers, such as HP AIaaS (Adaptive Infrastructure as a Service), Amazon, Google, and others, have emerged in the last couple of years.


What does cloud computing mean for the archiving world? Once everything is centralized, every cloud will need to look into archiving to ease its maintenance pain. Archiving provides a number of advantages: performance improvement, control of expensive storage costs, compliance, control of legal hold issues, and so on. Every cloud provider will look into archiving solutions once it starts building its customer base.


Next step - Archiving solution for the Clouds...

Inquiring Minds want to know about Database Archiving – Part II

By Kevin O’Malley


As a follow-up to my last blog, here are additional questions I’ve been asked recently regarding database archiving projects.


Q6- This isn’t rocket science. Why do I need a software package for database archiving?
A6- Spoken like a DBA who’s got some extra time on their hands – just kidding. At first blush it does sound ‘easy’ to extract data to archive tables, and many of our existing customers have attempted custom archiving projects. What most organizations find out, either during the project or soon after go-live, is that archiving requires maintenance and upkeep. To gain the full benefits, archiving needs to be part of ongoing retention management. Application environments can be very dynamic, another factor that makes this harder than it seems. Additionally, archiving projects succeed only when archive access enables business users to loosen their stance of “I need all of my data in the production application”. You need to invest heavily in this area; otherwise users won’t let you archive a single piece of data.


Q7- Can’t I just extract data from my application databases and put it in my data warehouse?
A7- Yes, you most certainly can. ETL tools are great at this, and you’re most likely feeding some of the data (summarized or in detail) to the DWH already. What ETL tools were not designed for is deleting records from the source; you still need to retain the source ‘record of truth’ somewhere, and this is usually the biggest hang-up. If you’re doing ‘performance archiving’, trying to slim down your production environments, you need to move (copy and delete) entire transactions from the source and provide full access to the data. If your only goal is to offload reporting from the production system, then a DWH approach may fit the bill. Why not do both?


Q8- We don’t have time for new maintenance activities – does database archiving have to fit into my downtime window?
A8- No. HP Database Archiving allows you to run archive cycles with the database up and application users online, because we move data as complete transactions, retaining database and application consistency at all times. Archiving jobs can be run at any time that fits into your normal operations, similar to running your batch reports. You can even run jobs for a set amount of time and simply terminate, archiving as many transactions as possible in the allotted window. Both the production system and the archive remain in a consistent state, making it easy to archive whenever it fits into your schedule.
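A minimal sketch of the idea behind A8, using Python’s sqlite3 purely for illustration. The two-table schema and the batching are my own assumptions; in the actual product this logic lives in HP’s run-time engine, not in user code. Each complete business transaction is copied and deleted atomically, so stopping at any point leaves both sides consistent:

```python
import sqlite3
import time

def archive_window(src: sqlite3.Connection, dst: sqlite3.Connection, window_secs: float):
    """Move closed orders, one complete business transaction at a time, until
    the time window expires. Assumes a hypothetical schema: orders and
    order_lines, three columns each. Terminating at any point leaves both
    databases consistent: the worst case after a crash is a duplicate copy
    attempt in the archive (caught by keys in a real engine), never a
    half-moved transaction."""
    deadline = time.monotonic() + window_secs
    while time.monotonic() < deadline:
        row = src.execute(
            "SELECT order_id FROM orders WHERE status = 'CLOSED' LIMIT 1").fetchone()
        if row is None:
            break                                    # nothing left to archive
        oid = row[0]
        order = src.execute(
            "SELECT * FROM orders WHERE order_id = ?", (oid,)).fetchone()
        lines = src.execute(
            "SELECT * FROM order_lines WHERE order_id = ?", (oid,)).fetchall()
        with dst:                                    # atomic copy into the archive
            dst.execute("INSERT INTO orders VALUES (?, ?, ?)", order)
            dst.executemany("INSERT INTO order_lines VALUES (?, ?, ?)", lines)
        with src:                                    # atomic delete of the whole transaction
            src.execute("DELETE FROM order_lines WHERE order_id = ?", (oid,))
            src.execute("DELETE FROM orders WHERE order_id = ?", (oid,))
```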


Q9- How long does archiving take to run?
A9- I’m going to give an ‘it depends’ answer here. It depends on your server environment, storage, etc. Moving millions of rows in minutes is pretty common, even over networks. Moving millions of referentially intact data sets and performing atomic deletes (all or nothing) gets a little more involved. After all, this is your production system – there’s no room for error (as described above). HP’s standard approach is to move data transactionally, safely, and reliably. There’s no coding involved – it’s built into the run-time engine. Additionally, HP Database Archiving achieves high transfer rates using configurable parallel workers; based on your specific environment, you can configure the number of workers to maximize throughput.


Thanks again.


 

Implementing a Global Database Archiving Strategy

by Kevin O'Malley 


 


Recently, HP hosted a customer case study with Tektronix that detailed their global database archiving strategy. The following is an interview-style blog with Lois Hughes of Tektronix on some of the key elements of their implementation. If you'd like to view the complete webinar, please follow this link - Tektronix Case Study Webinar.


 


Lois, what are the guiding principles for Tektronix when it comes to information management?


 


We have three core tenets. First, business data must be viewed as an information asset, and like any asset it must be managed according to its 'useful life'. The useful life, or retention period, must be documented and well understood by both business and IT. These assets can become liabilities if they are kept beyond their useful life - in other words, don't keep data longer than legally required.


 


Sounds good - so how do you accomplish this?


 


At Tektronix we take our application data through three phases to manage its lifecycle: Current State, Long-term Retention, and Final Form. Depending on the application and data type, all transactions must flow through these phases in succession, and ultimately to deletion. Current State is the only part of the lifecycle in which data is transactional and actively updated - the other phases are read-only/reporting phases that meet business and audit requirements.
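The three-phase lifecycle maps naturally onto a small state machine. A sketch of that model in Python (the phase names come from the interview; the code itself is illustrative, not Tektronix’s implementation):

```python
from enum import Enum

class Phase(Enum):
    CURRENT_STATE = 1        # transactional: data actively updated
    LONG_TERM_RETENTION = 2  # read-only / reporting
    FINAL_FORM = 3           # read-only / reporting
    DELETED = 4              # end of useful life

# Phases must be traversed in succession; only Current State is writable.
NEXT = {Phase.CURRENT_STATE: Phase.LONG_TERM_RETENTION,
        Phase.LONG_TERM_RETENTION: Phase.FINAL_FORM,
        Phase.FINAL_FORM: Phase.DELETED}

def advance(phase: Phase) -> Phase:
    """Move data to the next lifecycle phase, in strict succession."""
    if phase is Phase.DELETED:
        raise ValueError("deleted data has no next phase")
    return NEXT[phase]

def writable(phase: Phase) -> bool:
    return phase is Phase.CURRENT_STATE
```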


 


How does establishing these retentions help Tektronix?


 


We do business in many countries around the world - 29 to be exact. Understanding regulations down to the country level is critical. We created a Central Retention Document that lays out our retention policies for every country and region around the world. For example, we need to keep 7 years of financial data in the US, whereas in China the requirement is 10 years. We keep track of changes to laws and regulations in this document.
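The Central Retention Document idea maps naturally onto a per-country lookup. In the sketch below, the US and China values come from the interview; the other entries and the helper function are made-up placeholders for illustration:

```python
from datetime import date

# Retention periods for financial data, in years. US and China per the
# interview; DE and TW are made-up placeholders for illustration.
RETENTION_YEARS = {"US": 7, "CN": 10, "DE": 10, "TW": 5}

def eligible_for_deletion(country: str, fiscal_year_end: date, today: date) -> bool:
    """True once a record has outlived its country-specific retention period.
    The flip side of the tenet above: keeping data past this point turns
    an asset into a liability."""
    years = RETENTION_YEARS[country]
    return today >= fiscal_year_end.replace(year=fiscal_year_end.year + years)

print(eligible_for_deletion("US", date(2001, 12, 31), date(2009, 4, 1)))  # True
print(eligible_for_deletion("CN", date(2001, 12, 31), date(2009, 4, 1)))  # False
```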


 


How do you enforce your retention policies?


 


All of the business owners must understand how retention impacts their areas and access needs. Ultimately, however, we needed a solution to help us enforce policies and manage data through the entire lifecycle. HP Database Archiving software was a great fit for us. HP's solution allows us to migrate database transactions to secondary archive databases as well as long-term XML data stores. The partnership has been a great fit for Tektronix and we're very pleased with the results.


 


To find out more about the Tektronix implementation and HP's solution register for the webcast replay (Tektronix Case Study Webinar).  You can post questions/comments to Lois or myself here as well.


 

Economic meltdown, e-discovery, and the need for grid-based, high-performance archiving

By:  Randy Serafini


It's virtually impossible today to discuss business with out the subject of e-discovery becoming a hot topic.  Particularly when one examines the current economic climate and how we got here, it will only be a matter of time before we see a new set of tighter, more rigorous, more comprehensive, more oversight rules, and more regulations the likes of Sarbanes-Oxley, SEC, and FRCP (Federal Rules of Civil Procedure) just to name a few.  And I suspect when the dust starts to settle, we will witness a wave of law suits the likes of which we've never seen before.


This onslaught of new or updated regulations will most likely demand that information (in this case, potential evidence) be automatically archived to a tamper-proof environment and protected for extended periods of time.  It wouldn't be too much of a stretch to imagine that even more content types, beyond email and files, will need to be preserved for periods longer than those currently specified.  In fact, due to the growing need to preserve and search (e-discovery) these volumes of information to meet regulatory and discovery demands, archiving is fast becoming a mission critical application. 


The HP Integrated Archive Platform offers a core technology based on the combination of the Universal Smartcell and a grid architecture, which can provide very linear performance as capacity, users, and numbers of objects are added. The Smartcell is a finite storage component that not only stores the object itself but also indexes the object’s content and metadata. The Smartcells leverage the IAP’s grid architecture, which provides the unique capability to distribute indexing and search across many Smartcells, eliminating bottlenecks. This approach results in superior performance that can scale to hundreds of thousands of users, billions of objects, and nearly half a petabyte of usable storage.
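The performance claim rests on a classic scatter-gather pattern: each cell indexes only its own slice of the archive, and a query fans out to all cells in parallel. A simplified Python sketch of the pattern (my own toy model, not HP’s actual implementation):

```python
from concurrent.futures import ThreadPoolExecutor

class Smartcell:
    """Toy cell: stores documents and searches only its own local index."""
    def __init__(self):
        self.docs = []

    def ingest(self, doc: str):
        self.docs.append(doc)          # a real cell also indexes content + metadata

    def search(self, term: str) -> list:
        return [d for d in self.docs if term in d]   # purely local search

def grid_search(cells: list, term: str) -> list:
    """Fan the query out to every cell in parallel and merge the results.
    Adding cells adds both capacity *and* search parallelism -- which is
    why performance can stay roughly linear as the grid grows."""
    with ThreadPoolExecutor(max_workers=len(cells)) as pool:
        partials = pool.map(lambda c: c.search(term), cells)
    return [hit for part in partials for hit in part]

# Round-robin ingest across a small grid:
grid = [Smartcell() for _ in range(4)]
for i, doc in enumerate(["q3 audit email", "invoice 42", "audit memo", "lunch plans"]):
    grid[i % len(grid)].ingest(doc)

print(grid_search(grid, "audit"))   # -> ['q3 audit email', 'audit memo']
```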


As many previous generations of archiving solutions have grown quite large, their search performance has degraded seriously, making fast response to e-discovery requests unnerving when time is of the essence and a deadline is looming. The IAP, with its integrated software and hardware grid-based archiving architecture, is designed to scale archive performance - particularly search performance - which means that compliance officers, general counsels, legal teams, and IT departments can respond to mounting compliance and e-discovery requests in a timely fashion.


Fast e-discovery response from a grid architecture with distributed indexing and search can provide faster case assessment and faster discovery to ensure deadlines are met and fines and sanctions avoided, and it takes a tremendous burden off of an already-taxed IT department that must support e-discovery needs.


For e-discovery, faster is better.  Faster at mitigating risks.  Faster at reducing costs.

Is your database data secure?

Corporate data theft makes huge headlines, like the TJ Maxx incident in which 45.7 million credit and debit card numbers were stolen, and in which about half a million customers also had their personal information (SSN, address, phone, etc.) stolen. That was a premeditated crime by outside hackers who went out of their way to breach security, including hacking through encrypted data over several months. While such events will always make the headlines, the threat from a lack of internal security policies and controls is by far the weakest link in your data security defense. Forrester estimates that 80% of security breaches come from insiders – employees and others with access from within the organization.

 

What kind of data is being managed in your enterprise databases? Employee personal data is typically stored in HR/payroll systems; customer data lives in billing, AR, and order management systems, just to name a few. IT staff have the responsibility of managing the infrastructure and in some cases have direct database-level access to perform system management functions. Database administrators (DBAs) in particular can be given ‘the keys to the kingdom’ if the right checks and balances are not in place.


Most of these internal security weaknesses are overcome by narrowly defining roles combined with the right controls and oversight.

 

What’s wrong with this picture? Most of the effort is focused on production systems and databases, while security can be very lax in test and development environments. In some cases developers and testers need sample data that exactly mirrors production data, and the easiest way to reproduce a production system is to clone (copy) the entire database. Passwords and access in dev/test systems tend to be much more open than in production. The scariest part is that most breaches here can go undetected. For example, what if someone can’t fight the temptation of looking up their manager’s salary or that of another employee? No one will ever know.

 

Non-production databases used for test, development, training, etc. require just as much oversight as their production counterparts (if not more). When clones of production environments are required, the best approach is to incorporate data subsetting and data masking into the database creation process. Subsetting reduces data volumes in a way that maintains data and application integrity while the sample still allows all required tests to be performed. Subsetting doesn’t sound like a security function, but it may be integral to your overall strategy: financial information or sales order data might be very sensitive, especially for public companies, and removing current-year transactions from non-production databases is a very valuable way to subset away potential breaches. Masking is the process of changing or substituting certain data values so that they become meaningless. In the example above, not only would the employee’s salary be masked, the employee’s personally identifiable information (PII) would be changed as well.
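A minimal sketch of subsetting and masking applied while building a test copy. The schema, the cutoff date, the masking rules, and the helper names are all hypothetical illustrations of the two techniques:

```python
import hashlib
from datetime import date

def mask_ssn(ssn: str) -> str:
    return "***-**-" + ssn[-4:]           # keep the format, hide the value

def mask_salary(emp_id: str) -> int:
    # Deterministic but meaningless value: tests stay repeatable,
    # yet nobody can look up a real salary.
    return 50_000 + int(hashlib.sha256(emp_id.encode()).hexdigest(), 16) % 50_000

def build_test_copy(employees, orders, cutoff=date(2009, 1, 1)):
    # Subsetting: drop current-year transactions entirely --
    # "subset away" the most sensitive financial data.
    old_orders = [o for o in orders if o["order_date"] < cutoff]

    # Masking: substitute PII and salary with meaningless values.
    safe_employees = [
        {**e, "ssn": mask_ssn(e["ssn"]), "salary": mask_salary(e["id"])}
        for e in employees
    ]
    return safe_employees, old_orders

employees = [{"id": "E100", "ssn": "123-45-6789", "salary": 95_000}]
orders = [{"order_id": 1, "order_date": date(2008, 6, 1)},
          {"order_id": 2, "order_date": date(2009, 3, 1)}]
print(build_test_copy(employees, orders))   # order 2 dropped; SSN and salary masked
```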

 

HP has the products and services to help you maintain your production and non-production databases as well as meet compliance requirements. Please check out HP Database Archiving on the Information Management Hub.


 

"Your mailbox is almost full" - Performance archiving and the infinite mailbox

by Randy Serafini


"Your mailbox is almost full" can become a thing of the past.  Studies show that nearly 65% of all enterprises still enforce mailbox quotas.   Resulting in significant productivity losses with users wasting enormous amounts of time managing their mailboxes just to stay operational.  With 85 million emails sent daily, and the 3,500,000 TB of email sent annually, its not going to get any easier.  However, with an enterprise archiving foundation in place many things become possible.


Performance archiving can seriously improve email performance and availability, reduce or defer infrastructure costs, and improve user productivity by creating the "infinite mailbox".


The concept is really quite simple.  Move 75-80% of the email stored on your email servers to a secure enterprise integrated archive platform, and then delete it from the email servers.  Removing all of this email cholesterol has many significant and immediate benefits:




  • Boost email performance... Remove large volumes of email off of the primary email servers and storage.

  • Faster performance can defer the cost of server upgrades... Customers have reported deferring CapEx upgrades by 12-18 months. (This is particularly nice with shrinking budgets in a tight economy!)

  • Recover expensive primary storage capacity... De-duplication efficiently stores email on the archive

  • Faster backup times and reduced costs... Expand your backup window and reduce the need for all those expensive backup tapes.

  • Faster recovery, migration, and upgrades... Thinking about upgrading to Exchange 2007?

  • End user productivity soars with the "infinite mailbox"... This is one of my personal favorites, and a favorite of our customers using this feature!

Using the many policies in the Selective Archiving module of HP Email Archiving software, along with the HP Integrated Archive Platform, email can be automatically migrated to the archive, where the body, metadata, and any attachments are fully indexed for future reference and two copies are securely stored with a unique digitally encrypted signature for retention management and tamper prevention.  The email is then deleted from the email server and replaced with a pointer, or "stub", that allows the user to easily and quickly retrieve the email from the IAP... without the need for IT intervention!


The latest feature released in Email Archiving software is the "Quota Archive Threshold", which sets upper and lower threshold limits for a mailbox.  This "set-and-forget" feature simplifies the administrator's job and virtually guarantees the user will never see a full mailbox again.  Archived email couldn't be easier or more intuitive to find and retrieve using the integrated or Web UI search.
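The threshold mechanics are simple enough to sketch. Below is a hedged Python illustration of how an upper/lower mailbox threshold policy might work; the mailbox and archive objects and their methods (size_mb, replace_with_stub, store) are hypothetical, not the product's actual interface or behavior:

```python
def enforce_quota(mailbox, archive, upper_mb=90, lower_mb=60):
    """Hypothetical 'quota archive threshold' policy: once a mailbox crosses
    the upper limit, archive-and-stub the oldest messages until it drops
    below the lower limit. Set and forget."""
    if mailbox.size_mb() <= upper_mb:
        return                                        # under quota: nothing to do
    for msg in sorted(mailbox.messages, key=lambda m: m.date):  # oldest first
        url = archive.store(msg)                      # safe copy into the archive first
        mailbox.replace_with_stub(msg, url)           # tiny stub stays behind
        if mailbox.size_mb() < lower_mb:
            break                                     # back under the lower threshold
```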


The key HP components for an end-to-end performance archiving solution are:




  • HP Integrated Archive Platform or IAP (factory integrated)

  • HP Email Archiving software - Selective Archiving module (Exchange or Domino)

  • HP Email Archiving Gateway (Exchange or Domino)

It's not too often that such a simple concept as archiving can have such a profound effect on how the enterprise manages the accessibility, cost, and risk of information.   Maybe that's why archiving is going to be the next big thing.
