Archive for the ‘government’ Category

GDPR – Top 10 Things That Organizations Must Do to Prepare

May 25, 2018 – that's probably the biggest day of the decade for the universe of data on the Internet. On that date, Europe's data protection law – the General Data Protection Regulation (GDPR) – becomes enforceable. The initial conversations around GDPR began in 2012, followed by lengthy negotiations that ultimately culminated in the final regulation. At the time of writing this guide (Sep 2017), most European businesses have either started making their first moves towards GDPR compliance or are all set to do so. Considering that GDPR is a stringent regulation with provisions for significant penalties and fines, it's obvious why it has become such an important topic for tech-powered businesses.

Now, every business uses technology to survive and thrive, and that's why GDPR is relevant to most businesses. For any business owner, entrepreneur, enterprise IT leader, or IT consultant, GDPR is as urgent as it is critical. However, it's much like the Y2K problem in that everybody is talking about it without really knowing much about it.

Most companies are finding it hard to understand the implications of GDPR and what they need to do to be compliant. All businesses handle customer data, and that makes them subject to Data Protection Act (DPA) regulations. If your business already complies with the DPA, the good news is that you have the most important bases covered. Of course, you will still need to understand GDPR, cover the missing bases, and stay safe, secure, reliable, and compliant in the data game. Here are the top 10 things businesses need to do to be ready for GDPR.

Top 10 things that organizations should do to prepare and comply with GDPR

1.      Learn, gain awareness

It is important to ensure that key people and decision makers in your organization are aware that the prevailing law is going to change to GDPR. A thorough impact analysis needs to be done, and any areas that could cause compliance issues under GDPR need to be identified. A good starting point is to examine your organization's risk register, if one exists. GDPR implementation can have significant resource implications, particularly at large and complex organizations; compliance could be a difficult ask if preparations are left until the last minute.

2.      Analyze information in hand

It is necessary to document what personal data you hold, where it came from, and who it is being shared with. You may need to organize an organization-wide information audit; in some cases, an audit of specific business areas will suffice.

The GDPR requires you to maintain records of all your data processing activities. It is also designed for a networked world: for instance, if you have shared incorrect personal data with another organization, you are required to inform that organization so that it can fix its own records. Doing so requires you to know what personal data you hold, where it came from, and who it is shared with. The GDPR's accountability principle also requires organizations to be able to demonstrate their compliance with the regulation's data protection principles.
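As an illustration, the record-keeping duty above can start with something as simple as a small, machine-readable register of processing activities. The sketch below is a minimal, hypothetical Python example; the field names and sample entry are illustrative, not mandated by the regulation.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingRecord:
    """One entry in a record of processing activities (illustrative fields)."""
    purpose: str                      # why the data is processed
    categories: list                  # kinds of personal data held
    source: str                       # where the data came from
    shared_with: list = field(default_factory=list)  # recipients, if any
    retention: str = "unspecified"    # how long the data is kept

register = [
    ProcessingRecord(
        purpose="order fulfilment",
        categories=["name", "postal address"],
        source="customer checkout form",
        shared_with=["courier partner"],
        retention="6 years",
    ),
]

# Knowing who data was shared with makes downstream corrections possible.
recipients = {r for rec in register for r in rec.shared_with}
```

Even a register this simple answers the three questions above: what you hold, where it came from, and who received it.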

3.      Privacy notices

It is important to review the privacy notices currently in place and put a plan in place for making any required changes before GDPR takes effect. When collecting personal data, you currently need to provide certain information, such as your identity and how you propose to use the data. This is generally done through a privacy notice.

The GDPR requires you to provide additional information in your privacy notices, such as the legal basis that permits processing the data and the retention periods for it. You are also required to explicitly state that people have a right to complain to the ICO if they believe there is a problem with the way their data is being handled. The GDPR requires privacy notices to be written in concise, clear, easy-to-understand language.

4.      Individual rights

You should review your procedures to confirm that they cover all the individual rights set forth in the GDPR. These are the rights provided by the GDPR:

  • The right to be informed
  • The right of access
  • The right to rectification
  • The right to erasure
  • The right to restrict processing
  • The right to data portability
  • The right to object
  • The right not to be subject to automated decision-making, including profiling

This is an excellent time to review your procedures and ensure that you will be able to handle various types of user requests related to their rights. The right to data portability is new with the GDPR. It applies:

  • To personal data provided by an individual;
  • When processing is based on individual consent or to perform a contract; and
  • Where processing is being done by automated methods.
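In practice, honoring a portability request usually means exporting the individual's data in a structured, commonly used, machine-readable format such as JSON. A minimal sketch follows; the record fields are invented for illustration:

```python
import json

def export_portable(record: dict) -> str:
    """Serialize a user's data to JSON, a structured, machine-readable format."""
    return json.dumps(record, indent=2, sort_keys=True)

# Hypothetical user record assembled from your own systems.
user_record = {
    "email": "",
    "preferences": {"newsletter": True},
    "orders": [{"id": 1001, "total": 24.50}],
}
print(export_portable(user_record))
```

The point is that another organization (or the individual) can re-ingest the export without reverse-engineering a proprietary format.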

5.      Subject access requests

You will need to plan how to handle requests in a manner compliant with the new rules, and update your procedures wherever needed.

  • In most cases, you will not be allowed to charge people for complying with a request
  • Instead of the current period of 40 days, you will have only a month to comply
  • You are permitted to charge for, or refuse, requests that are manifestly excessive or unfounded
  • If a request is refused, you are required to tell the individual why. You are also required to inform them that they have the right to complain to the supervisory authority and to seek a judicial remedy. This has to be done, at the very latest, within a month.
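The one-month deadline is calendar-based, so a small helper for computing it can be handy when tracking requests. This is an illustrative sketch, not legal advice; it clamps month-end dates (e.g. a request received on 31 January falls due on 28/29 February):

```python
from datetime import date

def sar_deadline(received: date) -> date:
    """Deadline one calendar month after receipt (clamped to month end)."""
    year = received.year + (received.month // 12)
    month = received.month % 12 + 1
    # Try the same day-of-month, falling back for shorter target months.
    for day in (received.day, 30, 29, 28):
        try:
            return date(year, month, day)
        except ValueError:
            continue

print(sar_deadline(date(2018, 1, 31)))  # 2018-02-28
```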

6.      Consent

It is important to review how you seek, record, and manage consent, and whether any changes are required. Consent must be freely given, specific, informed, and unambiguous. A positive opt-in is required: consent cannot be implied from silence, inactivity, or pre-ticked boxes. The request for consent must be separate from other terms and conditions, simple methods must be provided for individuals to withdraw consent, and consent must be verifiable. Existing DPA consents do not need to be refreshed for GDPR, provided they already meet the GDPR standard.
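A consent store that satisfies these points is, at minimum, explicit, purpose-specific, timestamped, and withdrawable. A hypothetical sketch of one such record:

```python
from datetime import datetime, timezone

def record_consent(user_id: str, purpose: str, opted_in: bool) -> dict:
    """Store an explicit, timestamped opt-in; never infer consent from silence."""
    if not opted_in:
        # No pre-ticked boxes: absence of an affirmative action is not consent.
        raise ValueError("Consent must be an affirmative action, not a default.")
    return {
        "user": user_id,
        "purpose": purpose,           # consent is specific to one purpose
        "granted_at": datetime.now(timezone.utc).isoformat(),
        "withdrawn_at": None,         # withdrawal must be just as easy to record
    }
```

The timestamp and purpose fields are what make the consent verifiable later.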

7.      Aspects related to children

You should start considering whether you need systems in place to verify individuals' ages and to obtain parental or guardian consent for data processing activities. The GDPR introduces specific consent requirements for children's personal data: if your company provides online services to children, you may need a parent's or guardian's consent to lawfully process their personal data. Under the GDPR, the minimum age at which a child can give consent to this sort of processing is 16; in the UK, this may be lowered to 13.

8.      Aspects related to data breaches

You should ensure that you have the right procedures in place to detect, investigate, and report personal data breaches. The GDPR imposes a duty on all companies to report certain types of data breach to the ICO and, in some situations, to the individuals affected. The ICO has to be notified of a breach if it is likely to put the rights and freedoms of individuals at risk, for example through damage to reputation, discrimination, financial loss, or loss of confidentiality. In such cases, you will usually also have to inform the affected individuals directly. Failure to report a breach can attract a fine on top of any fine for the breach itself.
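A rough way to encode this triage logic in an incident-response runbook is sketched below. It is deliberately simplified; the GDPR's actual tests are in Articles 33-34, which set a 72-hour target for notifying the supervisory authority where feasible:

```python
def breach_actions(risk_to_individuals: bool, high_risk: bool) -> list:
    """Simplified triage of GDPR breach-notification duties."""
    actions = ["log the breach internally"]  # all breaches must be documented
    if risk_to_individuals:
        # The supervisory authority must be told without undue delay,
        # within 72 hours where feasible (GDPR Art. 33).
        actions.append("notify supervisory authority within 72 hours")
    if high_risk:
        # High-risk breaches must also be reported to the people affected.
        actions.append("notify affected individuals directly")
    return actions
```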

9.      Requirements related to privacy by design

The GDPR turns privacy by design into a concrete legal requirement under the umbrella of “data protection by design and by default.” In some situations, it also makes Privacy Impact Assessments mandatory; the regulation refers to them as “Data Protection Impact Assessments” (DPIAs). A DPIA is required whenever data processing is likely to pose a high risk to individuals, such as when:

  • New technology is being put in place
  • A profiling action is happening that can significantly affect people
  • Processing is happening on a large scale
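That checklist can be captured in a trivial screening function, useful as a first-pass filter before commissioning a full assessment. The parameter names are illustrative:

```python
def dpia_required(new_technology: bool,
                  large_scale_profiling: bool,
                  large_scale_processing: bool) -> bool:
    """Screen a proposed activity: any high-risk trigger means a DPIA is needed."""
    return any((new_technology, large_scale_profiling, large_scale_processing))

# Example: a profiling project on a large dataset clearly triggers a DPIA.
print(dpia_required(False, True, True))  # True
```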

10.  Data protection officers

A specific individual needs to be designated to hold responsibility for data protection compliance. You must designate a data protection officer if:

  • You are a public authority (except courts acting in their judicial capacity)
  • You are an organization that carries out regular and systematic monitoring of individuals on a large scale
  • You are an organization that performs large-scale processing of special categories of data, such as health records or data on criminal convictions

Many of the GDPR's important principles are the same as those in the DPA; still, there are significant updates that companies will need to make in order to stay on the right side of the GDPR.

Author: Rahul Sharma




Right Data Storage For Government Organizations: Public vs. On-Premise


Largely fueled by the US Federal Cloud Computing Strategy, many government institutions and organizations have been gradually migrating to the cloud. The cloud has proven beneficial not only to businesses but also to government organizations as they seek to provide improved services to their citizens. The government has invested immensely as a result: according to IDC's projections, spending currently stands at $118.3 million on public cloud and $1.7 billion on private cloud.

By 2012, more than 27% of state and local government organizations had already adopted the cloud. After the Federal Data Center Consolidation Initiative (FDCCI) gained traction in 2011-2012, that number grew to more than 50% by 2014. Although it doesn't strictly limit organizations to the cloud, the initiative was enacted as a strategy to reduce government expenditure and improve the quality of services rendered to citizens through the consolidation of data centers. Scattered data warehouses were merged into single data centers, with some government agencies opting to move to the cloud and others relying on consolidated in-house data centers. As a result, a lot of underutilized real estate was freed up, expenditure was reduced, and, according to the MeriTalk report “The FDCCI Big Squeeze”, government IT staff and resources were reassigned to more critical tasks.

So what is the right data storage for government organizations: public cloud or on-premise? Although both approaches have proven beneficial, it's important to compare them critically. This post covers some of the key considerations in selecting the right data storage for government: security, service delivery, costs, and scalability. Government institutions are unique, and each will ultimately strategize most effectively based on its own data-storage plans.


Security

In an era where cyber warfare is an increasingly preferred method of engagement, government data centers face significant threats from both domestic and foreign hackers. One of the most recent major attacks was directed at the database. Initially attacked in 2013, the site fell victim to hackers again in July 2014, when they targeted a test server. Although no private data was lost, the attack shook Americans by exposing the federal system's vulnerabilities.
To prevent a recurrence, the government has been taking data security very seriously by leveraging solutions that strictly adhere to security compliance requirements. Most of the least sensitive data is transmitted through and stored in the cloud, managed by dedicated teams of third-party security experts. The most sensitive data, on the other hand, is best stored in in-house data centers, where access is strictly controlled by security personnel and regulated through anti-theft software. A good example is the expansive NSA data center in Utah, which is tightly secured and can only be accessed by high-ranking NSA officials.

Service Delivery

The type of services a government organization deals in significantly dictates the most effective delivery and storage infrastructure. Services like healthcare and tax filing, which are largely citizen-centered, are best delivered through the cloud, using websites and applications available to all citizens. Through this strategy, the government has managed to reduce queues at its central stations and facilitate remote access to government resources, even for citizens living abroad. Additionally, moving apps to the cloud has helped cut annual expenditure by 21% among the affected organizations.
Other government organizations deal with data and services that are agent-centered and open only to government agents. They can therefore conveniently use in-house solutions and avoid deploying sensitive apps and data to the cloud. The NSA again falls into this category, since it uses dedicated servers (as opposed to the cloud) to store and exclusively distribute most of its data among its agents.


Costs

Setup, deployment, operational, and maintenance costs are regarded as the most critical factors in evaluating the suitability of solutions for all types of organizations, business and government alike. On average, organizations use about 5-20% of the total lifecycle capital (depending on the scope of the respective data centers) to set up the necessary infrastructure: facilities, networks, devices, and servers. A further 50-80% of operating costs goes to maintenance, upgrades, real estate rates, and so on. On-premise implementations can therefore have very different cost profiles depending on their complexity and scale. The cost of cloud data storage, on the other hand, is roughly linear, since the logistics are indirectly outsourced to managed service providers. In many cases, expanding an on-premise solution can cost significantly less for an organization that has already set up a stable and comprehensive in-house data center.
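The difference between the two cost profiles is easy to see in a toy break-even model: heavy upfront capital plus lower recurring costs versus a flat annual fee. All figures below are hypothetical; real costs vary enormously with scale:

```python
def on_prem_cost(years: int, setup: float, annual_opex: float) -> float:
    """Upfront capital plus recurring operating expenses."""
    return setup + annual_opex * years

def cloud_cost(years: int, annual_fee: float) -> float:
    """Roughly linear: a recurring fee to the managed service provider."""
    return annual_fee * years

# Hypothetical figures: upfront-heavy on-premise vs pay-as-you-go cloud.
for y in range(1, 9):
    op = on_prem_cost(y, setup=500_000, annual_opex=80_000)
    cl = cloud_cost(y, annual_fee=150_000)
    print(f"year {y}: on-prem ${op:,.0f} vs cloud ${cl:,.0f}")
```

With these made-up numbers the cloud is cheaper for the first seven years and the on-premise investment only pays off from year eight; plugging in your own estimates shifts the break-even point.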


Scalability

Both in-house and cloud solutions are scalable, but scaling the former is a more complicated process. To upgrade or scale up in-house solutions, government organizations have to acquire all the requisite equipment, plus the labor to oversee and manage the entire process. They are also compelled to commit more real estate to expand their server rooms as the number of servers grows. This process can be time-consuming compared with the much simpler process of scaling up public cloud storage: public cloud servers are inherently elastic and can accommodate virtually unlimited growth.

Hybrid Storage Option

With 70% of organizations (including government organizations) reportedly using or evaluating hybrid solutions, the hybrid approach is widely regarded as the most effective and convenient option. It allows organizations to combine in-house and cloud storage and take advantage of both sets of benefits. To use it in your government organization, first evaluate all your resources against the benefits of both in-house and cloud data storage. Then use your findings to move only the services and applications best suited to the cloud, retaining the rest within your own data center for complete control.

Author: Davis Porter
Image Courtesy: jscreationzs

Top Cloud Security Trends for Government Organizations


According to a report by the RAND Corporation, the cyber black market is growing steadily: hackers are now more collaborative than ever and consistently use sophisticated strategies to target and infiltrate data centers. In the past, they were driven by sheer notoriety and malice, attacking data centers to prove their skills to their peers. That has gradually changed, and hackers are now driven by cyber-warfare agendas and a growing black market, where they sell valuable information to the highest bidders.

Their biggest prey, of course, is government data centers, which are particularly targeted by cyber armies with agendas against their respective target nations. In fact, governments now face more potentially damaging risks from cyber warfare than from conventional engagement: a single individual with just a computer could successfully launch an attack against major government cloud databases, cripple them, and cause significant socio-economic damage. One of the most notable attacks was directed at Iran's nuclear centrifuges, where the attackers used the “Stuxnet” worm to damage more than 20% of the installations. Under the cover of different agendas, an Iranian hacking group also recently went on a cyber-attacking spree dubbed “Operation Cleaver”, which ultimately damaged essential government infrastructure in more than 16 countries.

According to experts, this is only the beginning. In research conducted by the Pew Research Center, 61% of the experts surveyed believed that a well-planned, large-scale cyber-attack will be successfully orchestrated before 2025 and severely harm national security. With such threats looming, it is essential for the government to implement the most effective emerging security technologies in its cloud. Some of the current top trends include:

Improved Access Control

Many successful attacks sail through because of poor access controls at the targeted data centers. Although Sony is not a government corporation, its recent problems, which even drove the government to intervene, were caused largely by careless password and username practices. To prevent such attacks, government organizations are now opting for advanced authentication processes for access to their cloud resources. In addition to standard two-factor authentication, organizations are now implementing biometrics and secondary-device verification in their access control architecture.
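As one concrete example of a second factor, time-based one-time passwords (TOTP, RFC 6238) are a common building block in this kind of authentication: the server and the user's device share a secret and independently derive a short-lived code. A minimal stdlib-only sketch:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, step=30):
    """Minimal RFC 6238 time-based one-time password (6 digits)."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time() if at is None else at) // step
    msg = struct.pack(">Q", counter)             # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                   # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return f"{code % 1_000_000:06d}"

# RFC 6238 test secret ("12345678901234567890" in base32):
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(SECRET, at=59))  # 287082 (matches the RFC 6238 test vector)
```

In production you would use a vetted library and add clock-drift windows and replay protection rather than rolling your own.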

Sophisticated Encryption

To make data useless to hackers who infiltrate data centers or intercept it in transit, government organizations have long encrypted their data. Unfortunately, this has proven ineffective against hackers who steal decryption keys or use sophisticated cryptanalysis to recover the data. To prevent future recurrences, government organizations are stepping up their data-at-rest and data-in-transit encryption systems.

For years, they have used encryption systems in which both the cloud servers and the endpoint users hold encryption keys. This is gradually changing thanks to automated encryption controls that remove the user from the equation. Instead of distributing encryption keys to individual users, these systems use array-based encryption, which fragments the data during storage and transmission. The meaningless fragments are transmitted individually and can only be defragmented into meaningful data when the server or endpoint device holds all of them. An eavesdropper therefore captures only meaningless fragments.
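The fragmentation idea can be illustrated with a simple XOR secret-sharing scheme: each fragment on its own is statistically random noise, and only the complete set reconstructs the data. This is a toy sketch of the principle, not any specific vendor's implementation:

```python
import secrets
from functools import reduce

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def fragment(data: bytes, n: int) -> list:
    """Split data into n fragments; any n-1 of them reveal nothing."""
    pads = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    final = reduce(_xor, pads, data)   # data XOR all random pads
    return pads + [final]

def reassemble(fragments: list) -> bytes:
    """Only the full set of fragments defragments into the original data."""
    return reduce(_xor, fragments)

original = b"classified payload"
parts = fragment(original, 4)
assert reassemble(parts) == original
```

Real systems layer this kind of splitting on top of conventional encryption rather than using it alone.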

Digital Risk Officers

According to the Gartner Security and Risk Management Summit of 2014, 2015 will see a proliferation of digital risk officers (DROs). In addition to technology officers, enterprises and government organizations will now hire digital risk officers to assess potential risks and strategize on cloud and data security.

This has been necessitated by the continued expansion of the government's digital footprint, as organizations integrate their systems with employee BYOD to improve service delivery. As networks and infrastructure grow, so do the risks, which now require dedicated experts to keep them from developing into successful attacks. With the trend only picking up in 2015, Gartner predicts it will grow rapidly over the next few years as organizations' networks expand; by 2017, DRO adoption among government organizations is expected to reach 33%.

Multi-Layered Security Framework

Since cloud systems are composed of various elements facing different threats, government organizations are protecting their data through tiered, multi-layered security frameworks. To gain access to any of the systems, a hacker first has to get through a security model composed of several detection, resistance, defense, and tracking layers.

In addition to network firewalls, the government is deploying virus detection and anti-spyware on its servers and storage systems to comprehensively protect server operating systems, endpoint devices, file systems, databases, and applications.

Big Data Security Analytics

“Without big data analytics, companies are blind and deaf, wandering out onto the web like deer on a freeway,” as Geoffrey Moore, author of Crossing the Chasm, put it, emphasizing the need to implement big data analytics in all relevant departments of an organization, especially on the web.

Government organizations are adopting this principle directly by embedding big data security analytics in their security frameworks. This allows them to continuously monitor data movement and exchange, and to spot potential vulnerabilities that hackers and malware could capitalize on. Additionally, the data generated is comprehensively analyzed to gather intelligence on internal and external threats, data exchange patterns, and deviations from normal data handling. Owing to its efficacy in analyzing and blocking potential threats, Gartner predicts that 40% of organizations (both government and non-government) will establish such systems within the next five years.
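At its simplest, this kind of analytics is baseline-plus-deviation monitoring. The sketch below flags hourly request counts that deviate sharply from the mean; real systems use far richer features and models, and the threshold here is deliberately low for the tiny sample:

```python
import statistics

def flag_anomalies(counts, threshold=2.0):
    """Return indices of counts that deviate sharply from the baseline."""
    mean = statistics.mean(counts)
    stdev = statistics.stdev(counts)
    return [i for i, c in enumerate(counts)
            if stdev and abs(c - mean) / stdev > threshold]

# A sudden burst of requests stands out against normal traffic.
traffic = [120, 115, 130, 118, 125, 122, 900, 119]
print(flag_anomalies(traffic))  # [6]
```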

Although no strategy is absolute or perfect, these trends are expected to streamline the cloud sector and offer government organizations greater security than in previous years. Over time, this should significantly reduce the number of successful attacks on government cloud resources.

Author: Davis Porter
Image Courtesy: Stuart Miles

Why Government Institutions are Increasingly Switching To Private and Hybrid Cloud

The year 2010 proved revolutionary for government CIOs, as it saw the passing of the US Cloud First policy, which significantly affected federal agencies' IT frameworks. The policy was principally aimed at increasing flexibility and reducing IT costs by leveraging government-optimized cloud solutions. Since then, government CIOs have been comprehensively re-evaluating their IT infrastructures and budgets to include the cloud and thereby improve agency operations and service delivery.

Due to the limited capabilities and increased risk of the public cloud, most government institutions have since integrated the private cloud into their computing architectures to form stable and efficient hybrid cloud systems. A MeriTalk survey of more than 150 government IT professionals found that although private cloud is the preferred option, agencies still leverage the public cloud, particularly for public-facing operations; consequently, over 30% of them have merged the two into a hybrid cloud to optimize both agency and public-facing operations.

With about 56% of federal institutions currently drawing up plans to move to the cloud, the hybrid model remains the favorite for the following reasons:


Data Security

Annual reports from the Identity Theft Resource Center suggest that cloud data centers are consistently targeted and intermittently attacked by malware and hackers, making them the most threatened data centers. With recent data breach trends further substantiating this point, federal CIOs have grown wary of the public cloud. In another MeriTalk survey, 75% of them reported that they were hesitant to move to the cloud due to data security concerns. Additionally, 30% were restricted from migrating their data centers to the cloud by US data security compliance requirements on sensitive government data.

The most effective way to overcome these issues is the hybrid cloud: while some applications run in public cloud environments, the most sensitive ones are restricted to secure private cloud data centers. Compared with the public cloud, the private cloud is decidedly more secure because of controlled access and user-defined data protection strategies. The hybrid cloud therefore allows government institutions to reap the benefits of the public cloud while enjoying the data security of the private cloud.

Cost Efficiency

Cost efficiency is arguably the biggest watchword in most government organizations today. The government is under constant pressure from US citizens to reduce operational costs and account for every dollar spent within federal agencies. In response, the government enacted the Federal Data Center Consolidation Initiative to reduce IT expenditure by merging data centers through sustainable and effective cloud solutions, ultimately cutting costs on manpower, hardware, software, and underutilized real estate.

Of course, for most government institutions, migrating entirely to the public cloud would be detrimental to their operations. Therefore, to reduce costs while maintaining optimal agency processes, they have connected dedicated or private cloud resources to public components, ultimately reducing IT costs by 17%.


Flexibility

Government institutions are not static; they keep changing to adapt to demanding and dynamic societal needs. Some agencies grow until they are split into several units with defined responsibilities; others shrink and gradually become redundant until they are dissolved. Adapting to such fluid environments with standard in-house IT resources would require drastic infrastructural changes.

With the hybrid cloud, agencies can effortlessly adjust their public cloud resources to accommodate changes while maintaining sensitive information within their private infrastructures. If an agency is split, for instance, cloud resources can be divided accordingly between the resulting units while private in-house information is shared.

Service Delivery

The principal objective of federal agencies has always been public service. While some do this indirectly, most serve the public directly through citizen interaction. Service delivery in such institutions has long been a major problem due to the large disparity between the number of agents and the number of citizens. Over the years, many technological innovations have been implemented to overcome this problem by improving speed and efficiency, ultimately boosting service delivery.

Of all the IT solutions implemented, none has proven as effective as the private-plus-public hybrid cloud. While services are delivered simultaneously to different citizens through the public cloud, sensitive agency information and processes are supported by the private cloud. This has effectively offset the agent-to-citizen disparity and consequently boosted service delivery.

Asset Utilization

For optimal service delivery, government institutions have to effectively utilize close to 100% of their IT resources. A simple bug or hardware failure can cause a major setback, backlogging most of an agency's operations. A complete system failure is even more detrimental, since it could cripple service delivery and trigger immense socio-economic losses. A case in point is the breaches that occurred in 2013 and 2014, which partly crippled the US federal healthcare data systems and deterred efficient service delivery.

For many years, such system failures were hardly manageable, since fallback plans consisted mainly of analogue paperwork systems. With the hybrid cloud, however, optimal asset utilization allows federal institutions to switch between different cloud resources according to their availability, efficacy, and suitability for handling different processes. If in-house software fails, for instance, an agency can temporarily shift to public cloud resources and continue operating while the affected resources are repaired. Hybrid cloud systems are therefore widely considered in the implementation of strategic disaster recovery and data loss management frameworks.

Due to these significant benefits, the government cloud adoption curve is steepest for private and hybrid cloud, with more institutions expected to adopt them over the next few years. Although many are considering trying out the public cloud first before linking it to their private servers, the growth of hybrid cloud among government and large corporations is expected to reach 50% by 2017.

FileCloud is a leading private cloud solution that can substitute for the public cloud or augment it to create a hybrid solution. Here is an example where FileCloud helped the Indian Government launch its first private cloud solution.

Author: Davis Porter

Image courtesy: Stuart Miles

Take a tour of FileCloud

HIPAA 101 – An introduction to HIPAA 



HIPAA, the Health Insurance Portability and Accountability Act, was first introduced in 1996 and required the U.S. Department of Health and Human Services (HHS) to create specific regulations to protect the security and privacy of health information. To fulfill this requirement, the HHS published the HIPAA Security Rule and the HIPAA Privacy Rule. The Privacy Rule, formally the Standards for Privacy of Individually Identifiable Health Information, establishes national standards for protecting health information. The Security Rule addresses the technical and non-technical safeguards that covered entities must have in place to keep individuals' electronic protected health information (e-PHI) secure.

Within the HHS, the Office for Civil Rights (OCR) is responsible for enforcing the Privacy and Security Rules through voluntary compliance activities and civil penalties. Before HIPAA was introduced, there was no widely accepted set of general requirements or security standards for protecting health information in the health care industry. Meanwhile, new technologies have continued to evolve, and the health care industry has moved away from paper processes to rely more heavily on electronic information systems to answer eligibility questions, pay claims, and conduct various other clinical and administrative functions.

HIPAA today

Currently, providers are using clinical applications such as electronic health records, computerized physician order entry, and electronic pharmacy, laboratory, and radiology systems. Health plans increasingly provide online access to care management and claims, as well as various self-service options, meaning that the medical workforce has become more efficient and mobile. However, this rising online adoption has also increased the potential security risks.

One of the primary goals of the Security Rule is to ensure that individuals' private health information remains secure while allowing covered entities to adopt new technologies and improve patient care. Because the healthcare marketplace is so vast and diverse, the Security Rule had to be versatile and flexible enough to let covered entities adopt policies, technologies, and procedures appropriate to their size and organizational structure. At the same time, it must not stifle innovations that help the industry while keeping patients' electronic health information private.

Like the other Administrative Simplification rules, the Security Rule applies to health plans, health care clearinghouses, and health care providers who transmit health information in electronic form in connection with a transaction for which standards have been adopted under HIPAA.

The Information that Is Protected

The HIPAA Privacy Rule protects the privacy of individual health information, known as protected health information (PHI). The Security Rule, on the other hand, protects the subset of that information covered by the Privacy Rule: any individually identifiable health information a covered entity creates, receives, maintains, or transmits in electronic form.

The security rule means that any covered entity must maintain the appropriate technical, physical and administrative safeguards established for protecting personal information. Covered entities need to ensure that all of the e-PHI they create, maintain, transmit or receive is confidential, and maintains its integrity. They must also take steps to identify potential threats to the security of that information, and protect it against problems.

The Security Rule and Confidentiality

According to the Security Rule, confidentiality means that e-PHI is not made available or disclosed to people who are not authorized to view it. The confidentiality requirements of the Security Rule directly support the Privacy Rule's prohibitions on improper use and disclosure of personal healthcare information. The Security Rule also promotes the further goals of maintaining the integrity and availability of e-PHI. Under the Security Rule, integrity means that electronic personal healthcare information cannot be destroyed or altered without authorization, and availability means that e-PHI is usable and accessible on demand by any authorized person.

One important thing to remember about the HHS is that it recognizes covered entities range from incredibly small providers to large nationwide health plans. The Security Rule is therefore scalable and flexible, so covered entities can assess their own needs and create solutions appropriate to their particular environments. The rule does not dictate exact measures, but it forces the entity to consider certain key factors, including:

  • The cost of security measures
  • The complexity, capability and size of the entity
  • The possible risk or impact to e-PHI
  • The software, hardware, or technical infrastructure.

A self-hosted cloud such as FileCloud can help organizations in the health care industry meet HIPAA standards. Here is a great example of how FileCloud helped Precyse provide cloud features while meeting HIPAA standards.

 Author: Rahul Sharma

image courtesy: Stuart Miles,

Why Government Should Focus On e-Discovery Tools


Would it surprise you to know that 93 percent of all documents in the world are created electronically, of which 70 percent never even go through a hard copy conversion? This fun fact is not from some 2015 survey, but from a study conducted in 2013!
Yes, the age of digital information dawned on us a long time ago, but we have barely scratched the surface of potential for empowerment from technology. Information harvesting is obviously the next great frontier for the tech industry, which is why the field of electronic discovery (e-discovery) has gained quite a bit of prominence over the past few years.

What is e-Discovery?
E-discovery essentially covers the searching, locating, and securing of data from a digital archive with the intent of using it as evidence to build a strong civil or criminal legal case by the government. It can be carried out offline or through a specific network. This court-ordered hacking practice can help governments stay one step ahead of cybercriminals and bust complex illegal operations through the intelligent analysis of data.

How Governments Can Benefit From e-Discovery
For one thing, no one would ever choose a tower of stuffed paperwork over a laptop. The manual scrutiny and resources involved in managing paperwork are simply ludicrous compared to digital data. Furthermore, digital files can be recovered and un-deleted once they have made their way into a network and onto multiple hard drives. E-discovery tools allow you to parse through all kinds of data for examination in civil or criminal litigation, including:

  • E-mail
  • Images
  • Calendar files
  • Databases
  • Spreadsheets
  • Audio files
  • Animation
  • Web sites
  • Computer programs
  • Malware

You may have heard the term 'cyberforensics' being thrown around casually on a CSI episode at some point. What most people are unaware of is that it is merely a specialized version of e-discovery, used to create a digital copy of a suspect's computer hard drive for extraction of evidence while preserving the original hardware at a secure location.
(Image: the e-Discovery Reference Model)

The necessity of such instruments has been acknowledged by governments worldwide, which have consistently passed amendments and new bills to meet the challenges of administration in the 21st century. The defense and prosecution departments in particular have helped the e-discovery market blossom to $1.8 billion in 2014, and it is expected to grow to $3.1 billion by 2018.

Professor Richard Susskind, known as "Moses to the Modern Law Firm" in his circles, noted in his law firm technology predictions that many legal firms will move their data and processing to the cloud, and e-discovery tools can add fuel to the fire by revolutionizing the process of investigation.

Although e-discovery tools are unquestionably a gamechanger for litigation cost management, there are a variety of legal, constitutional, political, and personal roadblocks that can challenge the efficacy of electronically stored information.

In the seventh annual benchmarking study of e-discovery practices for government agencies conducted by Deloitte, it was found that nearly 75 percent of respondents felt that they lacked adequate technical support when dealing with opposing counsel.

Handling, processing, reviewing, or producing electronically stored information in compliance with the Federal Rules of Civil Procedure is a challenge that gives even the most tech-savvy legal teams sleepless nights.

How the Future of e-Discovery is Only Going to Get Brighter

While there are no straightforward solutions to these tribulations, agencies do not have to resign themselves to these limitations.
Cutting-edge big data technology is booming, and dedicating resources to integrating better search and review technologies, visualization tools, extensive ESI harvesting mechanisms, and the like can address these issues rather competently. Let us take a look at some of the technologies that can help take e-discovery to the next level:

Predictive Coding

While near-duplication detection, clustering, and email threading help organize the massive volume of data extracted from the documents under review, they do not offer much clarity in communicating how relevant each document is to the user's needs.
Predictive coding helps pinpoint responsive electronically stored information more accurately by rating documents on their degree of conceptual and terminological similarity, a far more robust analytical technique than the other options.

Predictive coding can also be used to classify important documents vital in establishing claims or defenses.  This helps in the constant refinement of the search and review process to pass more strategic information to users.
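The ranking idea behind predictive coding can be approximated with a simple term-frequency cosine similarity. The sketch below is a minimal, hypothetical illustration (not any vendor's actual algorithm): a human-reviewed seed document is compared against unreviewed candidates, and the most conceptually similar ones surface first.

```python
import math
from collections import Counter

def vectorize(text):
    # Term-frequency vector over lowercase word tokens
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    # Cosine of the angle between two sparse term-frequency vectors
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def rank_documents(seed_text, documents):
    # Score each unreviewed candidate against a document a reviewer marked relevant
    seed = vectorize(seed_text)
    scored = [(cosine_similarity(seed, vectorize(d)), d) for d in documents]
    return sorted(scored, reverse=True)

seed = "contract breach damages settlement negotiation"
docs = [
    "settlement negotiation over the contract breach and damages owed",
    "quarterly cafeteria menu and parking updates",
]
ranking = rank_documents(seed, docs)
```

Real predictive coding systems layer supervised learning and iterative reviewer feedback on top of this kind of similarity scoring, but the core intuition of surfacing conceptually similar documents first is the same.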

Data Visualization

Infographics and other forms of visualized content are special because they make an impact on our understanding by stimulating our natural instincts of patternicity (the human tendency to identify patterns).

Visualization tools utilize machine learning and analytics to identify anomalies and build a relationship web with the information available to help review teams locate privileged materials more easily.

Effective Employee Training

Government agencies must maintain the highest standards of data investigation and processing to ensure their case is not flawed with irregularities or irrelevant information. By hiring a skilled review team, agencies can enforce much needed quality checks to guarantee the fulfillment of review goals and streamline workflows by eliminating any redundancies in this process.

With increased pressure from sequestration cuts and an ever-expanding pile of pending cases to tackle, e-discovery technology is clearly the way forward. What matters now is how fast agencies can train their staff to catch up with the technological boom and develop the necessary skill sets to harvest precious insights from an ocean of information.

Author: Prashant Bajpai

Image courtesy: Stuart Miles/

The Ultimate Guide to HIPAA Compliance

With governments ramping up their efforts to clamp down on unscrupulously liberal data industry practices to combat the growing risks of identity theft and privacy violations, it is safe to say the digital information industry is not the Wild, Wild West anymore.

Data governance in the healthcare industry is one of the areas that has witnessed a lot of scrutiny and re-evaluation of policies over the past few years, as organizations scrambled to comply with HIPAA standards and ensure the security of protected health information (PHI).

But here’s a question – why does the federal Health Insurance Portability and Accountability Act still give nightmares to people despite being launched in 1996?

To understand this, you must first consider how the advent of advanced digital data storage and distribution platforms has raised the stakes for the healthcare industry to manage sensitive healthcare information, facilitate insurance cases, and control administrative costs like never before. Healthcare institutions have never faced such tremendous pressure to keep up with technology while simultaneously dealing with complex, unforgiving regulations.

ePHI is merely PHI that is stored or transmitted electronically (e.g., via email, text message, web site, database, online document storage, electronic fax, etc.).

According to NIST guidelines, an individual or team in the organization must be tasked with the responsibility for ensuring HIPAA compliance and accepting business risk.

Once a clear business owner is established, a cross-disciplinary group including the technical, legal, and HR departments must work in tandem to ensure that policies are appropriately defined and correctly implemented to fulfill the objectives of the data governance strategy.

According to the HHS, no authority can grant any kind of certification to prove your organization is HIPAA compliant. HIPAA compliance is an ever-evolving process for organizations instead of a one-time landmark event, which is why deploying a tool to aggregate compliance metrics for tracking your documentation storage and sharing policies can come in handy.

Given the vital importance of protected health information (PHI) and HIPAA compliance, let us dig a little deeper and highlight the four key rules you will need to address:

  1. HIPAA Privacy Rule
  2. HIPAA Security Rule
  3. HIPAA Enforcement Rule
  4. HIPAA Breach Notification Rule

The general guidelines of the HIPAA Security Standard demonstrate a technology-neutral approach, which means that there are no biased leanings towards certain technological systems or cloud services as long as the data protection regulations are being addressed seriously. However, the HIPAA language does introduce two classes of policy standards for the consideration of data management teams:

  1. Required (R) – mandatory compliance standards
  2. Addressable (A) – should be implemented unless detailed risk assessments can prove that their implementation is not required or favorable to their business setting.

HIPAA Administrative Requirements

Addressable:

  • Employee Oversight: (A) Employ procedures to permit and revoke access, and to administer employees working with protected health information.
  • ePHI Access: (A) Implement procedures for providing employee access to PHI, and document all services that grant access to ePHI.
  • Security Reminders: (A) Periodically provide security and privacy policy updates and reminders to employees.
  • Protection against Malware: (A) Implement procedures to detect and safeguard against malware.
  • Login Monitoring: (A) Report all system logins and track any discrepancies observed.
  • Password Management: (A) Create secure and easily accessible protocols for password modification and recovery.
  • Contingency Plan Updates and Analysis: (A) Establish periodic testing routines and regular examination of contingency plans.

Required:

  • Risk Analysis: (R) Execute a risk analysis program to evaluate the key areas of storage and utilization of PHI and identify vulnerable points in the system.
  • Risk Management: (R) Implement measures adequate to mitigate risks to a suitable level.
  • Sanction Policy: (R) Put into practice sanction policies for employees who violate compliance standards.
  • Information Systems Activity Reviews: (R) Regularly examine system activity and logs.
  • Officers: (R) Appoint HIPAA Security and Privacy Officers.
  • Multiple Organizations: (R) Ensure unauthorized access by third-party or parent organizations is warded off.
  • Response and Reporting: (R) Track and document all security-related incidents and actions.
  • Contingency Plans: (R) Create accessible PHI backups for easy restoration of stolen or lost data.
  • Emergency Mode: (R) Establish contingency protocols to guarantee that the workflow of critical business processes remains unaffected.
  • Evaluations: (R) Carry out periodic evaluations to ensure your data governance complies with all the latest laws and regulations.
  • Business Associate Agreements: (R) Establish special Omnibus-compliant contracts with business partners who can access your facility's PHI in order to guarantee their compliance.


HIPAA Physical Requirements

Addressable:

  • Contingency Operations: (A) Implement procedures that provide facility access in support of restoring lost data under your organization's disaster recovery and emergency plan.
  • Facility Security: (A) Implement policies and procedures to protect all ePHI stored in the facility.
  • Access Control and Validation: (A) Initiate measures to control and validate individual access based on role, and control access to testing software.
  • Media Movement: (A) Log all movements of hardware related to ePHI storage.

Required:

  • Workstations: (R) Apply policies to control the configuration of software on systems, and safeguard all workstations by restricting access to authorized users.
  • Devices and Media Disposal and Re-use: (R) Take measures for the secure disposal and reuse of all media and devices handling ePHI.


HIPAA Technical Requirements

Addressable:

  • Automatic Logoff: (A) Establish automatic termination of electronic sessions after a fixed period of inactivity.
  • Encryption and Decryption: (A) Set up mechanisms to encrypt and decrypt ePHI as and when necessary.
  • ePHI Integrity: (A) Apply policies to safeguard ePHI from unsanctioned alteration or destruction.
  • Transmission Security: (A) Execute technical security measures to prevent unlawful access to sensitive ePHI being transmitted over a network.

Required:

  • Unique Tracking: (R) Allocate a unique name/number for identifying and tracking user identity.
  • Audit Controls: (R) Employ hardware and software mechanisms to track and record all activity pertaining to the use of ePHI.
  • Authentication: (R) Implement protocols to verify the identity of all personnel seeking access to ePHI.
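Among the technical safeguards, Automatic Logoff is typically implemented as a session-inactivity timeout. Below is a minimal sketch of that idea; the 15-minute window is an assumed policy value, since HIPAA does not prescribe a specific duration.

```python
import time

SESSION_TIMEOUT_SECONDS = 15 * 60  # assumed policy value; HIPAA does not mandate a number

class Session:
    def __init__(self, user_id, now=None):
        self.user_id = user_id
        self.last_activity = now if now is not None else time.time()

    def touch(self, now=None):
        # Record activity so the inactivity clock restarts
        self.last_activity = now if now is not None else time.time()

    def is_expired(self, now=None):
        # Automatic logoff: the session expires once the idle period is exceeded
        now = now if now is not None else time.time()
        return (now - self.last_activity) > SESSION_TIMEOUT_SECONDS

session = Session("clinician-42", now=0)
expired_early = session.is_expired(now=600)   # 10 minutes idle: still active
expired_late = session.is_expired(now=1200)   # 20 minutes idle: logged off
```

In a real deployment the expiry check would run in the application's request middleware, forcing re-authentication before any further access to ePHI.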


Click here to explore how an on-premise cloud such as FileCloud helps you comply with various data governance regulations.

Author: Prashant Bajpai

image courtesy: Stuart Miles/

How to Pick the Right Storage Solutions for Government Agencies

Government Data Storage

Apart from businesses and corporations, government agencies are major consumers of storage solutions nationwide. Before the Federal Data Center Consolidation Initiative (FDCCI), government agencies largely relied on in-house storage systems that comprised thousands of data storage drives in unconsolidated warehouses. To establish new branches, government agencies had to build new data centers to service them. Ultimately, hundreds of data centers were distributed all over the country, wasting taxpayers' money that would otherwise have been used to improve critical services.

After the president insisted on the need to spend every federal dollar productively, the FDCCI was enacted to shrink the data center footprint and reduce expenditure on underutilized real estate, software, hardware, and manpower. Since then, government agencies have been gradually adopting transformational storage solutions like the cloud to consolidate their data and improve their operations and overall efficiency. According to a MeriTalk report titled "The FDCCI Big Squeeze", more than 60% of government IT managers can now comfortably assign their staff other, more critical tasks thanks to the consolidation initiative. The report further indicates that 57% of government agency managers are saving on energy costs and redirecting those funds to other critical operations. This directive has therefore largely revolutionized government storage systems.

Although the directive applies to all government agencies, the precise storage solutions depend on the needs of the respective agencies. The storage principles may be similar, but the framework and architecture differ between agencies. That's why it's imperative for government agencies to understand how to evaluate their respective needs and pick the right storage solutions. Here are the fundamental points to consider:

Type of Data

Although it's fundamentally all a system of 1s and 0s, not all data is equal. Some data, for instance, may be highly confidential, while other data may be open to public access. Additionally, some agencies handle fairly huge amounts of data on a regular basis, while others are confined to small data sets that correspond to their small-scale operations.

The answers to following questions should be helpful in evaluating the type of data your agency deals with:

  • Is it public or confidential?
  • How secure should the data be?
  • How long do you need to retain the data?
  • What type of access is required?
  • Is it large or small scale?


The type of storage system you choose for your government agency should be convenient and suitable for your type of data. The NSA, for instance, uses a fairly expansive, consolidated data center built in Utah, reportedly designed to handle confidential data volumes on the order of zettabytes or more. Smaller government agencies, on the other hand, use small-scale data centers optimized for their operations.


Scalability

Data is not static: it keeps growing and changing. Similarly, government agencies keep expanding and shifting into new areas to boost their operations. Since it's considerably expensive to purchase new hardware to accommodate growth, it's advisable to use a storage solution that's flexible enough to accommodate steady expansion.

Government agencies that have already consolidated their data in the cloud are already enjoying its scalability and increased performance. They can efficaciously upload and process data, then scale up the storage space and performance according to their fluid requirements.

Compliance Levels

To optimize service delivery, the government has implemented various compliance levels in its departments, ranging from data handling to inter- and intra-departmental communication. Some agencies, such as those handling security and finance, are more regulated than others because of the sensitivity of the data they handle.

To ensure complete adherence, you should critically assess the applicable compliance levels and compare them against your storage options. If you are outsourcing data to a third-party cloud service provider, for instance, ensure that the provider complies with the respective compliance levels and holds the relevant credentials.


Accessibility

Due to the sensitive nature of governmental data, accessibility is often controlled, with the most confidential data accessible only to individuals with the highest credentials. The 1-million-square-foot NSA data center in Utah, for instance, is secured by high perimeter walls and is only open to high-ranking NSA officials. The rest of the agents access data according to their clearance levels through the NSA network that links the database to remote computers.

Accessibility should therefore be a primary concern as you consider the right storage solution. How many people will access the data? Which framework will be implemented to control access according to the credentials? Is the data accessible to third party storage service providers?

Although access control is most critical to sensitive data, all governmental data access should be systematically regulated to avoid data loss, and any other perils which may arise due to unauthorized access and distribution.
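The clearance-based access described above reduces to a simple dominance check: a user may read data only if their clearance meets or exceeds the data's classification. The level names and ordering below are illustrative assumptions, not any agency's actual scheme.

```python
# Hypothetical classification ladder, lowest to highest
CLEARANCE = {"public": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_access(user_clearance, data_classification):
    # Grant access only when the user's clearance level dominates
    # the classification level of the requested data
    return CLEARANCE[user_clearance] >= CLEARANCE[data_classification]

analyst_ok = can_access("secret", "confidential")        # higher clearance than the data
analyst_denied = can_access("confidential", "top_secret")  # data outranks the user
```

Real systems add compartments and need-to-know checks on top of this ladder, but the "no read up" comparison is the core of the model.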


Security

According to the Global Information Assurance Certification, cyber warfare is rapidly developing into the primary combat tactic of this century. The United States increasingly faces cyber threats, with a majority of attackers targeting governmental organizations. Infiltrating these organizations can leak United States secrets, as was seen with the WikiLeaks diplomatic crisis, which put the State Department under immense pressure after thousands of confidential cables were leaked. Additionally, attackers seek to cripple the organizations and cut off some of the basic services delivered to American citizens.

The best and most effectual strategy for protecting your organization from such attackers is to store all data in servers protected by multiple levels of security. Such protections comprise up-to-date firewalls, anti-malware, anti-virus, and anti-hacking systems that can detect and repel all types of cyber-attacks. Additionally, physical protection should be provided to prevent physical access to the respective data centers and servers.

As you assess the various critical elements to determine the right storage solution, it's advisable to engage experts who can advise you further on implementation. You should also closely analyze the storage systems successfully implemented by other governmental organizations to get a vivid picture of exactly what you need. Finally, ensure that all data storage systems are consolidated and virtualized so that they are remotely accessible to all relevant stakeholders.

In addition to picking the right storage, governments need a good on-premise cloud solution. Many government organizations, from NASA to the Government of India, trust FileCloud to run their on-premise cloud. Here are some case studies:

Author: Davis Porter
Image Courtesy: Vichaya Kiatying-Angsulee/

A Dummy’s Guide to EU Data Protection Laws

EU Data Protection

As the digital landscape evolves at a breakneck pace, we know now that technology laws such as Moore's law are living up to their expectations. The question is: are our civil and criminal laws really up to the challenge of safeguarding the integrity and privacy of data in this domain?

If the numerous corporate hacking cases and government spying scandals are anything to go by, governments and enterprises across the world are beginning to wake up to the reality of instituting a uniform digital data protection framework as the only way to effectively deal with these problems.

A Ponemon Institute study estimated that the average cost per lost or stolen record in a data breach is $188, and the average cost of an organizational data breach is around $5.4 million. More concerning, however, was the lack of data protection accountability measures being taken, which leaves a dangerous backdoor open for potentially catastrophic data theft or loss incidents in the future.

In light of this, the European Commission undertook the task of overhauling EU data protection rules to help secure the personal data of citizens and corporations based in European Union territories.

In response to mass surveillance cases (infamous NSA surveillance), the Civil Liberties Committee reiterated the need for stronger safeguards for data transfers to non-EU countries along with a series of strict regulations to promote secure data storage and transfer practices.

This revitalizing directive finally takes into account significant technological developments in social networking and cloud computing platforms to determine the right data protection and privacy policies.

The European Commission’s aim to create a pan-European law to reinforce the old patchwork of national laws is a great initiative for organizations since they do not have to face the headache of complying with the regulations set forth by multiple authorities.

On the other hand, many organizations take a lackluster approach to handling data breach incidents. They refuse to invest significant resources in devising a long-term data protection program, and tip their hats in contempt at the Information Commissioner's Office (ICO), believing they will get away with a minuscule fine.

EU data protection regulations are a big step forward in the field of compliance and risk management because they put the escapist culture of data security on trial and define the vitality and repercussions of risk ownership like never before. They rightly shift the spotlight from infrastructure-exclusive security to a more proactive, data-centric strategy.

How the GDPR (General Data Protection Regulation) Will Shake Up Data Industry Practices

In a world with increased decentralization of resources owing to the rise in private, public, and hybrid cloud computing players, the General Data Protection Regulation (GDPR) plans to unify data protection within the EU with a single law, and put the onus of security on everyone responsible for the management of the data cycle.

However, a recent survey conducted by FireEye on readiness for cybersecurity regulations in Europe indicated that only 39 percent of respondents were confident they had optimized their data protection to meet all requirements. A quarter of respondents said that investment in new hardware and software infrastructure to fulfill the GDPR's demands was the biggest challenge ahead of them.


Data-protection governance is the need of the hour: it drives businesses to take charge of their data policies, risk assessments, and control requirements, elevating performance standards and bringing in some much-needed accountability for their decision making. The GDPR is just a way to fast-track this change.

Now that you know about the necessity of this game-changing policy directive, it’s time to examine how the European Data Protection Regulation plans to shake things up in the information industry:


Coverage Scope

The GDPR covers all data controllers and data subjects based in the EU. It also applies to organizations based outside the EU that process the personal data of its residents.

According to the EC, the definition of personal data covers anything that points to an individual's professional or personal life, including names, photos, email IDs, bank details, social networking posts, medical information, and computer IP addresses.

A single Data Protection Authority (DPA) will be assigned to each company, depending on where the company is located, and will report to the European Data Protection Board. In addition, Data Protection Officers must be appointed by all public authorities and by companies that process the data of more than 5,000 data subjects in any 12-month period.


Although previous data processing notice requirements remain intact, organizations must also specify the retention time for personal data and provide their contact information to customers. The Privacy by Design and Privacy by Default clauses in Article 23 mandate that data protection protocols must be integrated into the business development process itself. All privacy settings must be set to high by default.

Data Protection Impact Assessments (Article 33) have to be conducted when specific risks occur to the rights and freedoms of data subjects.

Proof of Consent

Article 7 and Article 8 specify that data controllers must possess a valid proof of consent for processing data and acquire special permissions for collecting the data of children under 13 from their legal guardians.

Instant Breach Alerts

Article 32 says that any case of data breach must be reported to the DPA by the controller within 72 hours of discovering the issue so that all parties involved can be warned about the situation and take precautionary measures.
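The 72-hour window is a hard deadline from the moment of discovery, which makes it natural to compute and track programmatically. A minimal sketch of that deadline arithmetic (illustrative only, not legal advice):

```python
from datetime import datetime, timedelta

# Article 32 (draft GDPR): notify the supervisory authority within 72 hours
BREACH_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(discovered_at):
    # The clock starts when the controller discovers the breach
    return discovered_at + BREACH_NOTIFICATION_WINDOW

def is_notification_late(discovered_at, notified_at):
    return notified_at > notification_deadline(discovered_at)

discovered = datetime(2017, 9, 1, 9, 0)
on_time = is_notification_late(discovered, datetime(2017, 9, 3, 9, 0))  # 48 hours later
late = is_notification_late(discovered, datetime(2017, 9, 5, 9, 0))     # 96 hours later
```

An incident-response tool built on this check could raise escalating alerts as the deadline approaches, rather than merely flagging a missed one after the fact.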

Severe Sanctions

First unintentional instances of non-compliance will draw a written warning from the DPA, and organizations will also be directed to conduct regular data protection audits. For graver offenses, organizations may have to cough up a fine of up to EUR 1,000,000 or, in the case of an enterprise, up to 2% of annual worldwide turnover, whichever is greater (Article 79).
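The "whichever is greater" cap in Article 79 is a simple maximum of two quantities. A small sketch makes the arithmetic concrete (the turnover figures are invented examples):

```python
def maximum_fine_eur(annual_worldwide_turnover_eur):
    # Draft Article 79: up to EUR 1,000,000 or up to 2% of annual worldwide
    # turnover, whichever is greater (integer arithmetic avoids float rounding)
    return max(1_000_000, annual_worldwide_turnover_eur * 2 // 100)

small_company_cap = maximum_fine_eur(10_000_000)    # 2% is 200,000, so the 1M floor applies
large_company_cap = maximum_fine_eur(500_000_000)   # 2% is 10,000,000, which exceeds the floor
```

Note that these are the ceilings the DPA may impose, not automatic penalties; actual fines depend on the gravity and circumstances of the offense.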

Right to Erasure

Article 17 empowers data subjects by giving them the right to request removal of personal data related to them on any one of a number of grounds, including cases where the fundamental rights of the data subject take precedence over the data controller’s interests and require protection.

Portability of Data

According to Article 15, users will also be allowed to request a copy of personal data being processed so that they have the freedom to transmit it to another processing system if needed.

On-premise private cloud solutions such as FileCloud help organizations keep their data in servers within their own firewall, while providing all the flexibility and access advantages of a public cloud such as Dropbox. Additionally, FileCloud offers unique capabilities to comply with EU regulations, plus features to monitor, prevent, and fix any data leakage across devices (laptops, desktops, smartphones, and tablets). Learn more at

 Author: Prashant Bajpai

Image Courtesy  Stuart Miles,

Why Governments Should Focus On Mobile Device Management


The meteoric rise of BYOD is not just a run-of-the-mill technological reality check that organizations have to bear once in a while, but rather a gateway that promises to bring seamless connectivity to data sharing. It is undeniable that smartphones and tablets have emerged as the building blocks to a new form of business intelligence that can be unleashed with cloud-based services.

Thanks to the immense diversity and potential of cloud resources, coupled with uber-customizable software applications, organizations can empower their workers and organize workflows better than ever before.

Engineering Productivity – A New Hope for Government Efficiency

In light of this paradigm shift, government agencies (aside from the military), notorious for being latecomers to technological advancements, have started laying the mobile-powered blueprint to make the "Government on the Go" dream come true.

Engineering efficiency while minimizing expenses is the heart of the public sector workflow agenda, and it looks like mobile-enabled governance is definitely going to serve them well.


A recent report from Deloitte on the subject of public sector mobile technology deployment revealed that the claims of productivity boosts are certainly not exaggerated.

It showed that only 7 percent of U.S. Federal government employees engage in teleworking despite 32 percent being eligible to do so. Even if they all teleworked for half their working hours, it could reduce absenteeism, cut down office costs, and heighten productivity to provide a total savings of $5.4 billion per year.

Think about it! Whether it is 30 minutes of daily extra field time for police officers, or 2 hours saved by caseworkers, the productivity effect adds up.

It is not just a boon for public-sector workers, but also a great relief to taxpayers’ pockets.

However, the bulk of the information transmitted through mobile is sensitive data that IT must safeguard. At the same time, IT must be able to respect the privacy of workers and deploy problem-specific mobile applications that add efficiency to the workflow.

Hence, a reliable MDM solution is of paramount importance to unlock the full potential of mobile-enabled governance.

NASA is one of the few government agencies that walks a fine line between device and data management policies. Even though a huge portion of its scientific data is freely accessible to the public, NASA must also ensure that the integrity of its back-end business and operational networks is not compromised.

NASA manages access from employee mobile devices using a Defense Department mobile security standard.

Currently, the agency is experimenting with a number of mobile device management contracts across its numerous facilities and is also setting up its own BYOD effort, borrowing structure from the mobile program set up for the Nuclear Regulatory Commission.

2 Key Concerns MDM Vendors Must Address

  1. Authentication Standards

As of now, mobile authentication seems to be the chink in the armor that cybersecurity agencies must fortify. Government workflows generally rely on a multi-factor authentication protocol for sanctioning access to sensitive data.

A reliable multi-factor authentication approach independently validates three key security factors:

  • User-exclusive knowledge
  • Ownership of the device registered with the agency
  • The role of the user in the agency

Conventionally, multi-factor authentication is carried out through methods such as entering a passcode generated by a physical security token, presenting a special key card, or verifying identity via biometric tools.
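A common way to prove the “ownership” factor is a time-based one-time password (TOTP) generated on the registered device. The sketch below, using only Python’s standard library, shows how a verifier might derive and check such a code per RFC 6238; the base32 secret and the one-step clock-drift window are illustrative assumptions, not any agency’s mandated protocol.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, at: int, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time password (RFC 6238) from a shared secret."""
    key = base64.b32decode(secret_b32)
    counter = at // step                      # number of 30-second steps elapsed
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret_b32: str, submitted: str, at=None) -> bool:
    """Check a submitted code, allowing one time step of clock drift either way."""
    at = int(time.time()) if at is None else at
    return any(
        hmac.compare_digest(totp(secret_b32, at + drift * 30), submitted)
        for drift in (-1, 0, 1)
    )
```

With the RFC 6238 reference secret (ASCII “12345678901234567890” in base32), the code at Unix time 59 is “287082”, matching the specification’s test vector.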

  2. Employee Awareness

In a research study on mobile readiness conducted by Mobile Work Exchange, there were strong indications that potential security breaches stem from inadequate employee training.

Nearly 31 percent of employees at mobile-ready agencies work without having received basic remote-working training for mobile devices, and 18 percent have never received any security-related training.

Human error is the leading cause of security failure, and even the most robust mobile device management solution must be supplemented with good tutorials to empower the workers and ensure bulletproof data and system security practices.


The Way Forward

A few government agencies are also focusing on centralization of IT and mobile device support functions. They issue tablet computers and smartphones to designated personnel who can log into the network and access information via a virtual desktop interface (VDI).

In order to achieve their vision of highly granular control and easy data migration in the face of hardware and software obsolescence, government agencies must acquire an MDM system that offers the following features:

  • Secure Container – Ensures that only fully certified apps operate and exchange data with one another.
  • App-Level Security Policies – Enhanced password authentication standards for launching applications.
  • Secure Network Access – Authenticates devices and grants access only to devices provisioned for fixed servers and services.
  • Powerful Encryption Standards – Strong AES encryption to safeguard data in transit as well as at rest.
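As a rough illustration, the container allowlist, password policy, and network-access checks above might reduce to a policy gate like the following sketch. The app names, server name, and password rules are hypothetical placeholders for this example, not any vendor’s actual API.

```python
from dataclasses import dataclass

# Hypothetical policy data illustrating the MDM features above.
ALLOWED_APPS = {"secure-mail", "case-files"}       # secure container allowlist
PROVISIONED_SERVERS = {"intranet.agency.example"}  # fixed servers for this fleet

@dataclass
class Device:
    device_id: str
    enrolled: bool  # registered with the agency's MDM

def meets_password_policy(password: str) -> bool:
    """App-level policy: minimum length plus mixed character classes."""
    return (len(password) >= 12
            and any(c.isupper() for c in password)
            and any(c.isdigit() for c in password))

def may_access(device: Device, app: str, server: str) -> bool:
    """Grant access only to enrolled devices running allowlisted apps
    against servers the device was provisioned for."""
    return (device.enrolled
            and app in ALLOWED_APPS              # secure container
            and server in PROVISIONED_SERVERS)   # secure network access
```

For example, an enrolled tablet running “secure-mail” against the provisioned server passes the gate, while the same request from an unenrolled device is refused.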

Although BYOD policies offer a great deal of employee liberation, they also present key security and data-access management challenges that must be addressed so that convenience does not come at a costly price. Hence, a reliable mobile device management solution is essential for stabilizing workflows, protecting against malware and security breaches, and managing employee access according to clearance level.

 Author: Prashant Bajpai

Image Courtesy: noppasinw