Archive for the ‘government’ Category

When Does AWS GovCloud Make Sense?

 

With GovCloud, AWS has changed the game by providing a comprehensive, dependable way to implement and manage business technology infrastructure. Building on back-end infrastructure AWS has spent over a decade refining, GovCloud delivers one of the most reliable, cost-efficient and scalable web infrastructures available. GovCloud was launched in 2011 to satisfy stringent regulatory requirements for local, state and federal governments. AWS's efforts to meet regulatory standards and keep features consistent between its public-sector and commercial offerings have led to the addition of dozens of new services and nine new commercial regions worldwide. This lets agency IT departments reap the same cloud computing benefits enjoyed by all other AWS users, such as improved scalability and agility and closer alignment of costs with usage.

Amazon explains that GovCloud addresses specific regulatory and compliance requirements, such as the International Traffic in Arms Regulations (ITAR), which govern how defense-related data is stored and managed. To guarantee that only designated individuals within the United States have access, GovCloud segregates data both physically and logically. AWS GovCloud is not limited to government agencies; the region is also available to vetted organizations and contractors operating in regulated industries, such as government contractors that must secure sensitive information.

When Does AWS GovCloud Make Sense?

I. High Availability Is Important to Mission Critical Applications

Building highly available, reliable infrastructure in an on-premises data center is a costly endeavor. AWS offers the services and infrastructure to build fault-tolerant, highly available systems. By migrating applications and services to AWS GovCloud, agencies not only benefit from the many features of cloud computing but also see immediate improvements in the availability of their applications and services. With the right architecture, agencies get a production environment with a higher level of availability, without additional processes or complexity.

Some of the services GovCloud users can access to get this out-of-the-box redundancy, durability and availability include:

  • EC2 coupled with Auto Scaling – for scalable compute capacity
  • VPC – to provision private, isolated sections of AWS
  • Elastic Load Balancing (ELB) – to automatically distribute incoming application traffic across multiple EC2 instances
  • Direct Connect – to establish a private connection between an AWS GovCloud region and your data center
  • Elastic Beanstalk – to deploy and scale web apps and services
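From a developer's perspective, GovCloud is addressed like any other AWS region: clients simply target the us-gov-west-1 region identifier. A minimal sketch, assuming only that service hostnames follow AWS's standard service.region.amazonaws.com endpoint convention:

```python
def service_endpoint(service: str, region: str = "us-gov-west-1") -> str:
    """Build the standard AWS endpoint hostname for a service in a region."""
    return f"{service}.{region}.amazonaws.com"

# GovCloud endpoints look just like commercial-region endpoints:
print(service_endpoint("ec2"))                    # ec2.us-gov-west-1.amazonaws.com
print(service_endpoint("elasticloadbalancing"))   # ELB in the GovCloud region
```

In an SDK such as boto3, the equivalent step is simply passing the GovCloud region name when creating a client, with credentials belonging to a GovCloud account.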

II. Big Data Requires High-Performance Computing

User productivity and experience are key considerations, and both hinge on the performance of applications in the cloud. Government agencies typically amass huge data sets that carry crucial insights. AWS GovCloud lets you spin up large clusters of compute resources on demand, paying only for what you use, to obtain the business intelligence required to fulfill your missions and serve your citizens. GovCloud also provides low-cost, flexible IT resources, so you can quickly scale any big data application, including serverless computing, Internet of Things (IoT) processing, fraud detection, and data warehousing, and easily provision resources of the right size and type to power your big data analytics applications.

III. High Data Volume Means Higher Storage and Backup Needs

A major consideration when migrating to the cloud is secure, scalable storage. For government organizations, this need is amplified, not only because of the volume of data that must be stored, but also because of its sensitive nature. AWS provides scalable capacity and direct access to durable, cost-effective cloud storage managed by U.S. persons, while satisfying strict security requirements. GovCloud users have access to multiple storage options, ranging from high-performance object storage to file systems attached to an EC2 instance. AWS also offers a native scale-out shared file storage service, Amazon EFS, which gives users a file system interface and file system semantics. Amazon Glacier and S3 provide low-cost options for the long-term storage of huge data sets.
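Long-term archival to Glacier is usually wired up declaratively. The dictionary below is a sketch that follows the shape of an S3 lifecycle configuration; the rule ID and the `archive/` prefix are hypothetical:

```python
import json

# Hypothetical rule: move objects stored under "archive/" to Glacier
# after 90 days. The structure mirrors the S3 lifecycle configuration API.
lifecycle = {
    "Rules": [
        {
            "ID": "archive-to-glacier",          # illustrative rule name
            "Filter": {"Prefix": "archive/"},    # only keys with this prefix
            "Status": "Enabled",
            "Transitions": [
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
        }
    ]
}

print(json.dumps(lifecycle, indent=2))
```

A configuration like this would be applied to a bucket with the SDK or CLI (for example, boto3's `put_bucket_lifecycle_configuration`), after which S3 migrates matching objects automatically.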

Customers can have information stored in Redshift, Glacier, S3 and RDS automatically encrypted with AES-256, a symmetric-key encryption standard that uses 256-bit keys. Additionally, with very simple procedures, IT systems can be backed up and restored at a moment's notice.
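For S3, that encryption is requested per object (or enforced bucket-wide). As a sketch, the upload parameters below show the server-side encryption setting the S3 PutObject API accepts; the bucket and key names are placeholders:

```python
# Hypothetical PutObject parameters. "AES256" selects S3-managed 256-bit
# AES keys (SSE-S3); "aws:kms" would select KMS-managed keys instead.
put_params = {
    "Bucket": "example-agency-records",   # placeholder bucket name
    "Key": "reports/2017/q3.pdf",         # placeholder object key
    "ServerSideEncryption": "AES256",     # encrypt at rest with AES-256
}

print(put_params["ServerSideEncryption"])
```

With boto3 these parameters would be passed as keyword arguments to `s3.put_object(**put_params)` along with the object body.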

IV. Critical Applications Should Scale With User Demand

Predictable workloads can be handled economically with Reserved Instances, while unpredictable spikes call for On-Demand resources. AWS uses advanced networking technology built for scalability, high availability, security and reduced cost. Using features such as Elastic Load Balancing and Auto Scaling, GovCloud users can easily scale on demand. Auto Scaling enables government agencies to maintain application availability by dynamically scaling EC2 capacity up or down according to conditions they specify. Amazon Elastic Compute Cloud (EC2) provides resizable, secure compute capacity in the cloud. It is built to make web-scale computing simpler, enabling users to quickly and efficiently adjust capacity as computing requirements change.
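The "scale up or down depending on the specified conditions" logic can be sketched as a simple threshold rule. This is a toy model, not the actual AWS policy engine; the CPU thresholds and capacity bounds are illustrative:

```python
def desired_capacity(current: int, cpu_pct: float,
                     scale_out_at: float = 70.0, scale_in_at: float = 30.0,
                     minimum: int = 2, maximum: int = 10) -> int:
    """Toy target-capacity rule: add an instance when CPU is above the
    high threshold, remove one below the low threshold, and clamp the
    result to the group's [minimum, maximum] bounds."""
    if cpu_pct > scale_out_at:
        current += 1
    elif cpu_pct < scale_in_at:
        current -= 1
    return max(minimum, min(maximum, current))

print(desired_capacity(4, 85.0))   # high CPU -> scale out to 5
print(desired_capacity(2, 10.0))   # low CPU, but the floor of 2 holds
```

In a real Auto Scaling group the same idea is expressed declaratively: CloudWatch alarms on a metric trigger scaling policies, and the group keeps capacity within its configured min/max.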

In Closing

As the number of government organizations moving to the cloud continues to rise, these organizations will require a platform for compliance and risk management – a place where confidential, sensitive or even classified data and assets remain secure. GovCloud provides a quick way for government agencies to host and update cloud data and applications so that contractors and employees can focus on service delivery rather than managing server infrastructure.

Government organizations can take full advantage of GovCloud and all that it has to offer via content collaboration software. FileCloud on AWS GovCloud is an ideal solution for government agencies that want complete control and security of their files.
Click here to learn more about FileCloud on AWS GovCloud.

 

Author: Gabriel Lando

FileCloud Empowers Government Agencies with Customizable EFSS on AWS GovCloud (U.S.) Region

FileCloud, a cloud-agnostic Enterprise File Sharing and Sync platform, today announced availability on AWS GovCloud (U.S.) Region. FileCloud is one of the first full-featured enterprise file sharing and sync solutions available on AWS GovCloud (U.S.), offering advanced file sharing, synchronization across OSs and endpoint backup. With this new offering, customers will experience the control, flexibility and privacy of FileCloud, as well as the scalability, security and reliability of Amazon Web Services (AWS). This solution allows federal, state and city agencies to run their own customized file sharing, sync and backup solutions on AWS GovCloud (U.S.).

“Having FileCloud available on AWS GovCloud (U.S.) provides the control, flexibility, data separation and customization of FileCloud at the same time as the scalability and resiliency of AWS,” said Madhan Kanagavel, CEO of FileCloud. “With these solutions, government agencies can create their own enterprise file service platform that offers total control.”

Government agencies and defense contractors are required to adhere to strict government regulations, including the International Traffic in Arms Regulations (ITAR) and the Federal Risk and Authorization Management Program (FedRAMP). AWS GovCloud (U.S.) is designed specifically for government agencies to meet these requirements.

By using FileCloud and AWS GovCloud (U.S.), agencies can create their own branded file sharing, sync and backup solution, customized with their logo and running under their URL. FileCloud on AWS GovCloud offers the required compliance and reliability and delivers options that allow customers to pick tailored cloud solutions. FileCloud is a cloud-agnostic solution that works on-premises or on the cloud.

“FileCloud allows us to set up a secure file service, on servers that meet our clients’ security requirements,” said Ryan Stevenson, Designer at defense contractor McCormmick Stevenson. “The easy-to-use interfaces and extensive support resources allowed us to customize who can access what files, inside or outside our organization.”

Try FileCloud for free!

GDPR – Top 10 Things That Organizations Must Do to Prepare

May 25, 2018 – that’s probably the biggest day of the decade for the universe of data on the Internet. On that date, Europe’s data protection rules – the European General Data Protection Regulation (GDPR) – become enforceable. Initial conversations around GDPR began in 2012, followed by lengthy negotiations that ultimately culminated in the GDPR proposal. At the time of writing this guide (September 2017), most European businesses have either started making their first moves toward GDPR compliance or are all set to do so. Considering that GDPR is a stringent regulation with provisions for significant penalties and fines, it’s obvious how important a topic it has become for tech-powered businesses.

Now, every business uses technology to survive and thrive, and that’s why GDPR is relevant to most businesses. For any business owner, entrepreneur, enterprise IT leader, or IT consultant, GDPR is as urgent as it is critical. However, it resembles the Y2K problem in that everybody is talking about it without really knowing much about it.

Most companies are finding it hard to understand the implications of GDPR and what they need to do to be compliant. All businesses handle customer data, and that makes them subject to Data Protection Act (DPA) regulations. If your business already complies with the DPA, the good news is that you already have the most important bases covered. Of course, you will still need to understand the GDPR, cover the missing bases, and stay safe, secure, reliable, and compliant in the data game. Here are 10 things businesses need to do to be ready for GDPR.

Top 10 things that organizations should do to prepare and comply with GDPR

1.      Learn, gain awareness

It is important to ensure that key people and decision makers in your organization are aware that the prevailing law is changing to the GDPR. A thorough impact analysis needs to be done, and any areas that could cause compliance issues under the GDPR need to be identified. A good starting point is to examine your organization's risk register, if one exists. GDPR implementation can have significant resource implications, particularly at large and complex organizations. Compliance could be a difficult ask if preparations are left until the last minute.

2.      Analyze information in hand

It is necessary to document what personal data you hold, where it came from, and who it is shared with. You may need to organize an organization-wide information audit; in some cases, an audit of specific business areas will suffice.

The GDPR requires you to maintain records of all your data processing activities. It is also built for a networked world: if you have shared incorrect personal data with another organization, you are required to inform that organization so it can fix its own records. This automatically requires you to know what personal data you hold, where it came from, and who it is shared with. The GDPR's accountability principle further requires organizations to be able to demonstrate compliance with the data protection principles the regulation imposes.

3.      Privacy notices

It is important to review the privacy notices currently in place and put a plan in place for making any required changes before GDPR implementation. When collecting personal data, you currently need to provide specific sets of information, such as your identity and how you propose to use the data. This is generally done with a privacy notice.

The GDPR requires you to provide some additional information in your privacy notices, such as the exact legal basis for asking for the data and the retention periods for it. You are also required to state explicitly that people have a right to complain to the ICO if they believe there is a problem with the way their data is being handled. The GDPR requires privacy notices to be written in clear, concise, easy-to-understand language.

4.      Individual rights

You should review your procedures to confirm that they cover all the individual rights set out in the GDPR, namely the rights:

  • To be informed
  • Of access
  • To rectification
  • To erasure
  • To restrict processing
  • To data portability
  • To object
  • To not be subject to automated decision-making, including profiling

This is an excellent time to review your procedures and ensure that you will be able to handle various types of user requests related to their rights. The right to data portability is new with the GDPR. It applies:

  • To personal data provided by an individual;
  • When processing is based on individual consent or to perform a contract; and
  • Where processing is being done by automated methods.
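In practice, honoring a portability request means handing the individual their data in a structured, commonly used, machine-readable format such as JSON. A minimal sketch, in which the record fields are hypothetical:

```python
import json

def export_personal_data(record: dict) -> str:
    """Serialize a data subject's personal data to a machine-readable
    format (JSON), as the right to data portability requires."""
    return json.dumps(record, indent=2, sort_keys=True)

# Hypothetical subject record held by a controller:
subject = {
    "name": "Jane Doe",
    "email": "jane@example.org",
    "consents": ["newsletter"],
}
print(export_personal_data(subject))
```

A real export would of course pull the record from your systems and cover every category of personal data held, but the key point is the structured, portable output format.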

5.      Subject access requests

You will need to plan how to handle requests in a manner compliant with the new rules, updating your procedures wherever needed.

  • In most cases, you will not be allowed to charge people for complying with a request
  • Instead of the current 40 days, you will have only a month to comply
  • You are permitted to charge for, or refuse, requests that are manifestly excessive or unfounded
  • If you refuse a request, you are required to tell the individual why, and to inform them that they have the right to complain to the supervisory authority and to a judicial remedy. This must be done, at the very latest, within a month.
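The one-month response clock above can be sketched as a small date calculation. This is a simplification; consult ICO guidance for exactly when the clock starts and how month-end edge cases are treated:

```python
from datetime import date
import calendar

def response_deadline(received: date) -> date:
    """Deadline one calendar month after receipt; if the next month is
    shorter, fall back to its last day (e.g. 31 Jan -> 28 Feb)."""
    year = received.year + (received.month // 12)   # roll over after December
    month = received.month % 12 + 1
    day = min(received.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

print(response_deadline(date(2018, 5, 25)))  # 2018-06-25
print(response_deadline(date(2018, 1, 31)))  # 2018-02-28 (short month)
```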

6.      Consent

It is important to review how you seek, record and manage consent, and whether any changes are required. Consent must be freely given, specific, informed and unambiguous. A positive opt-in is required; consent cannot be inferred from silence, pre-ticked boxes or inactivity. The consent request must be separate from other terms and conditions, simple methods must be provided for individuals to withdraw consent, and consent must be verifiable. You are not required to refresh all existing DPA consents in preparation for the GDPR, but any that do not meet the GDPR standard will need to be refreshed.

7.      Aspects related to children

Start considering whether you need systems in place to verify the ages of individuals and to obtain consent from parents or guardians before carrying out any data processing activity. The GDPR introduces specific consent requirements for children's personal data. If your company offers online services to children, you may need a parent's or guardian's consent to lawfully process their personal data. Under the GDPR, the minimum age at which a child can give consent to this sort of processing is 16; in the UK, this may be lowered to 13.

8.      Aspects related to data breaches

You should ensure that you have the right procedures in place to detect, investigate and report personal data breaches. The GDPR imposes a duty on all companies to report certain types of data breach to the ICO, and in some situations to the individuals affected. The ICO must be notified of a breach where it is likely to risk the rights and freedoms of individuals, for example through damage to reputation, discrimination, financial loss or loss of confidentiality. In most such cases, you will also have to inform the affected individuals directly. Failure to report a breach can attract a fine on top of the fine for the breach itself.

9.      Requirements related to privacy by design

The GDPR turns privacy by design into a concrete legal requirement, under the umbrella of “data protection by design and by default.” In some situations, it also makes Privacy Impact Assessments, which the regulation calls “Data Protection Impact Assessments” (DPIAs), mandatory. A DPIA is required whenever data processing is likely to pose a high risk to individuals, such as when:

  • New technology is being put in place
  • A profiling action is happening that can significantly affect people
  • Processing is happening on a large set of data

10.  Data protection officers

A specific individual needs to be designated to hold responsibility for data protection compliance. You must designate a data protection officer if:

  • You are a public authority (courts acting in their judicial capacity are exempt)
  • You are an organization that carries out regular and systematic monitoring of individuals on a large scale
  • You are an organization that performs large-scale processing of special categories of data, such as health records or data relating to criminal convictions

Many of the GDPR’s important principles are the same as those defined in the DPA; still, there are significant updates that companies will need to make in order to stay on the right side of the GDPR.

Author: Rahul Sharma

Sources

https://ico.org.uk/media/1624219/preparing-for-the-gdpr-12-steps.pdf

https://ico.org.uk/for-organisations/data-protection-reform/overview-of-the-gdpr/

 

 

Right Data Storage For Government Organizations: Public vs. On-Premise


Largely fueled by the US Federal Cloud Computing Strategy, many government institutions and organizations have been gradually migrating to the cloud. The cloud has proven beneficial not only to businesses but also to government organizations as they seek to provide better services to their citizens. Government investment has grown accordingly; according to IDC projections, spending currently stands at $118.3 million on public cloud and $1.7 billion on private cloud.

By 2012, more than 27% of state and local government organizations had already adopted the cloud. After the Federal Data Center Consolidation Initiative started gaining traction in 2011-2012, the number increased tremendously, to more than 50% by 2014. Although it doesn't strictly limit organizations to the cloud, the initiative was enacted as a strategy to reduce government expenditure and improve the quality of services rendered to citizens through the consolidation of data centers. Scattered data warehouses were merged into single data centers, with some government agencies opting to move to the cloud and others relying on consolidated in-house data centers. As a result, a lot of underutilized real estate was freed, expenditure fell, and, according to the MeriTalk report "The FDCCI Big Squeeze", government IT staff and resources were reassigned to more critical tasks.

So what is the right data storage for government organizations: public cloud or on-premises? Although both approaches have proven beneficial, it's important to compare them critically. This post covers some of the key considerations in selecting data storage for government: security, scalability, costs and service delivery. Government institutions are unique, and each will ultimately strategize most effectively based on its own data-storage plans.

Security

In an era where cyber warfare is an increasingly preferred method of engagement, government data centers face significant threats from both domestic and foreign hackers. One recent major attack was directed at the healthcare.gov database. The site, first attacked in 2013, fell victim to hackers again in July 2014, when a test server was targeted. Although no private data was lost, the attack shook Americans by exposing the federal system's vulnerabilities.

To prevent a recurrence, the government has been taking data security very seriously, leveraging solutions that strictly adhere to security compliance requirements. The least sensitive data is transmitted through and stored in the cloud, managed by dedicated teams of third-party security experts. The most sensitive data, on the other hand, is best stored in in-house data centers, where access is strictly controlled by security agents and regulated through anti-theft software. A good example is the expansive NSA data center in Utah, which is tightly secured and can only be accessed by high-ranking NSA officials.

Service Delivery

The type of services a government organization provides significantly dictates the most effective delivery and storage infrastructure. Services like healthcare and tax filing, which are largely citizen-centered, are best delivered through the cloud, using websites and applications available to all citizens. Through this strategy, the government has managed to reduce queues at its central stations and facilitate remote access to government resources, even for citizens abroad. Additionally, moving apps to the cloud has helped cut annual expenditure by 21% among the affected organizations.

Other government organizations deal with data and services that are agent-centered and open only to government agents. They can therefore conveniently use in-house solutions and avoid deploying sensitive apps and data to the cloud. The NSA again falls into this category, since it uses dedicated servers (as opposed to the cloud) to store and distribute most of its data exclusively among its agents.

Costs

Setup, deployment, operational and maintenance costs are among the most critical factors in evaluating the suitability of solutions for all types of organizations, business and government alike. On average, organizations use about 5-20% of total lifecycle capital (depending on the scope of the respective data centers) to set up the necessary infrastructure: facilities, networks, devices and servers. 50-80% of operating costs go to maintenance, upgrades, real estate and the like, so on-premises implementations can have very different cost profiles depending on their complexity and scale. The cost of cloud storage, on the other hand, scales more linearly, since logistics are effectively outsourced to managed service providers. In many cases, extending an on-premises solution can cost significantly less for an organization that has already set up a stable, comprehensive in-house data center.

Scalability

Both in-house and cloud solutions are scalable, but scaling the former is a more complicated process. To upgrade or scale up in-house solutions, government organizations must acquire all the requisite equipment, plus the labor to oversee and manage the process. They must also commit more real estate to expand their server rooms as the number of servers grows. This process can be time-consuming compared to the much simpler process of scaling up public cloud storage: public cloud servers are inherently elastic and can scale almost without limit as an organization grows.

Hybrid Storage Option

With 70% of organizations (including government organizations) reportedly using or evaluating hybrid solutions, the hybrid approach is widely regarded as the most effective and convenient. It allows organizations to leverage both in-house and cloud storage and take advantage of both sets of benefits. To use it in your government organization, first evaluate all your resources against the benefits of both in-house and cloud storage. Then move only the services and applications best suited to the cloud, retaining the rest within your data center for complete control.

Author: Davis Porter
Image Courtesy: Freedigitalphotos.net, jscreationzs

Top Cloud Security Trends for Government Organizations


According to a report by the RAND Corporation, the cyber black market is growing steadily: hackers are more collaborative than ever and consistently use sophisticated strategies to target and infiltrate data centers. In the past, they were driven by sheer notoriety and malice, attacking data centers to prove their skills to their peers. That has gradually changed, and hackers are now driven by cyber-warfare agendas and a maturing black market, where they sell valuable information to the highest bidders.

Their biggest prey, of course, are government data centers, which are particularly targeted by cyber armies with agendas against their target nations. In fact, governments now face potentially more damaging risks from cyber warfare than from conventional engagement: a single individual with just a computer could launch an attack against major government cloud databases, cripple them, and cause significant socio-economic damage. One notable attack was directed at Iran's nuclear centrifuges, where the attackers used the "Stuxnet" worm to harm more than 20% of the installations. Under the cover of different agendas, an Iranian hacking group also recently went on a cyber-attack spree dubbed "Operation Cleaver", which damaged essential government infrastructure in more than 16 countries.

According to experts, this is only the beginning. In research conducted by the Pew Research Center, 61% of respondents believed that a well-planned, large-scale cyber attack causing severe harm to national security will be successfully orchestrated by 2025. With such threats looming, it is essential for governments to build the most effective emerging security technologies into their clouds. Some of the current top trends include:

Improved Access Control

Many successful attacks sail through because of poor access controls at the targeted data centers. Although Sony is not a government corporation, its recent problems, which even drew government intervention, were caused largely by careless password and username practices. To prevent such attacks, government organizations are opting for advanced authentication processes for access to their cloud resources. In addition to standard two-factor authentication, organizations are implementing biometrics and secondary-device verification in their access control architectures.

Sophisticated Encryption

To make data useless to hackers who infiltrate data centers or intercept it in transit, government organizations have long encrypted their data. Unfortunately, this has proven ineffective against hackers who steal decryption keys or use sophisticated cryptanalysis to recover the data. To prevent recurrences, government organizations are stepping up their data-at-rest and data-in-transit encryption systems.

For years, they have used split-key encryption systems in which the cloud server and the endpoint user each hold encryption keys. This is gradually changing thanks to automated encryption controls that remove the user from the equation. Instead of distributing encryption keys to individual users, these systems use array-based encryption, which fragments data during storage and transmission. The meaningless fragments are transmitted individually and can only be reassembled into meaningful data when the server or endpoint device holds all of them. An eavesdropper therefore sees only meaningless fragments.

Digital Risk Officers

According to the Gartner Security and Risk Management Summit of 2014, 2015 will see a proliferation of digital risk officers (DROs). In addition to technology officers, enterprises and government organizations will hire digital risk officers to assess potential risks and strategize on cloud and data security.

This has been necessitated by the continued expansion of the government's digital footprint, as its organizations integrate their systems with employee BYOD to improve service delivery. As networks and infrastructure grow, so do the risks, which now require dedicated experts to keep them from developing into successful attacks. With the trend only picking up in 2015, Gartner expects it to grow substantially over the next few years as organizations' networks expand; by 2017, DRO adoption among government organizations is expected to reach 33%.

Multi-Layered Security Framework

Since cloud systems are composed of various elements facing different threats, government organizations are protecting their data with tiered, multi-layered security frameworks. To gain access to any of the systems, an attacker must first get through a security model composed of several detection, resistance, defense and tracking layers.

In addition to network firewalls, the government is using virus detection and anti-spyware on its servers and storage systems to comprehensively protect server operating systems, endpoint devices, file systems, databases and applications.

Big Data Security Analytics

“Without big data analytics, companies are blind and deaf, wandering out onto the web like deer on a freeway,” wrote Geoffrey Moore, author of Crossing the Chasm, emphasizing the need to implement big data analytics in every relevant department of an organization, especially on the web.

Government organizations are adopting this principle directly by embedding big data security analytics in their security frameworks. This allows them to continuously monitor data movement and exchange, and spot potential vulnerabilities that hackers and malware could exploit. Additionally, the data generated is comprehensively analyzed to gather intelligence on internal and external threats, data exchange patterns, and deviations from normal data handling. Given its efficacy in detecting and blocking threats, Gartner predicts that 40% of organizations (both government and non-government) will establish such systems within the next five years.

Although no strategy is absolute or perfect, these trends are expected to streamline the cloud sector and give government organizations better security than in previous years. With time, this should significantly reduce the number of successful attacks on government cloud resources.

 Author: Davis Porter
Image Courtesy: Stuart Miles, Freedigitalphotos.net

Why Government Institutions are Increasingly Switching To Private and Hybrid Cloud

The year 2010 proved revolutionary for government CIOs, with the passage of the US Cloud First Policy, which significantly affected federal agencies' IT frameworks. The policy principally aimed to increase flexibility and reduce IT costs by leveraging government-optimized cloud solutions. Since then, government CIOs have been comprehensively reevaluating their IT infrastructures and budgets to include the cloud and thereby improve agency operations and service delivery.

Due to the limited capabilities and increased risk of the public cloud, most government institutions have since integrated the private cloud into their computing architectures to form stable, efficient hybrid cloud systems. A MeriTalk survey of more than 150 government IT professionals found that although private cloud is the preferred option, agencies still leverage the public cloud, particularly for public-facing operations; consequently, over 30% of them have merged the two into a hybrid cloud to optimize both agency-internal and public-facing operations.

With about 56% of federal institutions currently drawing up plans to move to the cloud, the hybrid model remains the favorite for the following reasons:

Security

Annual reports from the Identity Theft Resource Center suggest that cloud data centers are consistently targeted and intermittently attacked by malware and hackers, making them the most threatened class of data centers. With recent data breach trends further substantiating this point, federal CIOs have grown wary of the public cloud. In another MeriTalk survey, 75% of them reported that they were hesitant about moving to the cloud due to data security concerns. Additionally, 30% of them were barred from migrating their data centers to the cloud by US data security compliance requirements on sensitive governmental data.

The most effective way to overcome these issues is the hybrid cloud: while some applications run in public cloud environments, the most sensitive ones are restricted to secure private cloud data centers. Compared to the former, the latter is more secure because of controlled access and user-defined data protection strategies. The hybrid cloud therefore allows government institutions to reap the benefits of the public cloud while enjoying the data security advantages of the private cloud.

Cost Efficiency

Cost efficiency is arguably the biggest watchword in most government organizations today. The government is constantly under pressure from US citizens to reduce operational costs and account for every dollar spent within federal agencies. In response, the government enacted the Federal Data Center Consolidation Initiative to reduce IT expenditure by merging data centers through sustainable and effective cloud solutions, ultimately cutting costs on manpower, hardware, software, and underutilized real estate.

Of course, for most government institutions, migrating entirely to the public cloud would be detrimental to their operations. Therefore, to reduce costs while maintaining optimal agency processes, they have connected dedicated or private cloud resources to public components, ultimately reducing IT costs by 17%.

Scalability

Government institutions are not static; they keep changing to adapt to demanding and dynamic societal needs. Some agencies grow until they are split into several factions with defined responsibilities. Others shrink and gradually become redundant until they are dissolved. To adapt to such fluid environments, standard in-house IT resources would require drastic infrastructural changes.

With the hybrid cloud, agencies can effortlessly adjust their public cloud resources to accommodate changes while they maintain sensitive information within their private infrastructures. When an agency splits, for instance, cloud resources are divided between the resulting factions while private in-house information is shared.

Service Delivery

The principal objective in federal agencies has always been public service. While some do this indirectly, most of them serve the public directly through citizen interaction. Service delivery for such institutions has been a major problem due to a large disparity between the number of agents and citizens. Over the previous years, many technological inventions have been implemented to overcome this problem by improving speed and efficiency, ultimately boosting service delivery.

Of all the IT solutions implemented, none has proven as phenomenal as the private-plus-public hybrid cloud. While services are delivered simultaneously to different citizens through the public cloud, sensitive agency information and processes are supported by the private cloud. This has effectively improved the agent-to-citizen service ratio and consequently boosted service delivery.

Asset Utilization

For optimal service delivery, government institutions always have to effectively utilize close to 100% of their IT resources. A simple bug or hardware failure could cause a major setback, subsequently backlogging most of an agency’s operations. A complete system failure is even more detrimental since it could potentially cripple service delivery and trigger immense socio-economic losses. Case in point are the healthcare.gov breaches which occurred in 2013 and 2014, partly crippling the US federal healthcare data systems and consequently deterring efficient service delivery.

For many years, such system failures were hardly manageable since fallback plans mainly consisted of analogue paperwork systems. With the hybrid cloud, however, optimal asset utilization allows federal institutions to switch between different cloud resources according to their availability, efficacy, and suitability for different processes. If in-house software fails, for instance, an agency can temporarily shift to public cloud resources and carry on while the affected systems are repaired. Hybrid cloud systems are therefore widely considered in the implementation of strategic disaster recovery and data loss management frameworks.
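The failover behavior described above can be sketched as a simple routing routine. This is a minimal illustration under assumed names (the backend structure and health checks are hypothetical placeholders), not a production failover pattern:

```python
# Minimal sketch of hybrid-cloud failover: try the private (in-house)
# backend first, and fall back to a public-cloud backend if it is down.
# Backend names and health checks here are hypothetical.

def handle_request(payload, backends):
    """Route a request to the first healthy backend in priority order."""
    for backend in backends:
        if backend["healthy"]():
            return backend["process"](payload)
    raise RuntimeError("no backend available")

# Example: the private cloud is down, so the request falls back to public cloud.
backends = [
    {"name": "private", "healthy": lambda: False,
     "process": lambda p: ("private", p)},
    {"name": "public", "healthy": lambda: True,
     "process": lambda p: ("public", p)},
]

print(handle_request("record-lookup", backends))  # ('public', 'record-lookup')
```

In a real deployment the health check would be a monitoring probe and the routing would live in a load balancer, but the priority-ordered fallback is the same idea.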

Due to these significant benefits, the government cloud adoption curve is steepest for private and hybrid cloud, as more institutions rush to adopt them over the next few years. Although many are considering trying out the public cloud first before linking it to their private servers, the growth of hybrid cloud among government and large corporations is projected to reach 50% by 2017.

FileCloud is a leading private cloud solution that can substitute for the public cloud or augment it to create a hybrid solution. Here is an example of how FileCloud helped the Indian government launch its first private cloud solution.

Author: Davis Porter
 

Image courtesy: Stuart Miles, Freedigitalphotos.net

Take a tour of FileCloud

HIPAA 101 – An introduction to HIPAA 

HIPAA Guidance

 

HIPAA, otherwise known as the Health Insurance Portability and Accountability Act, was first introduced in 1996 and demanded that the U.S. Department of Health and Human Services (HHS) create specific regulations to protect the security and privacy of health information. To fulfill this requirement, the HHS published the HIPAA Security Rule and the HIPAA Privacy Rule. The Privacy Rule, otherwise referred to as the Standards for Privacy of Individually Identifiable Health Information, establishes national standards for protecting health information. The Security Rule addresses the technical and non-technical safeguards that covered entities need in place to ensure individuals' electronic protected health information (e-PHI) remains secure.

Within the HHS, the Office for Civil Rights is responsible for enforcing the Security and Privacy Rules through voluntary compliance activities and penalties. Before HIPAA was introduced, there was no widely accepted set of general requirements or security standards for protecting health information in the health care industry. As new technologies have evolved, however, the industry has moved away from paper processes to rely more heavily on electronic information systems to answer eligibility questions, pay claims, and conduct various other clinical and administrative functions.

HIPAA today

Currently, providers are using clinical applications such as electronic health records, computerized physician order entry, and electronic pharmacy, laboratory, and radiology systems. Health plans increasingly provide access to care management and claims, as well as various self-service options, so the medical workforce has become more efficient and mobile. However, rising online adoption has increased the potential security risks.

One of the primary goals of the security rule is to ensure that individuals’ private health information remains secure, while allowing certain entities to engage with new technologies and improve the way the patient care can work. Because the marketplace in healthcare is so vast and diverse, the security rule needed to be versatile and flexible enough to give covered entities access to policies, technologies and procedures appropriate for that entity’s size and organizational structure. At the same time, it has to make sure it doesn’t limit innovations that help the industry and help in its cause to keep electronic healthcare information of patients private.

Like the other Administrative Simplification rules, the Security Rule applies to health plans, health care clearinghouses, and health care providers who transmit health information in electronic form in connection with a transaction for which standards have been adopted under HIPAA.

The Information that Is Protected

The HIPAA Privacy Rule protects the privacy of individual health information, known as protected health information (PHI). The Security Rule, on the other hand, protects the subset of that information covered by the Privacy Rule: any individually identifiable health information a covered entity creates, receives, maintains, or transmits in electronic form.

The security rule means that any covered entity must maintain the appropriate technical, physical and administrative safeguards established for protecting personal information. Covered entities need to ensure that all of the e-PHI they create, maintain, transmit or receive is confidential, and maintains its integrity. They must also take steps to identify potential threats to the security of that information, and protect it against problems.

The Security Rule and Confidentiality

According to the security rule, confidentiality can be defined as e-PHI that is not made available or disclosed to people who are not authorized to view it. The confidentiality requirements of the security rule directly support the privacy rule when it comes to improper disclosure and use of personal healthcare information. The security rule is also used to promote further goals of maintaining the availability and integrity of e-PHI. Beneath the security rule, integrity refers to the fact that personal healthcare information in an electronic medium cannot be destroyed or altered without authorization. Availability suggests that the e-PHI is usable and accessible on demand by any person who is authorized.

One important thing to remember about the HHS is that it recognizes covered entities range from very small providers to large nationwide health plans. The Security Rule is therefore scalable and flexible, so that covered entities can assess their own needs and create solutions appropriate to their particular environments. The rule does not dictate exact measures, but it requires the entity to consider certain key factors, including:

  • The cost of security measures
  • The complexity, capability and size of the entity
  • The possible risk or impact to e-PHI
  • The software, hardware, or technical infrastructure.
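To compare candidate safeguards against these factors, a tracking tool might use a simple weighted score. The factor names and weights below are illustrative assumptions; HIPAA itself prescribes no numeric formula:

```python
# Illustrative scoring of candidate security measures against the
# Security Rule's key consideration factors. Higher score = better fit.
# Weights are hypothetical, not prescribed by HIPAA.

FACTORS = {
    "cost": -1.0,            # higher cost counts against a measure
    "entity_fit": 2.0,       # match to the entity's size and complexity
    "risk_reduction": 3.0,   # impact on risk to e-PHI
    "infra_fit": 1.0,        # match to existing software/hardware
}

def score(measure):
    """Weighted sum of a measure's ratings over the four factors."""
    return sum(FACTORS[f] * measure[f] for f in FACTORS)

# Hypothetical ratings (0-5 scale) for one candidate measure.
full_disk_encryption = {"cost": 2, "entity_fit": 3, "risk_reduction": 5, "infra_fit": 4}
print(score(full_disk_encryption))  # -2 + 6 + 15 + 4 = 23.0
```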

A self-hosted cloud such as FileCloud can help organizations in the health care industry meet HIPAA standards. Here is a great example of how FileCloud helped Precyse provide cloud features while meeting HIPAA standards.

 Author: Rahul Sharma

image courtesy: Stuart Miles,  freedigitalphotos.net

Why Government Should Focus On e-Discovery Tools

ediscovery

Would it surprise you to know that 93 percent of all documents in the world are created electronically, and 70 percent of those are never even printed? This fun fact is not from some 2015 survey, but from a study conducted in 2013!
Yes, the age of digital information dawned on us a long time ago, but we have barely scratched the surface of potential for empowerment from technology. Information harvesting is obviously the next great frontier for the tech industry, which is why the field of electronic discovery (e-discovery) has gained quite a bit of prominence over the past few years.

What is e-Discovery?
E-discovery essentially covers the searching, locating, and securing of data from a digital archive with the intent of using it as evidence to build a strong civil or criminal legal case by the government. It can be carried out offline or through a specific network. This court-supervised practice can help governments stay one step ahead of cybercriminals and bust complex illegal operations through the intelligent analysis of data.

How Governments Can Benefit From e-Discovery
For one thing, no one would ever choose a tower of paperwork over a laptop. The manual scrutiny and resources involved in managing paperwork are simply ludicrous compared to digital data. Furthermore, digital files can often be recovered and un-deleted once they have made their way into a network and onto multiple hard drives. E-discovery tools allow you to parse through all kinds of data for examination in civil or criminal litigation, including:

  • E-mail
  • Images
  • Calendar files
  • Databases
  • Spreadsheets
  • Audio files
  • Animation
  • Web sites
  • Computer programs
  • Malware
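A first pass over a digital archive often starts by triaging files into categories like those above. Here is a minimal sketch that classifies by file extension only; real e-discovery tools inspect content signatures rather than trusting names, and the extension map is illustrative, not exhaustive:

```python
import os
from collections import Counter

# Illustrative mapping from common extensions to evidence categories.
CATEGORIES = {
    ".eml": "email", ".msg": "email",
    ".jpg": "image", ".png": "image",
    ".ics": "calendar",
    ".db": "database", ".sqlite": "database",
    ".xlsx": "spreadsheet", ".csv": "spreadsheet",
    ".mp3": "audio", ".wav": "audio",
    ".html": "website",
}

def triage(paths):
    """Count files per evidence category; unknown types go to 'other'."""
    counts = Counter()
    for path in paths:
        ext = os.path.splitext(path)[1].lower()
        counts[CATEGORIES.get(ext, "other")] += 1
    return counts

print(sorted(triage(["memo.eml", "budget.xlsx", "scan.jpg", "tool.exe"]).items()))
# [('email', 1), ('image', 1), ('other', 1), ('spreadsheet', 1)]
```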

You may have heard the term 'cyberforensics' thrown around casually on a CSI episode at some point. What most people are unaware of is that it is merely a specialized version of e-discovery, used to create a digital copy of a suspect's computer hard drive for extraction of evidence while preserving the original hardware at a secure location.
EDRM e-discovery reference model (source: edrm.net)

The necessity of such instruments has been acknowledged by governments worldwide, which have consistently passed amendments and new bills to meet the challenges of administration in the 21st century. The defense and prosecution departments in particular have helped the e-discovery market blossom to $1.8 billion in 2014, with expected growth to $3.1 billion by 2018.

Professor Richard Susskind, also known as Moses to the Modern Law Firm in his circles, noted in his law firm technology predictions that many legal firms will move their data and processing to the cloud and e-discovery tools can play a big role in adding fuel to the fire by revolutionizing the process of investigation.

Although e-discovery tools are unquestionably a gamechanger for litigation cost management, there are a variety of legal, constitutional, political, and personal roadblocks that can challenge the efficacy of electronically stored information.

In the seventh annual benchmarking study of e-discovery practices for government agencies conducted by Deloitte, it was found that nearly 75 percent of respondents felt that they lacked adequate technical support when dealing with opposing counsel.

Handling, processing, reviewing, or producing electronically stored information in compliance with the Federal Rules of Civil Procedure is a challenge that gives even the most tech-savvy legal teams sleepless nights.

How the Future of e-Discovery is Only Going to Get Brighter

While there are no straightforward solutions to these tribulations, agencies do not have to resign themselves to these limitations.
Cutting-edge big data technology is booming, and dedicating resources to integrating better search and review technologies, visualization tools, extensive ESI harvesting mechanisms, and the like can address these issues rather competently. Let us take a look at some of the technologies that can help take e-discovery to the next level:

Predictive Coding

While near-duplication techniques, clustering, and email threading help organize the massive volume of data extracted from the documents under review, they do not offer much responsiveness or clarity in communicating the relevance of data to the user's needs.
Predictive coding helps pinpoint more relevant electronically stored information by rating documents on their degree of similarity in concepts and terms, a far more robust analytical technique than the alternatives.

Predictive coding can also be used to classify important documents vital in establishing claims or defenses.  This helps in the constant refinement of the search and review process to pass more strategic information to users.
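The rating step at the heart of this process can be illustrated with a simple term-overlap measure. Real predictive-coding engines use trained classifiers; this pure-Python cosine similarity over word counts is only a sketch of the underlying idea, with made-up document text:

```python
import math
import re
from collections import Counter

def term_vector(text):
    """Lower-cased word counts for a document."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_similarity(a, b):
    """Cosine similarity between two term-count vectors (0.0 to 1.0)."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# A reviewed "seed" document and two candidates from the archive.
seed = term_vector("contract breach damages settlement")
doc1 = term_vector("the settlement covered damages from the contract breach")
doc2 = term_vector("quarterly cafeteria menu update")

# doc1 shares terms with the seed; doc2 shares none and scores 0.
print(cosine_similarity(seed, doc1) > cosine_similarity(seed, doc2))  # True
```

Ranking the archive by this score surfaces the documents most similar to ones reviewers have already marked relevant, which is the refinement loop the paragraph above describes.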

Data Visualization

Infographics and other forms of visualized content are special because they make an impact on our understanding by stimulating our natural instincts of patternicity (the human tendency to identify patterns).

Visualization tools utilize machine learning and analytics to identify anomalies and build a relationship web with the information available to help review teams locate privileged materials more easily.

Effective Employee Training

Government agencies must maintain the highest standards of data investigation and processing to ensure their case is not flawed with irregularities or irrelevant information. By hiring a skilled review team, agencies can enforce much needed quality checks to guarantee the fulfillment of review goals and streamline workflows by eliminating any redundancies in this process.

With increased pressure from sequestration cuts and an ever-expanding pile of pending cases to tackle, e-discovery technology is clearly the way forward. What matters now is how fast agencies can train their staff to catch up with the technological boom and develop the necessary skill sets to harvest precious insights from an ocean of information.

 
Author: Prashant Bajpai
 

Image courtesy: Stuart Miles/ freedigitalphotos.net/

The Ultimate Guide to HIPAA Compliance

compliance
With governments ramping up their efforts to clamp down on unscrupulously liberal data industry practices to combat the growing risks of identity theft and privacy violations, it is safe to say the digital information industry is not the Wild, Wild West anymore.

Data governance in the healthcare industry is one area that has witnessed intense scrutiny and reevaluation of policies over the past few years, as organizations scrambled to comply with HIPAA standards and ensure the security of protected health information (PHI).

But here's a question: why does the federal Health Insurance Portability and Accountability Act still give people nightmares despite being enacted in 1996?

To understand this, you must first consider how the advent of advanced digital storage and distribution platforms has raised the stakes for the healthcare industry: it must manage sensitive health information, facilitate insurance claims, and control administrative costs like never before. At the same time, healthcare institutions have never faced such pressure to keep up with technology while dealing with complex, unforgiving regulations.

ePHI is merely PHI that is stored or transmitted electronically (e.g., via email, text message, website, database, online document storage, or electronic fax).

According to NIST guidelines, an individual or team in the organization must be tasked with the responsibility for ensuring HIPAA compliance and accepting business risk.

Once a clear business owner is established, a cross-disciplinary group including the technical, legal, and HR departments must work in tandem to ensure that policies are appropriately defined and correctly implemented to fulfill the objectives of the data governance strategy.

According to the HHS, no authority can grant any kind of certification to prove your organization is HIPAA compliant. HIPAA compliance is an ever-evolving process for organizations instead of a one-time landmark event, which is why deploying a tool to aggregate compliance metrics for tracking your documentation storage and sharing policies can come in handy.

Given the importance of protected health information (PHI) and HIPAA compliance, let us dig a little deeper and highlight the four key rules you will need to address:

  1. HIPAA Privacy Rule
  2. HIPAA Security Rule
  3. HIPAA Enforcement Rule
  4. HIPAA Breach Notification Rule

The general guidelines of the HIPAA Security Standard demonstrate a technology-neutral approach, which means that there are no biased leanings towards certain technological systems or cloud services as long as the data protection regulations are being addressed seriously. However, the HIPAA language does introduce two classes of policy standards for the consideration of data management teams:

  1. Required (R) – mandatory compliance standards
  2. Addressable (A) – should be implemented unless detailed risk assessments can prove that their implementation is not required or favorable to their business setting.

HIPAA Administrative Requirements

Addressable (A)

  • Employee Oversight: Employ procedures to grant and revoke access and to supervise employees working with protected health information.
  • ePHI Access: Implement procedures for providing employee access to PHI and document all services that grant access to ePHI.
  • Security Reminders: Periodically provide security and privacy policy updates and reminders to employees.
  • Protection against Malware: Implement procedures to detect and safeguard against malware.
  • Login Monitoring: Record all system logins and track any discrepancies observed.
  • Password Management: Create secure, workable protocols for password modification and recovery.
  • Contingency Plan Updates and Analysis: Establish periodic testing routines and regular review of contingency plans.

Required (R)

  • Risk Analysis: Execute a risk analysis program that evaluates the key areas where PHI is stored and used, to identify vulnerable points in the system.
  • Risk Management: Implement measures sufficient to mitigate risks to a suitable level.
  • Sanction Policy: Put sanction policies into practice for employees who violate compliance standards.
  • Information System Activity Reviews: Regularly examine system activity and logs.
  • Officers: Appoint HIPAA Security and Privacy Officers.
  • Multiple Organizations: Ensure that unauthorized access by third-party or parent organizations is prevented.
  • Response and Reporting: Track and document all security incidents and responses.
  • Contingency Plans: Create accessible PHI backups for easy restoration of lost or stolen data.
  • Emergency Mode: Establish contingency protocols so that critical business processes continue unhindered.
  • Evaluations: Carry out periodic evaluations to ensure your data governance complies with the latest laws and regulations.
  • Business Associate Agreements: Establish Omnibus-compliant contracts with business partners who can access your facility's PHI, to guarantee their compliance.
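A compliance-tracking tool can represent each of these standards along with its Required/Addressable class. A minimal sketch (the status field and helper function are hypothetical, not part of HIPAA):

```python
# Each HIPAA administrative standard is tagged Required ("R") or
# Addressable ("A"); addressable items may be skipped only with a
# documented risk-assessment justification.

checklist = [
    {"standard": "Risk Analysis",      "class": "R", "done": True},
    {"standard": "Sanction Policy",    "class": "R", "done": False},
    {"standard": "Security Reminders", "class": "A", "done": False},
]

def open_required(items):
    """Return the Required standards that are still unimplemented."""
    return [i["standard"] for i in items if i["class"] == "R" and not i["done"]]

print(open_required(checklist))  # ['Sanction Policy']
```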

 

HIPAA Physical Requirements

Addressable (A)

  • Contingency Operations: Implement procedures that provide facility access in support of data restoration under the organization's disaster recovery and emergency plan.
  • Facility Security: Implement policies and procedures to protect all ePHI stored at the facility.
  • Access Control and Validation: Control and validate individual access based on role, and control access to testing software.
  • Media Movement: Log all movements of hardware that stores ePHI.

Required (R)

  • Workstations: Apply policies to control the configuration of software on systems, and safeguard workstations by restricting access to authorized users.
  • Device and Media Disposal and Re-use: Take measures for the secure disposal and reuse of all media and devices handling ePHI.

 

HIPAA Technical Requirements

Addressable (A)

  • Automatic Logoff: Automatically terminate electronic sessions after a fixed period of inactivity.
  • Encryption and Decryption: Set up a mechanism to encrypt and decrypt ePHI as and when necessary.
  • ePHI Integrity: Apply policies to safeguard ePHI from unauthorized alteration or destruction.
  • Transmission Security: Implement technical security measures to prevent unlawful access to ePHI transmitted over a network.

Required (R)

  • Unique Tracking: Assign a unique name or number for identifying and tracking user identity.
  • Audit Controls: Employ hardware and software mechanisms to record and examine all activity pertaining to the use of ePHI.
  • Authentication: Implement protocols to verify the identity of all personnel seeking access to ePHI.
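The automatic-logoff standard, for example, boils down to a session-timeout check. A minimal sketch; the 15-minute timeout is an illustrative assumption, since HIPAA does not mandate a specific period:

```python
import time

IDLE_TIMEOUT = 15 * 60  # seconds of inactivity before logoff (illustrative)

class Session:
    """Tracks last activity and expires after IDLE_TIMEOUT seconds idle."""

    def __init__(self, now=None):
        self.last_active = now if now is not None else time.time()

    def touch(self, now=None):
        """Record user activity, resetting the idle clock."""
        self.last_active = now if now is not None else time.time()

    def expired(self, now=None):
        now = now if now is not None else time.time()
        return now - self.last_active > IDLE_TIMEOUT

s = Session(now=0)
print(s.expired(now=10 * 60))  # False: only 10 minutes idle
print(s.expired(now=20 * 60))  # True: past the 15-minute timeout
```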

 

Click here to explore how an on-premise cloud such as FileCloud helps you comply with various data governance regulations.

Author: Prashant Bajpai

image courtesy: Stuart Miles/ freedigitalphotos.net

How to Pick the Right Storage Solutions for Government Agencies

Government Data Storage

Apart from businesses and corporations, government agencies are major consumers of storage solutions nationwide. Before the Federal Data Center Consolidation Initiative (FDCCI), government agencies largely relied on in-house storage systems comprising thousands of data storage drives in unconsolidated warehouses. To establish new branches, agencies had to build new data centers to service them. Ultimately, hundreds of data centers were distributed all over the country, wasting taxpayers' money that would otherwise have been used to improve critical services.

After the president insisted that every federal dollar be spent productively, the FDCCI was enacted to shrink the data center footprint and reduce expenditure on underutilized real estate, software, hardware, and manpower. Since then, government agencies have been gradually adopting transformational storage solutions like the cloud to consolidate their data and improve their operations and overall efficiency. According to a MeriTalk report titled "The FDCCI Big Squeeze", more than 60% of government IT managers can now assign their staff to other, more critical tasks thanks to the consolidation initiative. The report further indicates that 57% of government agency managers are saving on energy costs and reinvesting those funds in other critical operations. The directive has therefore largely revolutionized government storage systems.

Although the directive applies to all government agencies, the precise storage solution depends on the needs of the respective agency. The storage principles may be similar, but the framework and architecture differ between agencies. That's why it's imperative for government agencies to comprehensively understand how to evaluate their respective needs and pick the right storage solutions. Here are the fundamental points to consider:

Type of Data

Although it's basically all 1s and 0s, not all data is equal. Some data, for instance, may be highly confidential, while other data may be open to public access. Additionally, some agencies handle huge amounts of data on a regular basis, while others are confined to small data sets that correspond to their small-scale operations.

The answers to following questions should be helpful in evaluating the type of data your agency deals with:

  • Is it public or confidential?
  • How secure should the data be?
  • How much time do you need to retain the data?
  • What type of access is required?
  • Is it large or small scale?
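The answers to these questions can feed a simple classification helper that suggests a storage tier. The tier names and rules below are illustrative assumptions, not an official government scheme:

```python
def suggest_tier(confidential, retention_years, large_scale):
    """Map data-evaluation answers to a hypothetical storage tier."""
    if confidential:
        # Sensitive data stays on controlled, private infrastructure.
        return "private-datacenter"
    if large_scale or retention_years > 7:
        # Big or long-lived public data suits scalable archival storage.
        return "public-cloud-archive"
    return "public-cloud-standard"

print(suggest_tier(confidential=True, retention_years=3, large_scale=False))
# private-datacenter
```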

 

The type of storage system you choose for your government agency should be convenient and suitable for your type of data. The NSA, for instance, uses an expansive, consolidated data center built in Utah that is designed for enormous volumes of confidential data. Smaller government agencies, on the other hand, use small-scale data centers optimized for their operations.

Scalability

Data is not static; it keeps growing and changing. Similarly, government agencies keep expanding and shifting to new areas to boost their operations. Since it's considerably expensive to purchase new hardware to accommodate growth, it's advisable to use a storage solution that's flexible enough to accommodate steady expansion.

Government agencies that have already consolidated their data in the cloud are already enjoying its scalability and increased performance. They can efficaciously upload and process data, then scale up the storage space and performance according to their fluid requirements.

Compliance Levels

To optimize service delivery, the government has implemented various compliance requirements across its departments, covering everything from data handling to inter- and intra-departmental communication. Some agencies, such as those handling security and finance, are more heavily regulated than the rest because of the sensitivity of the data they handle.

To ensure complete adherence to these compliance requirements, you should critically assess them and compare them against your storage options. If you are outsourcing data to a third-party cloud service provider, for instance, ensure that the provider meets the respective compliance requirements and holds the relevant credentials.

Accessibility

Due to the sensitive nature of governmental data, accessibility is often controlled, with the most confidential data accessible only to individuals with the highest credentials. The 1-million-square-foot NSA data center in Utah, for instance, is secured by high perimeter walls and is only open to high-ranking NSA officials. Other agents access data according to their clearance levels through the NSA network that links the database to remote computers.

Accessibility should therefore be a primary concern as you consider the right storage solution. How many people will access the data? Which framework will be implemented to control access according to the credentials? Is the data accessible to third party storage service providers?

Although access control is most critical to sensitive data, all governmental data access should be systematically regulated to avoid data loss, and any other perils which may arise due to unauthorized access and distribution.

Security

According to the Global Information Assurance Certification, cyber warfare is rapidly developing into the primary combat tactic of this century. The United States increasingly faces cyber threats, with a majority of attackers targeting governmental organizations. Infiltrating these organizations can expose United States secrets, as was seen in the WikiLeaks diplomatic crisis, which put the State Department under immense pressure after thousands of confidential cables were leaked. Additionally, attackers seek to cripple these organizations to cut off basic services delivered to American citizens.

The best strategy for protecting your organization from such attackers is to store all data in servers protected by multiple layers of security controls. Such controls comprise up-to-date firewalls, anti-malware, anti-virus, and intrusion detection systems that can detect and repel cyber-attacks. Additionally, physical protection should be provided to prevent physical access to the respective data centers and servers.

As you assess these critical elements to determine the right storage solution, it's advisable to engage experts who can advise you further on implementation. You should also do a careful analysis of the storage systems successfully implemented by other governmental organizations to gain a clear understanding of exactly what you need. Finally, ensure that all data storage systems are consolidated and virtualized so that they are remotely accessible to all relevant stakeholders.

In addition to picking the right storage, governments need a good on-premise cloud solution. Many government organizations from NASA to government of India trust FileCloud to run their on-premise cloud. Here are some case studies:

Author: Davis Porter
Image Courtesy: Vichaya Kiatying-Angsulee/freedigitalphotos.net