Archive for the ‘data governance’ Category

GDPR – Top 10 Things That Organizations Must Do to Prepare

May 25, 2018 may be the most consequential date of the decade for data on the Internet. On that date, Europe’s data protection rules – the European General Data Protection Regulation (GDPR) – become enforceable. Initial conversations around the GDPR began in 2012, followed by lengthy negotiations that ultimately culminated in the GDPR proposal. At the time of writing this guide (September 2017), most European businesses have either started making their first moves toward GDPR compliance or are about to. Given that the GDPR is a stringent regulation with provisions for significant penalties and fines, it is obviously an important topic for technology-powered businesses.

Now, every business uses technology to survive and thrive, which is why the GDPR is relevant to most businesses. For any business owner, entrepreneur, enterprise IT leader, or IT consultant, the GDPR is as urgent as it is critical. However, it is much like the Y2K problem in that everybody is talking about it without really knowing much about it.

Most companies are finding it hard to understand the implications of the GDPR and what they need to do to comply. All businesses handle customer data, and that makes them subject to Data Protection Act (DPA) regulations. If your business already complies with the DPA, the good news is that you already have the most important bases covered. Of course, you will still need to understand the GDPR, cover the missing bases, and stay safe, secure, reliable, and compliant in the data game. Here are 10 things businesses need to do to be ready for the GDPR.

Top 10 things that organizations should do to prepare and comply with GDPR

1.      Learn, gain awareness

It is important to ensure that key people and decision makers in your organization are aware that the prevailing law is going to change to the GDPR. A thorough impact analysis needs to be done, and any areas that could cause compliance issues under the GDPR need to be identified. A good starting point is to examine your organization’s risk register, if one exists. GDPR implementation can have significant resource implications, particularly at large and complex organizations, and compliance will be a difficult ask if preparations are left until the last minute.

2.      Analyze information in hand

It is necessary to document what personal data you hold, where it came from, and who you share it with. You may need to organize an organization-wide information audit; in some cases, an audit of specific business areas will suffice.

The GDPR requires you to maintain records of all your data processing activities. It is also written for a networked world: if you have shared incorrect personal data with another organization, you are required to inform that organization so it can fix its own records. Doing so is only possible if you know what personal data you hold, where it came from, and who you share it with. The GDPR’s accountability principle further requires organizations to be able to demonstrate their compliance with the regulation’s data protection principles.
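To make this concrete, here is a minimal sketch of what one entry in such a processing inventory might look like; the field names are illustrative, not mandated by the regulation:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessingRecord:
    """One entry in a data-processing inventory (illustrative fields only)."""
    activity: str                  # e.g. "newsletter mailing list"
    data_categories: List[str]     # e.g. ["name", "email address"]
    source: str                    # where the data came from
    recipients: List[str] = field(default_factory=list)  # who it is shared with
    lawful_basis: str = ""         # e.g. "consent", "contract"
    retention_period: str = ""     # e.g. "24 months after last contact"

inventory = [
    ProcessingRecord(
        activity="newsletter mailing list",
        data_categories=["name", "email address"],
        source="website sign-up form",
        recipients=["email delivery provider"],
        lawful_basis="consent",
        retention_period="until consent is withdrawn",
    )
]

# If incorrect data was shared, the inventory tells you whom to notify.
for record in inventory:
    print(record.activity, "->", ", ".join(record.recipients))
```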

3.      Privacy notices

It is important to review the privacy notices currently in place and put in a plan for making any required changes before GDPR implementation. When personal data is being collected, you currently need to provide specific sets of information such as information pertaining to your identity and how you propose to use that information. This is generally done with a privacy notice.

The GDPR requires you to provide additional information in your privacy notices, such as the lawful basis for processing the data and the retention periods for it. You must also state that people have a right to complain to the ICO if they believe there is a problem with the way their data is being handled. The GDPR requires this information to be provided in clear, concise, easy-to-understand language.

4.      Individual rights

You should review your procedures to confirm that they cover all the individual rights set out in the GDPR:

  • The right to be informed
  • The right of access
  • The right to rectification
  • The right to erasure
  • The right to restrict processing
  • The right to data portability
  • The right to object
  • The right not to be subject to automated decision-making, including profiling

This is an excellent time to review your procedures and ensure that you can handle the various types of requests users may make in exercising these rights. The right to data portability is new with the GDPR; a minimal export sketch follows the list. It applies:

  • To personal data provided by an individual;
  • When processing is based on individual consent or to perform a contract; and
  • Where processing is being done by automated methods.
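Here is a minimal sketch of a portability export, assuming a hypothetical user record assembled from your own data stores; the GDPR asks only that the output be in a structured, commonly used, machine-readable format, which JSON satisfies:

```python
import json

# Hypothetical user record assembled from your own data stores.
user_data = {
    "user_id": 1042,
    "profile": {"name": "Jane Doe", "email": "jane@example.com"},
    "consent_history": [{"purpose": "newsletter", "given_at": "2017-06-01T09:30:00Z"}],
}

# A portability export should be machine-readable and commonly used;
# JSON (or CSV/XML) satisfies that requirement.
with open("user_1042_export.json", "w") as f:
    json.dump(user_data, f, indent=2)
```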

5.      Subject access requests

You would need to plan how to handle requests in a manner compliant with the new rules. Wherever needed, your procedures will need to be updated.

  • In most cases, you will not be allowed to charge people for complying with a request
  • Instead of the current period of 40 days, you will have only a month to comply
  • You may charge for, or refuse, requests that are manifestly excessive or unfounded
  • If a request is refused, you must tell the individual why, and inform them of their right to complain to the supervisory authority and to seek a judicial remedy. This must be done within a month at the latest.

6.      Consent

It is important to review how you seek, record, and manage consent, and whether any changes are required. Consent must be freely given, specific, informed, and unambiguous. A positive opt-in is required; consent cannot be implied by silence, inactivity, or pre-ticked boxes. The consent request has to be separate from the rest of your terms and conditions, simple methods must be provided for individuals to withdraw consent, and consent must be verifiable. You are not required to automatically refresh all existing DPA consents as you prepare for the GDPR, but any that do not meet the GDPR standard will need to be refreshed.
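As a rough illustration, a verifiable consent record might capture who consented, to what, when, and how, along with a withdrawal path; the fields below are illustrative, not prescribed by the regulation:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Captures what was agreed, when, and how - so consent is verifiable."""
    user_id: int
    purpose: str                  # specific purpose, not a blanket grant
    wording_version: str          # the exact notice text version shown
    given_at: datetime            # positive opt-in timestamp
    method: str                   # e.g. "unticked checkbox on signup form"
    withdrawn_at: Optional[datetime] = None

    def is_active(self) -> bool:
        return self.withdrawn_at is None

consent = ConsentRecord(
    user_id=1042,
    purpose="marketing email",
    wording_version="privacy-notice-v3",
    given_at=datetime.now(timezone.utc),
    method="unticked checkbox on signup form",
)
consent.withdrawn_at = datetime.now(timezone.utc)  # simple withdrawal path
print(consent.is_active())  # False
```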

7.      Aspects related to children

It would be good to start considering whether systems need to be put in place to verify the ages of individuals and to obtain consent from parents or guardians for any data processing activity. The GDPR brings in specific consent requirements for children’s personal data. If your company provides online services to children, you may need a guardian’s or parent’s consent to lawfully process the children’s personal data. Under the GDPR, the minimum age at which a child can give consent to this sort of processing is 16; in the UK, this may be lowered to 13.

8.      Aspects related to data breaches

You should ensure that you have the right procedures in place to detect, investigate, and report personal data breaches. The GDPR imposes a duty on all companies to report certain types of data breach to the ICO, and in some situations to the affected individuals. The ICO has to be notified of a breach if it is likely to result in a risk to the rights and freedoms of individuals – for example, damage to reputation, discrimination, financial loss, or loss of confidentiality. In most such cases, you will also have to inform the affected individuals directly. Failure to report a breach can attract a fine in addition to any fine for the breach itself.

9.      Requirements related to privacy by design

The GDPR turns privacy by design into a concrete legal requirement under the umbrella of “data protection by design and by default.” In some situations, it also makes “Privacy Impact Assessments” mandatory; the regulation calls them “Data Protection Impact Assessments.” A DPIA is required whenever data processing is likely to pose a high risk to individuals, such as when:

  • New technology is being put in place
  • A profiling action is happening that can significantly affect people
  • Processing is happening on personal data at a large scale

10.  Data protection officers

A specific individual needs to be designated to hold responsibility for data protection compliance. You must designate a data protection officer if:

  • You are a public authority (courts acting in normal capacity exempted)
  • You are an institution that carries out regular monitoring of individuals at scale
  • You are an institution that performs large-scale processing of special categories of data such as health records or criminal convictions

Many of the GDPR’s important principles are the same as those defined in the DPA; still, there are significant updates that companies will need to make in order to stay on the right side of the GDPR.

Author: Rahul Sharma

Sources

https://ico.org.uk/media/1624219/preparing-for-the-gdpr-12-steps.pdf

https://ico.org.uk/for-organisations/data-protection-reform/overview-of-the-gdpr/

 

 

Types of Controls to Manage Your Business Data in an EFSS

EFSS Data Controls

In 2015, there were 38% more security incidents than in 2014, with an average cost per stolen record – containing sensitive and confidential data – of $154 (the healthcare industry paid the most, at $363 per record). Worse still, even though 52% of IT professionals expected a successful cyber-attack against their network within the year, only 29% of SMBs (fewer than in 2014) used standard tools such as patching and configuration management to prevent these attacks.

The consequences of poor data security and data breaches in the cloud cannot be overstated; the statistics above describe a road that no business wants to take. They also point to a lack of control over data in the cloud, so we will first look at who controls data in the cloud, and then at how to manage business data in an EFSS.

Who controls data in the Cloud?

Many IT departments do not know who controls data in the cloud, as revealed by a Perspecsys survey on data control in the cloud. According to the results, 48% of IT professionals don’t trust cloud providers to protect their data, and 57% are not certain where sensitive data is stored in the cloud.

This issue is closely tied to data ownership: once data ownership changes, we expect a change in the level of control users have over their data. To quote Dan Gray on the concept of data ownership: “Ownership is dependent on the nature of data, and where it was created”. Data created by a user before uploading to the cloud may be subject to copyright laws, while data created in the cloud changes the whole concept of data ownership. It is no wonder that there is confusion on this matter.

Despite challenges such as half or no control of data stored in the cloud, there exist techniques that we can use to control business data in an EFSS, consequently preventing unauthorized access and security breaches.

Types of data control for business data in an EFSS

 

Input validation controls

Validation control is important because it ensures that all data fed into a system or application is accurate, complete, and reasonable. One essential area of validation control is supplier assessment: is a supplier well equipped to meet a client’s expectations with regard to controls that ascertain data integrity, security, and compliance with industry regulations and client policies? This assessment is best carried out through an offsite audit in the form of questionnaires. By determining the supplier’s system life-cycle processes, your team can decide whether the EFSS vendor is worthy of further consideration. The questionnaire also serves as a basis for deciding, based on the risk assessment, whether an on-site audit should be carried out; if so, the scope of the on-site audit will depend on the type of service the EFSS vendor provides.
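As a simple illustration of the first point, a minimal input validation routine might check the completeness, reasonableness, and accuracy of an upload record before it enters the system; the fields and limits below are hypothetical:

```python
def validate_upload(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    # Completeness: required fields must be present and non-empty.
    for field_name in ("filename", "owner", "size_bytes"):
        if not record.get(field_name):
            errors.append(f"missing required field: {field_name}")
    # Reasonableness: reject sizes outside sane bounds.
    size = record.get("size_bytes", 0)
    if not isinstance(size, int) or size <= 0 or size > 50 * 1024**3:
        errors.append("size_bytes must be a positive integer under 50 GB")
    # Accuracy: filenames must not contain path-traversal sequences.
    if ".." in str(record.get("filename", "")):
        errors.append("filename must not contain '..'")
    return errors

print(validate_upload({"filename": "../etc/passwd", "owner": "", "size_bytes": 10}))
```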

Service level agreements should also be assessed and analyzed to define expectations of both an EFSS vendor and user. Usually, this is also done to ensure that the service rendered is in line with industry regulations. Additionally, we must ensure that an EFSS provider includes the following in the service level agreement.

  • Security
  • Backup and recovery
  • Incident management
  • Incident reporting
  • Testing
  • Quality of service rendered
  • Qualified personnel
  • Alert and escalation procedures
  • Clear documentation on data ownership as well as vendor and client responsibilities
  • Expectations with regards to performance monitoring

Processing controls

Processing control ensures that data is completely and accurately processed in an application, through regular monitoring of models and inspection of system results during processing. If this is not done, small changes in equipment caused by age or damage will degrade the model, which shows up as wrong control moves for the process.

Backup and recovery controls

Backup and recovery controls ensure that copies of business data are retained and can be restored after accidental deletion, corruption, or a breach. Confirm that the EFSS vendor’s backup schedule, retention periods, and recovery procedures are documented, tested, and aligned with your business requirements.

Identity and access management

Usually, Identity and Access Management (IAM) allows cloud administrators to authorize personnel who can take action on specific resources, giving cloud users control and visibility required to manage cloud resources. Although this seems simple, advancement in technology has complicated the process of authentication, authorization and access control in the cloud.

In previous years, IAM was easier to handle because employees logged into a single desktop computer in the office to access corporate information. Today, Microsoft’s Active Directory and the Lightweight Directory Access Protocol (LDAP) are, on their own, insufficient IAM tools. User access and control have to extend from desktop computers to personal mobile devices, which poses a challenge to IT. For example, a Forrester Research report states that personal tablets and mobile devices are used in 65% of organizations, and 30% of employees provision their own software on these devices for use at work, without IT’s approval. It is no wonder Gartner predicted in 2013 that cloud-based Identity and Access Management would become one of the most sought-after cloud services within just two years.

With this understanding, it is important to create effective IAM without losing control of internally provisioned resources and applications. With threat-aware identity and access management capabilities, it should be clear who is doing what, what their role is, and what they are trying to access. Additionally, user identities, including external identities, must be tied to back-end directories, and single sign-on should be used, because multiple passwords tend to lead to insecure password management practices.
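A minimal sketch of a role-based access check is shown below; the roles and permissions are illustrative, and a production system would pull them from a central directory rather than a hard-coded table:

```python
# Role-to-permission mapping; roles and permissions are illustrative.
ROLE_PERMISSIONS = {
    "admin":   {"read", "write", "share", "delete"},
    "editor":  {"read", "write", "share"},
    "auditor": {"read"},
}

def is_allowed(user_roles: set, action: str) -> bool:
    """Grant the action if any of the user's roles carries the permission."""
    return any(action in ROLE_PERMISSIONS.get(role, set()) for role in user_roles)

# Tie the check to a central identity source (e.g. a directory lookup),
# so access decisions follow the user, not the device they log in from.
print(is_allowed({"auditor"}, "delete"))  # False
print(is_allowed({"editor"}, "share"))    # True
```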

Conclusion

Simple assurance by an EFSS vendor that you have control of business data in the cloud is not enough. There are certain techniques that should be employed to make sure that you have a significant level of data control. As discussed, ensure that you have an effective identity and access management system, have processing and validation controls as well as business data backup and recovery options in place. Other important controls that we have not discussed include file controls, data destruction controls and change management controls.

Author: Davis Porter

Image Courtesy: jannoon028, freedigitalphotos.net

Sharing Large Medical Images and Files – Factors to Consider


According to data collected by the HHS Office for Civil Rights, over 113 million individuals were affected by protected health information breaches in 2015. Ninety-nine percent of these individuals were victims of hacking, while the remaining 1 percent suffered from other forms of breach such as theft, loss, improper disposal, and unauthorized access/disclosure. A quick look at the trend from 2010 shows that health information data breaches are on the rise. An even deeper look at this report shows that network servers and electronic medical records are the leading sources of information breaches, at 107 million and 3 million, respectively.

Sadly, security is not the only issue that medics face when sharing medical records. A 2014 article in the New York Times explains the difficulty medics face when trying to send digital records containing patient information. While the intention is noble—to improve patient care coordination—doctors are facing problems with their existing large file sharing options.

To help doctors share files such as medical images in an easier and safer way, we will explore four factors that should be considered.

HIPAA Compliance

Medical records are sensitive and confidential in nature, so handling them should be guided by industry regulations – in this case, the Health Insurance Portability and Accountability Act (HIPAA). HIPAA arose in part as a response to security concerns surrounding the transfer and storage of medical records, including images.

HIPAA places responsibility on medics, and healthcare providers in general, to secure patient data and keep it confidential; non-compliance can lead to costly legal action. HIPAA covers all Protected Health Information (PHI) and outlines more stringent rules for electronic PHI, mainly because a breach of electronic records is more likely to affect a large number of patients all at once.

It is a medic’s responsibility to ensure that the selected EFSS solution is HIPAA-compliant in order to maintain patient trust, keep publicity positive, and avoid the steep fines HIPAA imposes after a breach. A first offense can draw a penalty of approximately $50,000, a figure that escalates with each subsequent offense.

Encryption

This is the second level of security you should consider before settling on a large file sharing solution. Even if an EFSS service provider is HIPAA-compliant, you still need to ensure that the measures HIPAA outlines are actually in place.

When you read about patients’ rights as outlined in HIPAA, specifically the ‘Privacy, Security and Electronic Health Records’ section, you will notice that information security is emphasized. For this reason, medics should ensure that patient data is encrypted to prevent it from being accessed by rogue colleagues or professional hackers.

It is vital that all hospital departments – cardiology, imaging centers, and radiology, among others – encrypt medical images and files to further protect patient privacy. Better still, encryption should cover data both at rest and in transit, and files should only be shared with authorized specialists and physicians as well as the patients themselves.

To further tighten security, these files should be encrypted with non-deterministic encryption keys rather than fixed, password-derived keys that can be cracked. The advantage of this technique is that even in a security breach on the server side, attackers cannot recover the encryption keys. Additionally, you can opt for EFSS solutions that offer client-side encryption, barring the service provider and its employees from accessing this information.
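As a sketch of non-deterministic encryption, the snippet below uses the third-party Python cryptography package to encrypt file bytes with AES-GCM and a fresh random nonce per file, so identical plaintexts never yield identical ciphertexts; key storage and rotation are out of scope here:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

key = AESGCM.generate_key(bit_length=256)   # keep this in a key management system
aesgcm = AESGCM(key)

def encrypt_file_bytes(plaintext: bytes) -> bytes:
    nonce = os.urandom(12)                  # fresh random nonce per file
    return nonce + aesgcm.encrypt(nonce, plaintext, None)

def decrypt_file_bytes(blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, None)

blob = encrypt_file_bytes(b"DICOM image bytes ...")
assert decrypt_file_bytes(blob) == b"DICOM image bytes ..."
```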

File Scalability

Compared to other medical records, medical images present a particular challenge with regard to file size. A significant number of imaging sequences and images reportedly average 300 MB, while the average file sizes for a standard mammography image and a 3D tomography image are 19 MB and 392 MB, respectively. And these sizes keep growing: Austin Radiological Association (ARA) predicts that by 2024, annual data from its 3D breast imaging files will reach 3 petabytes. These facts expose the storage challenges that medics face.

A glance at the process of finding medical images for active cases, storing them, and archiving those of inactive cases shows the immense need for medics to find a reliable and convenient large file sharing solution that caters to these storage needs.

A weak server can get overwhelmed with data, becoming progressively less efficient as more files are uploaded into the system. The best way to solve this is to use cloud-based services that automatically scale with your needs. This way, you can upload more files to the server while cutting hardware costs by approximately 50 percent, especially when storage runs in the cloud rather than in-house. In addition, the cloud lets you share these images faster and more conveniently, saving both time and storage.

Technology Increases the Likelihood of Medical Errors

While technology helps solve issues such as security and storage, over-reliance could actually lead to medical errors, incidents that are dreadful to patients and medics as well. As reported by Eric McCann of Healthcare IT News, medical errors cost America a colossal $1 trillion each year, and 400,000 Americans die annually due to these preventable mistakes.

Even though the cloud has been paraded as a solution that reduces the incidence of medical error, the need to be watchful and keen can never be overstated. Take, for example, the erroneous click of a mouse and the mislabeling of data. A case study on Kenny Lin, MD, a family physician practicing in Washington, D.C., detailed in his 2010 piece in U.S. News & World Report, shows how easy it is to make a mistake with technology: Dr. Lin nearly made a wrong prescription by accidentally clicking on the wrong choice in his EMR system.

Now, what if you mislabeled a patient’s radiology report? Wouldn’t that start a chain of misdiagnosis and mistreatment? Can you imagine the damage caused? It is for this reason that, even when technology makes it easier to share large, sensitive files like medical images, you should double-check that the file is labeled correctly and sent to the intended, authorized recipient.

The Way Forward

The sensitivity of medical files is evident, and with data breaches on the rise, it is vital to ensure the privacy of all medical documents, including large medical images and files. To reduce the possibility of a data breach, any EFSS solution used to share these files should guarantee a reasonable level of file security and HIPAA compliance. Its capacity to efficiently handle large file sizes and offer easy access to files should not be ignored either. Lastly, remain cautious when feeding data into the system, and create a safe backup of your data in case of a breach. With such precautions, medical files can be shared between all necessary parties more easily and safely.

Author: Davis Porter

Image courtesy: freedigitalphotos.net, stockdevil

Data Owner Responsibilities When Migrating to an EFSS

While it is easy to conclude that all data belongs to your organization, complications arise when the person accountable for data ownership has to be identified. Even though the IT department spearheads the processing, storing, and backing up of data, among other functions, it does not own business data. Outsourced service providers do not own this data any more than the IT department does.

Who Owns Data? What are Data Owner’s Responsibilities?

In the cloud environment, a data owner is a business user who understands the business impact of a security breach that would lead to loss of data, integrity, and confidentiality. This responsibility makes the data owner very conscious of decisions made to mitigate and prevent such security incidents.

When migrating to an EFSS, business data owners should do the following:

Classify Data

Data classification has been extensively described as a remedy for data breaches. In essence, data classification helps significantly reduce insider threats, which are reported to cause 43% of data breaches; beyond malicious employees, breaches also result from human error. Additionally, the growing volume of data makes it difficult for businesses to track where data is stored, who accesses it, and what they do with it. By making sure that only authorized employees can access certain information, the probability of a data breach is reduced.

Clearly, the need for data classification has never been more evident. To properly classify data, a few measures should be taken by a business.

  • Have detailed “acceptable use” policies. All employees should internalize and sign these documents, which are then reviewed annually or as needed.
  • Make use of data classification technologies (see the sketch after this list). When you train employees using a data classification technology, they will better understand the sensitivity of the data they are creating, storing, and sharing, and will treat it with the appropriate level of confidentiality and caution.
  • Understand industry regulations to classify data accordingly.
  • Once data is properly classified, apply appropriate access controls and continuously yet randomly monitor data activity to nab suspicious activities as soon as they are carried out.
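As a toy illustration of the second point, a rule-based classifier might tag content by matching patterns; real classification technologies combine many more signals, and these patterns are purely illustrative:

```python
import re

# Illustrative patterns only; real classifiers combine many signals.
RULES = [
    ("confidential", re.compile(r"\b\d{3}-\d{2}-\d{4}\b")),      # SSN-like
    ("confidential", re.compile(r"patient|diagnosis", re.I)),     # health terms
    ("internal",     re.compile(r"salary|forecast", re.I)),
]

def classify(text: str) -> str:
    for label, pattern in RULES:
        if pattern.search(text):
            return label
    return "public"

print(classify("Q3 revenue forecast attached"))          # internal
print(classify("Patient diagnosis: see attached scan"))  # confidential
```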

Monitor Data Life Cycle Activities

When migrating to an EFSS, issues such as data retention and disposal should constantly be monitored by a business data owner. Simply put, how long will the EFSS solution retain your data and how long will it take to dispose of your data completely after you have deleted it? What happens to your data once your contract with the provider ends?

Before a business data owner looks at an EFSS provider’s life cycle, he needs to understand the typical seven phases of the data life cycle: data capture, data maintenance, data synthesis, data usage, data publication, data archival, and data purging. How safe is the data at each stage? Who has access to it, and how long is it retained in the EFSS?

When this data is stored, is it used and accessed by third parties, who, sadly, cause 63% of all data breaches? Is the EFSS data retention and disposal policy compliant with the law? For example, HIPAA stipulates retention requirements for health records, while organizations that accept credit cards must adhere to the Payment Card Industry Data Security Standard (PCI DSS) requirements for data retention and disposal.
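A data owner can encode retention rules and check records against them; the sketch below uses hypothetical retention periods and is not legal guidance:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention policy, in days, per data class.
RETENTION_DAYS = {"health_record": 6 * 365, "payment_card": 365, "marketing": 730}

def due_for_disposal(data_class: str, created_at: datetime,
                     now: datetime = None) -> bool:
    """True when a record has outlived its retention period."""
    now = now or datetime.now(timezone.utc)
    limit = timedelta(days=RETENTION_DAYS[data_class])
    return now - created_at > limit

created = datetime(2010, 1, 1, tzinfo=timezone.utc)
print(due_for_disposal("payment_card", created))  # True: well past one year
```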

Understand Enterprise File Sync-and-Share (EFSS) Deployment Models, As a Way of Assessing Risks

Despite the existence of extensive advice on the best EFSS solutions that exist, business data owners need to gain some technical knowledge. How many EFSS deployment models do you know, for example? Since this is a pretty broad topic, we will briefly discuss three models.

Public Cloud EFSS

In addition to being fast and easy to set up, a public cloud can be cheaper in terms of both infrastructure and storage costs. However, public cloud EFSS might not be the best option for data protection and security, leaving your company exposed and vulnerable to regulatory non-compliance. It is, therefore, important to analyze the security measures a public cloud offers before settling on one.

Private Cloud EFSS

Although private cloud is believed to be more expensive than public cloud, the cost of ownership depends largely on the vendor and infrastructure choice (for example, FileCloud offers the lowest cost of ownership across public and private clouds). Private cloud EFSS is worthwhile for the services and security offered. With an adoption rate of 77%, private cloud solutions such as FileCloud are better options, thanks to the flexibility and control they offer over where data is stored. Users can choose which regulations to comply with, and they have better control during a breach because the IT department can access all the files and monitor, protect, and salvage them, as opposed to a public cloud.

Hybrid Cloud EFSS

According to RightScale’s “Cloud Computing Trends: 2016 State of the Cloud Survey,” the hybrid cloud EFSS adoption rate is 71%. This success is attributed to the ability to harness the positive attributes of both public and private clouds at once: in a hybrid environment, some components run on premises while others run in the cloud. One good example of a hybrid model is an EFSS application that runs as Software as a Service (SaaS) while data is stored on premises, or at the discretion of the user company.

Closing remarks

It is the responsibility of a business data owner to ascertain that data will be kept safe and confidential before migrating to any EFSS solution. This person needs to understand the advantages the chosen EFSS model offers, its compliance with industry regulations, proper access and identity management, and the EFSS data life cycle, and to ensure that security measures such as data encryption and authentication are in place.

 Author: Davis Porter

 

Data ownership in the cloud – How does it affect you?

The future of the cloud seems bright: Cisco predicts that by 2018, 59% of cloud workloads will be created from Software as a Service (SaaS). While these statistics are optimistic, we cannot ignore a few concerns that stifle cloud adoption efforts, such as data ownership.

Most people would be inclined to say that they still own data in the cloud. While they may be right in some sense, this is not always the case. For instance, consider Facebook, which many people use as cloud storage for their photos. According to the Facebook end-user agreement, the company stores data for as long as it is necessary – which might not be as long as users want. This sadly means that users lose data ownership. Worse still, the servers are located in different places, in and out of the United States, subjecting data to different laws.

According to Dan Gray, as discussed in ‘Data Ownership In The Cloud,’ the actual ownership of data in the cloud may be dependent on the nature of the data owned and where it was created. He states that there is data created by a user before uploading to the cloud, and data created on the cloud platform. He continues to say that data created prior to cloud upload may be subject to copyright laws depending on the provider, while that created on the platform could have complicated ownership.

In addition to cloud provider policies, certain Acts of Congress, although created to enhance data security while upholding national security, have shown how data ownership issues affect businesses. Two of these, the Stored Communications Act (SCA) and the Patriot Act, illustrate the challenges of cloud data ownership and privacy with regard to government access to information stored in the cloud.

The Stored Communications Act (SCA)

Usually, when data resides in a cloud provider’s infrastructure, user ownership rights cannot be guaranteed. And even when users are assured that they own their data, it does not necessarily mean that the information stored there is private. For example, United States law, through the Stored Communications Act (SCA), gives the government the right to seize data stored by an American company even if it is hosted elsewhere. One interpretation of this saw Microsoft and other technology giants take the government to court, claiming that it was illegal to use the SCA to obtain a search warrant to peruse and seize data stored beyond the territorial boundaries of the United States.

Microsoft suffered a blow when a district court judge in New York ruled that U.S. government search powers extend to data stored in foreign servers. Fortunately, these companies got a reprieve in mid-2016, when the Second Circuit ruled that a federal court may not issue a criminal warrant ordering a U.S. cloud provider to produce data held in servers in Ireland. It is, however, important to note that this ruling only addressed whether Congress intended the SCA to apply to data held beyond U.S. territory; it did not address Irish data privacy law.

The Patriot Act

The Patriot Act was put in place in 2001 as an effort by the George W. Bush administration to fight terrorism. The act allowed the Federal Bureau of Investigation (FBI) to search telephone, e-mail, and financial records without a court order, and expanded law enforcement agencies’ access to business records, among other provisions. Although many provisions of the act were set to sunset four years later, the contrary happened: fast-forward to 2011, and President Barack Obama signed a four-year extension of three key provisions, which expanded the discovery mechanisms law enforcement could use to gain third-party access. This brought about an international uproar, especially from the European Union, prompting the Obama administration to hold a press conference to quell the concerns.

The situation was aggravated when a Microsoft UK director admitted that the Patriot Act could reach EU-based data, disclosing that no cloud service was safe from the act and that the company could be forced to hand over data to the U.S. government. While these provisions expired on June 1, 2015, for lack of congressional approval to renew them, the government found a way to renew them through the USA Freedom Act.

The two acts show that data stored in the cloud, especially the public cloud, is in practice controlled by the cloud providers. This is why these laws compel cloud providers, not cloud users, to produce the information.

What To Do In Light Of These Regulations

Even though courts have ruled that the SCA cannot be used to obtain warrants for data stored abroad, and the USA Freedom Act is portrayed by some as a better version of the Patriot Act, we cannot ignore the need for cloud users to find a way to avoid such compulsions.

One idea is to escape the grasp of these laws entirely, which is unfortunately impractical. To completely outrun the government, you would have to ensure that neither you nor the cloud service you use has operations in the United States. This is a great disadvantage because most globally competitive cloud providers are within United States jurisdiction. Even if you find a suitable provider elsewhere, it may still be subject to a Mutual Legal Assistance Treaty (MLAT) request. Simply put, there is no easy way out.

Instead, understand the risks and let your clients know about them. For example, if the Patriot Act extension attempts were successful, financial institutions would be obliged to share information with law enforcement agencies on suspicion of terrorist activities; a good financial institution would warn its clients of these risks beforehand. Alternatively, you can find a way of storing data in-house, forcing the authorities to go through you rather than the cloud provider.

Conclusion

Truthfully, data ownership in the cloud is a complicated issue. Determined by both government and company policies, data ownership in the cloud is not always retained. Depending on data policies and how they categorize data in the cloud, a user may be granted full ownership. Where that doesn’t happen, prepare for instances of third-party access and incomplete privacy, and rethink your business strategy accordingly. In short, as a cloud services client, pay attention to the contract you sign with your provider and understand the laws under which the provider operates.

Author: Davis Porter

HIPAA Compliant File Sharing Requirements

HIPAA Compliant File Sharing

 

In this article, let us explore the origin of the HIPAA Privacy and Security Rules and their major parts – who is covered, what information is protected, and what safeguards must be in place to protect electronic health information stored in the cloud – mainly in the context of HIPAA-compliant file sharing.

Introduction

HIPAA (the Health Insurance Portability and Accountability Act of 1996) required the Secretary of the U.S. Department of Health and Human Services (HHS) to develop regulations protecting the privacy and security of health information, including health information stored in the cloud. Accordingly, HHS published the HIPAA Privacy Rule and Security Rule.

  • The Privacy Rule establishes nationwide standards for the protection of individually identifiable health information.
  • The Security Rule establishes security standards for protecting health information that is held or transferred in electronic form.

The Security Rule puts the protections of the Privacy Rule into operation, addressing the technical and non-technical safeguards that organizations need to have in place to secure e-PHI.

Before HIPAA, there were no generally accepted security standards or requirements for protecting health information. New technologies kept evolving, and the industry began moving away from paper, relying more on electronic systems for paying claims, proving eligibility, and providing and sharing information.

Today, providers use clinical applications such as CPOE systems, EHRs, pharmacy, and radiology systems. Health plans provide access to care and claims management as well as self-service applications. This may mean the workforce is more efficient and mobile, but the potential security risk increases at the same time.

One of the main goals of this rule is to protect individual privacy with regard to health information while allowing entities to adopt new technology that improves the efficiency and quality of patient care. The Security Rule is scalable and flexible, which means covered entities can implement policies, procedures, and technologies appropriate for their size and organizational structure.

Coverage

This rule, like all administrative rules, applies to health plans, health care clearinghouses, and any health care provider who transmits health information electronically.

What’s protected?

This rule protects individually identifiable health information, known as PHI (Protected Health Information). The Security Rule protects a subset of the information covered by the Privacy Rule: all individually identifiable health information created, received, maintained, or transmitted electronically by an entity. It does not apply to PHI transmitted orally or in writing.
On a related note, here is a good article on What is PII and PHI? Why is it Important?

General Rules

The rule requires all covered entities to maintain an appropriate and reasonable technical, physical and administrative safeguard for e-PHI. Covered entities must:

  • Ensure confidentiality, availability, and integrity of e-PHI created, received, maintained or transmitted by them.
  • Identify and even protect against anticipated threats to integrity or security of information.
  • Protect against reasonably anticipated, impermissible uses or disclosures.
  • Ensure workforce compliance.

Risk Management and Analysis

The provisions in the rules require entities to conduct risk analysis as part of their security management. The risk analysis and management provisions of the rule are addressed separately here because determining which security measures are appropriate for an entity shapes how every safeguard in the rule is implemented.

Administrative Safeguards

  • Security Personnel: Covered entities have to designate security officials who are responsible for implementing and developing security procedures and policies.
  • Security Management Process: Covered entities need to identify and analyze any potential risks to e-PHI. They must implement security measures which will reduce the vulnerabilities and risks to appropriate and reasonable levels.
  • Information Access Management: The security rule tells covered entities to implement procedures and policies that authorize access to e-PHI only at appropriate times depending on the role of the user or recipient.
  • Workforce Management and Training: Covered entities need to provide appropriate authorization and supervision of workforce members who work with e-PHI. They must train all workforce members on security policies and procedures, and must have and apply sanctions against members who violate them.
  • Evaluation: Covered entities need to perform periodic assessments on how well security procedures and policies are meeting the requirements of this rule.

Physical safeguards

  • Facility Access and Control: Covered entities need to limit physical access to their facilities and ensure that only authorized access is granted.
  • Device and Workstation Security: Covered entities need to implement procedures and policies which specify the correct use of electronic media and workstations. They must also have procedures and policies in place for the removal, disposal, transfer, and reuse of media.

Technical Safeguards

  • Access Control: Covered entities need to implement technical policies and procedures that allow only authorized persons to access e-PHI.
  • Audit Controls: Covered entities need to implement software, hardware and procedural mechanisms for examining and recording access and any other activity in information systems which use or contain e-PHI.
  • Integrity Controls: Covered entities need to implement policies and procedures that ensure e-PHI isn’t improperly altered or destroyed, with electronic measures in place to confirm this (see the sketch after this list).
  • Transmission Security: Covered entities need to implement security measures that protect against unauthorized access to e-PHI being transmitted over electronic networks.
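As an illustration of the integrity-controls bullet, one common electronic measure is recording a cryptographic digest when a record is stored and comparing it later; the sketch below uses SHA-256 and a constant-time comparison:

```python
import hashlib
import hmac

def file_digest(path: str) -> str:
    """SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_integrity(path: str, recorded_digest: str) -> bool:
    """Compare against a digest recorded when the record was stored."""
    return hmac.compare_digest(file_digest(path), recorded_digest)

# Demo: record a digest at write time, verify later.
with open("record.bin", "wb") as f:
    f.write(b"e-PHI payload")
stored = file_digest("record.bin")
print(verify_integrity("record.bin", stored))  # True unless the file changed
```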

Organizational Requirements

  • Business Associate Contracts: HHS develops regulations related to business associate obligations and contracts under the HITECH Act of 2009.
  • Covered Entity Responsibilities: If covered entities know of activities or practices of associates which constitute violation or breach of their obligation, the entity needs to take reasonable steps to end the violation and fix the breach.

Procedures, Policies and Documentation Requirements

Covered entities need to adopt reasonable and appropriate policies and procedures to comply with the provisions of the rule. They must retain written policies, procedures, and records of required actions, activities, and assessments for six years after the date of their creation or last effective date.

Updates: Covered entities need to periodically update documentation as a response to organizational or environmental changes which affect the security of e-PHI.

Noncompliance Penalties and Enforcement

Compliance: The rule establishes a set of standards for the confidentiality, integrity, and availability of e-PHI. HHS and the Office for Civil Rights (OCR) are responsible for administering and enforcing these standards, in connection with their enforcement of the Privacy Rule, and may conduct complaint investigations and compliance reviews.

 

Author: Rahul Sharma

Alternative to Gartner’s Magic Quadrant Leaders – Syncplicity, Box, Citrix, Accellion

Gartner recently released its 2015 Magic Quadrant for Enterprise File Synchronization and Sharing (EFSS). The criteria for a vendor to be included in the list are revenue, geography, commercial availability, total users, largest deployment, references, and product capabilities.

Most of the criteria seem to favor companies backed by venture capitalists or larger enterprises. Other than product capabilities and the cost of the solution, the rest of the criteria do not really matter to a company looking for an EFSS solution that fits its needs. This blog will cover how FileCloud is a perfect alternative to Gartner’s Magic Quadrant Leaders in Enterprise File Synchronization and Sharing (EFSS) – Syncplicity, Box, Citrix, and Accellion – taking into account Gartner’s product capabilities inclusion criterion.

Gartner’s Magic Quadrant Leaders in Enterprise File Synchronization and Sharing (EFSS)

The comparison covers the following capability areas for FileCloud, Syncplicity, Box, Citrix ShareFile, and Accellion:

  • File Synchronization
  • File Sharing
  • File Access
  • Content Manipulation
  • Mobile OS Diversity
  • PCs
  • Security
  • Management – AD, LDAP, MDM
  • Integration
  • Delivery Model
  • File Transfer
  • Collaboration
  • Security – SAML, DRM
  • Secure Deployment
  • Management – Group Policy, Data Migration
  • Data Governance
  • Certifications, Compliance & Audit

Pricing for 100 users/year: FileCloud $4,199; Syncplicity $15,000; Box $35,000; Citrix ShareFile $22,480; Accellion $29,000.

Detailed comparisons: FileCloud vs Syncplicity, FileCloud vs Box, FileCloud vs Citrix, FileCloud vs Accellion.

Let’s take a closer look at how FileCloud provides an advantage over the products listed above.

File Synchronization

FileCloud provides effortless file synchronization across computers running Windows, Mac, and Linux, and even Netgear ReadyNAS devices. Moreover, FileCloud is one of the few products that enable real-time synchronization of files stored in network folders (CIFS, NFS).

File Sharing

With FileCloud, one can create a public or a fully private share to a folder or file. Changes to shared folders, such as uploads or file changes, are sent out as email notifications to everyone connected to the share. Activity streams for every folder allow users to review all changes that happened to a folder or file over a period of time. Additional powerful features include the ability to limit the maximum number of downloads for a shared file, automatic expiry of a share after a certain time, anonymous file uploads, download prevention, an integrated file upload widget, and more.

File Access

FileCloud offers multiple ways to access your organization’s files securely: Web access, Desktop Sync, Virtual Drive, Mobile Apps, & WebDAV.

Content Manipulation

With FileCloud one can easily view documents using the built-in Document Preview. In addition, one can edit documents from the web browser with a Lock (check-out) feature that will prevent others from editing the files at the same time.

Mobile OS Diversity

FileCloud’s highly rated apps are available for Apple iPhone, Apple iPad, Android phones, Android tablets, Windows Phone 8, and Blackberry, along with the FileCloud Metro Windows 8 app. Administrators also have tools to block any mobile device from accessing FileCloud.

PCs

FileCloud provides an array of client apps/plugins such as Drive, Sync & Outlook Plugin. The Sync app works on Windows, Mac and Linux. With FileCloud’s Outlook Plugin one can easily provide links to FileCloud files and folders.

Security

FileCloud’s security features include encryption at rest and in transit, customer-managed encryption keys, two-factor authentication, and integrated anti-virus scanning. In addition, FileCloud supports SAML (Security Assertion Markup Language)-based web browser Single Sign-On (SSO), providing full control over the authorization and authentication of hosted user accounts that access the FileCloud web interface.

Management – AD, LDAP, MDM

FileCloud was built primarily for easy integration with existing storage and authentication systems. FileCloud allows your existing Active Directory users to log in with their existing credentials. In addition, it supports multiple Active Directory servers and mixed-domain Active Directory (hosted AD).

With FileCloud, one doesn’t need to buy a separate MDM (Mobile Device Management) application. FileCloud has built-in features with which administrators can block any mobile device from access and perform a remote wipe for absolute security and control.

Integration

FileCloud offers strong integration with popular productivity apps such as Microsoft Word, Excel, and PowerPoint, as well as Apple Keynote and others. FileCloud’s comprehensive API enables seamless integration with enterprise e-discovery, DRM, and analytics platforms.

Delivery Model

In addition to providing an on-premise solution, FileCloud provides ready-made images for easy installation on Amazon Web Services and Microsoft Azure. The FileCloud public AMI (Amazon Machine Image) is currently available in the Amazon AWS Marketplace. Using FileCloud’s AMI or Azure VM, one can host a file share, sync, and mobile access solution for their organization in less than 10 minutes.

Collaboration

Built-in FileCloud features such as unlimited versioning, recycle bin support, lock while editing, and the ability to edit documents in the browser provide a powerful set of collaboration tools.

Security – SAML, DRM

FileCloud offers a SAML-based Single Sign-On (SSO) service that provides customers with full control over the authorization and authentication of hosted user accounts. FileCloud administrator tools include auditing, detailed logging of activities, and the ability to disable access or features on mobile apps. Administrators can easily set up policies for user access, password setup, and client application usage.

Secure Deployment

FileCloud can be securely deployed on enterprise private clouds and on-premise servers, or hosted with public IaaS providers like AWS and Azure. Compared to other vendors listed in the Gartner EFSS Magic Quadrant, FileCloud offers a 100% private cloud deployment without any connection to our infrastructure. One can also run FileCloud on an internal network without exposing it to the public internet.

Management – Group Policy, Data Migration

FileCloud provides an array of features for group policy enforcement, data migration, backup, and so on. FileCloud is bundled with the necessary tools to perform full backups of your cloud installation, covering both files and the database. Some features of these backup scripts (a small automation sketch follows the list):

  • Can be run at any time manually from the command line
  • Can be part of an automated system such as a cron job
  • Can be run on a live cloud installation
  • Can back up to local or remote Linux targets
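As a rough sketch of automating these scripts, the wrapper below invokes a backup command and logs the outcome; the script path and flags are placeholders, not FileCloud’s actual interface, so substitute the commands documented for your installation:

```python
import subprocess
import logging

logging.basicConfig(filename="backup.log", level=logging.INFO)

# Placeholder path and flags: substitute the actual backup script for your install.
BACKUP_CMD = ["/opt/filecloud/backup.sh", "--target", "/mnt/backups"]

def run_backup() -> None:
    """Invoke the bundled backup script and log the outcome."""
    result = subprocess.run(BACKUP_CMD, capture_output=True, text=True)
    if result.returncode == 0:
        logging.info("backup completed: %s", result.stdout.strip())
    else:
        logging.error("backup failed: %s", result.stderr.strip())

if __name__ == "__main__":
    run_backup()  # schedule via cron, e.g. as a nightly job
```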

With FileCloud, administrators can easily create groups and classify their users. With AD integration and sync functionality that maps AD groups to FileCloud groups, access control becomes much easier to maintain.

Data Governance

FileCloud provides powerful features such as content search, audit reports, and tools to extract critical data from the audit logs. The extensive auditing support makes sure every operation in FileCloud is logged, helping meet some of the toughest compliance requirements, such as HIPAA.

Certifications, Compliance & Audit

The features below set FileCloud apart even from the 2015 Gartner Magic Quadrant Leaders.

Vendors compared: FileCloud, Syncplicity, Box, Citrix ShareFile, Accellion.

  • On-premise/self-hosted deployment
  • IaaS (AWS)
  • Multi-tenancy
  • Mobile OS compatibility – all mobile OSes for FileCloud, Syncplicity, and Box; not all for Citrix ShareFile and Accellion
  • iOS and Android media backup
  • Integration with existing home directories
  • NTFS support
  • Network share support (a separate purchase with some competitors)
  • Network share versioning
  • Active Directory support (a separate purchase with some competitors)
  • Multiple Active Directory support (a separate purchase with some competitors)
  • Single Sign-On (NTLM/SSO) (a separate purchase with some competitors)
  • Data-at-rest encryption
  • File locking
  • Outlook integration
  • API support (a separate purchase with some competitors)
  • Amazon S3/OpenStack support
  • Mobile Device Management – block devices, remote wipe, notifications to devices (limited or a separate purchase with competitors)

Conclusion

With FileCloud, enterprises get one simple solution with all the needed features, ready to be installed. Moreover, for a 100-user package, FileCloud costs $4,199/year – roughly one-fifth to one-tenth the price of Gartner’s 2015 Magic Quadrant Leaders: Syncplicity, Box, Citrix, and Accellion.

Try FileCloud For Free & Receive 5% Discount

Take a tour of FileCloud

Demystifying the Complexities of Data Loss Prevention

Data loss prevention (DLP) can be defined as the set of strategies used to prevent employees from accidentally or unknowingly sending sensitive information outside the corporate network. In an increasingly connected world, there are multiple ways for confidential data to leak outside the confines of the enterprise. Gone are the days when communication within an office was limited to hardcopy, phone, or fax. Preventing sensitive information from leaving an organization has always been a major problem, but the proliferation of online communication channels and mobile devices has made it easier for data loss to occur, whether maliciously or accidentally.

While some incidents are caused by external threats (hackers), others occur because internal users carelessly trusted third parties with sensitive information. Organizations across industry verticals all over the globe have experienced their critical data being stolen, leaked or lost to the outside world. Aside from insider threats, DLP is also being driven by rigorous privacy laws, most of which have strict data access or protection components.

The threat mostly comes from within

Despite the multiple security procedures, policies, and tools put in place by enterprise IT, employees still engage in risky behaviors that endanger both corporate and personal data. Business networks have become key components of communication, collaboration, and data access, and organizations are integrating business operations with network communications to boost workforce productivity. Aside from putting more data at risk, businesses today are likely to suffer greater consequences if their data is compromised or lost. Loss of intellectual property such as financial data, product blueprints, and merger plans can not only damage a company’s reputation and brand image but also result in direct or indirect damage to the tune of millions of dollars.

In order to mitigate insider threats to data loss, tech savvy companies train employees on the risks associated with data loss after instituting strict security policies. However, the effectiveness of these actions remains questionable. The best way to curb data leakage is to understand how employee behavior increases risks and take further steps to foster a security-conscious corporate culture where employees hew to the established procedures and policies.

The tools required to mitigate data loss

The enterprise faces multiple security threats on a daily basis, and although the technology is not as habitually deployed as firewalls, DLP is without doubt a key security control against those threats. There is a general lack of agreement among IT professionals as to what constitutes a DLP solution: some limit it to complete product suites, while others also count USB port control or encryption. Research and advisory firm Gartner defines DLP solutions as:

 

Technologies that as a core function, perform content inspection of data at rest or in motion, and can execute response – ranging from simple notification to active blocking – based on policy settings.

 

From the above definition, in its simplest form a data loss prevention solution must be able to:

  • Perform deep content analysis on data in motion, at rest, and in use
  • Offer central policy management
  • Invoke policy enforcement on sensitive content

DLP solutions utilize contextual analysis and content awareness to ensure end users don’t maliciously or accidentally share data whose disclosure could put an organization at risk. DLP suites typically rely on file watermarks, regular expression-based string matching, fingerprint analysis, metadata matching, and storage point/type-based logic to pinpoint critical and confidential data and enforce the configured policy. DLP prevents data from leaking to external drives, the unauthorized emailing of confidential information, and the unauthorized upload or cut/paste of critical corporate data to external sites, among others.
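As a toy example of the regular expression-based string matching mentioned above, the snippet below flags content that matches simple sensitive-data patterns so a policy engine can block or notify; the patterns are illustrative and far cruder than production DLP rules:

```python
import re

# Two classic patterns used in string-matching DLP rules (illustrative only).
PATTERNS = {
    "ssn":  re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def inspect(content: str) -> list:
    """Return the rule names that match, so a policy engine can block or notify."""
    return [name for name, pattern in PATTERNS.items() if pattern.search(content)]

hits = inspect("Please wire to card 4111 1111 1111 1111 today")
if hits:
    print("policy violation:", hits)  # e.g. block the email, alert the admin
```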

DLP vendors mainly aim to address endpoint, email and network security; however, the environment DLP seeks to protect is undergoing radical changes as cloud and mobile technology become integral parts of the enterprise. This gives rise to the need of additional DLP features such as mobile security suites as well as web and email security gateways. Below are some of the top DLP vendors.


Symantec

Symantec is a popular player in the security market and remains one of its most reliable vendors. In its most recent release, Symantec extended DLP to cloud email and storage in order to provide clients with the visibility and control required to secure their critical data while fully utilizing the cloud. Its DLP solution has three main modules: Storage DLP, Network DLP and Endpoint DLP. The solution is capable of monitoring and preventing users from syncing corporate data from their computers to personal cloud services such as Dropbox, Google Drive, and Microsoft OneDrive.

Symantec’s DLP solution is highly favored for its ease of installation, configuration and administration. The DLP team also offers vendor service and support and works with clients to share insights into the best practices for data security. Its Data Insight solution tackles the problem of unstructured files by allowing administrators to view usage patterns and access permissions for unstructured data.

McAfee

The McAfee Data Loss Prevention solution is available via a series of virtual and physical appliances that facilitate the various DLP capabilities. There are appliances for data discovery and general DLP management of data copied to external storage. The Discover appliance is capable of spotting confidential data in an enterprise setting and can apply the configured policies to data both in transit and at rest. Discover also scans specific repositories and network resources for violations. The Prevent appliance can lock down data that is not being transmitted through an approved method.

McAfee’s DLP product has features that are specifically geared towards emerging platforms like mobile devices and social media. Clients can utilize pre-built policies for compliance regulations such as HIPAA.

Websense

The Websense data security suite includes a data security gateway, a tool for classifying and locating data across network infrastructure, and Data Endpoint, which spots and controls data being used on PCs, USB drives and other endpoint devices. The product suite is also capable of handling mobile endpoint data protection via Triton Mobile Security. This cloud-based solution is offered via VPN: any traffic from registered devices (company-owned or BYOD) is routed through the VPN, allowing Triton to block access to specific apps and websites. Triton also provides full email DLP protection.

Other players in the DLP space include: RSA (the security division of EMC), CA Technologies, Verdasys, Trustwave, Code Green Networks, Palisade Systems, InfoWatch and GTB Technologies.

Finding the Right Fit

The DLP market is evolving to meet enterprise requirements for monitoring and classifying sensitive data, wherever it is used or stored, on and off corporate networks. As the market approaches maturity, products are becoming more stable. The selection of a DLP vendor likely depends on considerations beyond feature-by-feature comparisons: factors like vendor strength, market share, reputation and total cost of ownership must also be taken into account, along with accuracy and performance, ease of use, integration and scalability.

In Closing

The data of an organization can be considered its lifeblood; all its digital assets should therefore be secured. The unintentional or intentional release of confidential data from endpoints within the enterprise is a serious problem. Aside from adhering to the best practices for data loss prevention, organizations should also invest in DLP technologies.

Alternative to Novell Filr – Why FileCloud is better for Business File Sharing?


FileCloud competes with Novell Filr for business in the Enterprise File Sync and Share (EFSS) space. Before we get into the details, I believe an ideal EFSS system should work across all the popular desktop OSes (Windows, Mac and Linux) and offer native mobile applications for iOS, Android, Blackberry and Windows Phone. In addition, the system should offer all the basics expected of EFSS: Unlimited File Versioning, Remote Wipe, Audit Logs, Desktop Sync Client, Desktop Map Drive and User Management.

The feature comparisons are as follows:

| Features | FileCloud | Novell Filr |
|---|---|---|
| On Premise | | |
| File Sharing | | |
| Access and Monitoring Controls | | |
| Secure Access | | |
| Document Preview | | |
| Document Edit | | |
| Outlook Integration | | |
| Role Based Administration | | |
| Data Loss Prevention | | |
| WebDAV | | |
| Endpoint Backup | | |
| Amazon S3/OpenStack Support | | |
| Public File Sharing | | |
| Customization, Branding | | Limited |
| SAML Integration | | Under Development |
| Anti-Virus | | |
| NTFS Support | | |
| Active Directory/LDAP Support | | |
| Multi-Tenancy | | |
| API Support | | |
| Application Integration via API | | |
| Large File Support | | |
| Network Share Support | | |
| Mobile Device Management | | |
| Desktop Sync | Windows, Mac, Linux | Windows, Mac |
| Mobile OS Compatibility | iOS, Android, Windows Phone | iOS, Android, Windows Phone |
| Pricing for 100 users/year | $3,000 | $4,500 |

From the outside looking in, the offerings all look similar. However, the two take completely different approaches to satisfying enterprises' primary need: easy access to their files without compromising privacy, security and control. The fundamental areas of difference are as follows:

Feature benefits of FileCloud over Novell Filr

Embedded File Upload Website Form – FileCloud's Embedded File Upload Website Form enables users to embed a small FileCloud interface onto any website, blog, social networking service, intranet, or any public URL that supports HTML embed code. Using the Embedded File Upload Website Form, you can easily allow file uploads to a specific folder within your account. This feature works like a file drop box, allowing your customers or associates to send any type of file without requiring them to log in or create an account.

Document Quick Edit – FileCloud's Quick Edit feature supports extensive edits of files such as Microsoft® Word, Excel®, Publisher®, Project® and PowerPoint® – right from your desktop. It's as simple as selecting a document to edit from the FileCloud Web UI, editing it using Microsoft Office, and saving; FileCloud takes care of the uninteresting details in the background, such as uploading the new version to FileCloud, syncing, sending notifications and sharing updates.

Unified Device Management Console – FileCloud's unified device management console provides simplified management of the mobile devices enabled to access enterprise data, irrespective of whether a device is enterprise or employee owned, and regardless of mobile platform or device type. Manage and control thousands of iOS and Android devices from FileCloud's secure, browser-based dashboard. FileCloud's administrator console is intuitive and requires no training or dedicated staff. FileCloud's MDM works on any vendor's network – even if the managed devices are on the road, at a café, or used at home.

Device Commands and Messaging – The ability to send on-demand messages to any device connecting to FileCloud gives administrators a powerful tool for interacting with the enterprise workforce. Any information on security threats or access violations can be easily conveyed to mobile users. And, above all, these messages carry no SMS cost.

Multi-Tenancy Support – The multi-tenancy feature allows Managed Service Providers (MSPs) to serve multiple customers using a single instance of FileCloud. The key value proposition of FileCloud's multi-tenant architecture is that data separation among tenants is maintained even as they share one deployment; a conceptual sketch of this separation follows below. Moreover, every tenant has the flexibility of customized branding. MSPs interested in becoming FileCloud partners can click here.
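As a purely conceptual sketch (not FileCloud's actual implementation), the Python below illustrates the core idea of multi-tenancy with data separation: a single running instance in which every operation is scoped to a tenant, so one tenant can never address another tenant's data.

```python
class MultiTenantStore:
    """One instance serves many tenants; each tenant's data stays isolated."""

    def __init__(self):
        self._tenants = {}  # tenant_id -> that tenant's private file map

    def put(self, tenant_id: str, path: str, blob: bytes) -> None:
        # Every call is scoped by tenant_id, so writes never cross tenants.
        self._tenants.setdefault(tenant_id, {})[path] = blob

    def get(self, tenant_id: str, path: str) -> bytes:
        # Reads can only ever address the caller's own namespace.
        return self._tenants[tenant_id][path]

store = MultiTenantStore()
store.put("acme", "/reports/q1.pdf", b"acme data")
store.put("globex", "/reports/q1.pdf", b"globex data")  # same path, no clash
assert store.get("acme", "/reports/q1.pdf") == b"acme data"
```

Per-tenant branding works the same way: branding settings live inside each tenant's namespace rather than in instance-wide configuration.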

Customization & Branding – FileCloud can be customized extensively to reflect your brand. Some of the customizations include logos, labels, email templates, UI messages and terms of service. Novell Filr's customization options, by contrast, are very limited.

Amazon S3/OpenStack Support – Enterprises wanting to use Amazon S3 or OpenStack storage can easily set it up with FileCloud. This feature not only gives enterprises the flexibility to switch storage but also makes the switch very easy.

Conclusion

Based on our experience, enterprises looking for an EFSS solution want three main things. One, easy integration with their existing storage system without any disruption to access permissions or network home folders. Two, the ability to easily expand integration into highly available storage systems such as OpenStack or Amazon S3. Three, the ability to truly customize their self-hosted EFSS solution with their company branding.

Novell Filr neither provides OpenStack/Amazon S3 storage integration support nor extensive customization/branding capability. FileCloud, on the other hand, provides easy integration with Amazon S3/OpenStack and extensive customization/branding capabilities.

Here’s a comprehensive comparison that shows why FileCloud stands out as the best EFSS solution.

Try FileCloud For Free & Receive 5% Discount

Take a tour of FileCloud

Top Cloud Security Trends for Government Organizations


According to a report by the RAND Corporation, the cyber black market is steadily growing: hackers are now more collaborative than ever and consistently use sophisticated strategies to target and infiltrate data centers. In the past, attackers were driven by notoriety and malice, attacking data centers largely to prove their skills to their peers. That has gradually changed: hackers are now driven by warfare agendas and an ever-developing black market, where they sell valuable information to the highest bidders.

Their biggest prey, of course, is government data centers, which are particularly targeted by cyber armies with agendas against their respective target nations. In fact, governments now face more potentially damaging risks from cyber warfare than from conventional engagement: in cyber warfare, a single individual with just a computer can successfully launch an attack against major government cloud databases, cripple them, and cause significant socio-economic damage. One of the most notable recent attacks was directed at Iran's nuclear centrifuges, where the attackers used the "Stuxnet" worm to damage more than 20% of the installations. Under the cover of different agendas, an Iranian hacking group also recently went on a cyber-attacking spree dubbed "Operation Cleaver", which ultimately damaged essential government infrastructure in more than 16 countries.

According to experts, this is only the beginning. In a survey conducted by the Pew Research Center, 61% of the experts polled believed that a well-planned, large-scale cyber-attack would be successfully orchestrated before 2025, severely harming national security. With such threats looming, it is essential for governments to implement the most effective emerging security technologies in their clouds. Some of the current top trends include:

Improved Access Control

Many successful attacks sail through because of poor access controls at the targeted data centers. Although Sony is not a government organization, its recent troubles, which even drew government intervention, were caused largely by careless password and username practices. To prevent such attacks, government organizations are now opting for advanced authentication processes for access to their cloud resources. Beyond standard username-and-password checks and two-factor authentication, organizations are now implementing biometrics and secondary-device verification in their access-control architecture. The sketch below illustrates the time-based one-time codes behind many two-factor schemes.
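As an illustration of what the second factor typically involves, here is a minimal sketch of time-based one-time-password (TOTP, RFC 6238) generation in Python, the mechanism behind many authenticator apps. The secret below is a placeholder, and a production system would validate codes server-side with rate limiting and clock-drift tolerance.

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current one-time code from a shared secret (RFC 6238)."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(time.time()) // interval)  # 30s time step
    digest = hmac.new(key, counter, "sha1").digest()
    offset = digest[-1] & 0x0F                                  # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Placeholder secret shared between the server and the user's authenticator app.
SECRET = "JBSWY3DPEHPK3PXP"
print("Current one-time code:", totp(SECRET))
```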

Sophisticated Encryption

To make data useless to hackers who infiltrate data centers or intercept it in transit, government organizations have long encrypted their data. Unfortunately, this has proven ineffective against hackers who steal decryption keys or use sophisticated cryptanalysis to recover the data. To prevent future recurrences, government organizations are stepping up their data-at-rest and data-in-transit encryption systems.

Over the years, they have used encryption systems in which both the cloud servers and the endpoint users hold encryption keys. This is gradually changing thanks to automated encryption controls, which remove the user from the equation. Instead of distributing encryption keys to individual users, these systems use array-based encryption, which fragments data during storage and transmission. The meaningless fragments are transmitted individually and can only be defragmented into meaningful data when the server or endpoint device holds all of them, so hackers can only intercept meaningless fragments. A minimal sketch of this all-or-nothing idea follows.
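The article does not name the vendors' actual algorithms, so the sketch below uses a simple XOR-based splitting scheme purely to illustrate the all-or-nothing property: every fragment on its own is uniformly random noise, and the original data is recoverable only when all fragments are present.

```python
import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def fragment(data: bytes, n: int) -> list:
    """Split data into n fragments; any n-1 of them reveal nothing."""
    pads = [os.urandom(len(data)) for _ in range(n - 1)]  # random fragments
    final = reduce(xor_bytes, pads, data)                 # data XOR all pads
    return pads + [final]

def defragment(fragments: list) -> bytes:
    """Only the complete set of fragments XORs back to the original data."""
    return reduce(xor_bytes, fragments)

shards = fragment(b"classified payload", 4)
assert defragment(shards) == b"classified payload"      # full set recovers data
assert defragment(shards[:3]) != b"classified payload"  # partial set is noise
```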

Digital Risk Officers

According to the Gartner Security and Risk Management Summit of 2014, 2015 will see a proliferation of digital risk officers (DROs). In addition to technology officers, enterprises and government organizations will now hire digital risk officers to critically assess potential risks and strategize on cloud and data security.

This has been necessitated by the continued expansion of the government's digital footprint, as its organizations increasingly integrate their systems with employee BYOD to improve service delivery. As networks and infrastructure grow, so do the risks, which now require dedicated experts to keep them from developing into successful attacks. With the trend only picking up in 2015, Gartner predicts it will grow rapidly over the next few years as organizations' networks expand; by 2017, DRO adoption among government organizations is expected to reach 33%.

Multi-Layered Security Framework

Since cloud systems are composed of various elements facing different threats, government organizations are protecting their data with tiered, multi-layered security frameworks. To gain access to any of the systems, a hacker first has to get through a sophisticated security model composed of several detection, resistance, defense and tracking layers.

In addition to network firewalls, governments are deploying virus detection and anti-spyware on their servers and storage systems to comprehensively protect server operating systems, endpoint devices, file systems, databases and applications.

Big Data Security Analytics

“Without big data analytics, companies are blind and deaf, wandering out onto the web like deer on a freeway,” said Geoffrey Moore, author of Crossing the Chasm, emphasizing the need to implement big data analytics across all the relevant departments of an organization, especially on the web.

Government organizations are adopting this principle by embedding big data security analytics in their security frameworks. This allows them to continuously monitor data movement and exchange, and to spot vulnerabilities that hackers and malware could capitalize on. Additionally, the data generated is comprehensively analyzed to gather intelligence on internal and external threats, data-exchange patterns, and deviations from normal data handling, as the toy sketch below illustrates. Due to its efficacy in analyzing and blocking potential threats, Gartner predicts that 40% of organizations (both government and non-government) will establish such systems in the next five years.
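As a toy illustration of detecting "deviations from normal data handling", the sketch below flags days whose outbound transfer volume strays too far from the baseline. A simple z-score stands in for the far richer statistical and machine-learning models that real security-analytics platforms use.

```python
from statistics import mean, stdev

def flag_anomalies(daily_volumes_mb, threshold=2.0):
    """Return indices of days deviating more than `threshold` sigmas from baseline."""
    mu = mean(daily_volumes_mb)
    sigma = stdev(daily_volumes_mb)
    if sigma == 0:
        return []  # no variation, nothing to flag
    return [day for day, volume in enumerate(daily_volumes_mb)
            if abs(volume - mu) / sigma > threshold]

volumes = [120, 131, 118, 125, 122, 940, 127]  # day 5 looks like exfiltration
print(flag_anomalies(volumes))  # -> [5]
```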

Although no strategy can be regarded as absolute or perfect, these trends are expected to streamline the cloud sector and offer government organizations greater security than in previous years. Over time, this should significantly reduce the number of successful attacks on government cloud resources.

 Author: Davis Porter
Image Courtesy: Stuart Miles, Freedigitalphotos.net