Archive for the ‘Cloud Computing’ Category

Top 10 Cloud Security Threats In 2018 And How To Avert Them


2017 saw a plague of cyber-attacks: from ransomware shutting down hospitals in Europe, to the Equifax data breach, to malware targeting established brands like FedEx. By mid-year alone, the number of attacks in the U.S. had risen by 29% compared to the same period in the previous year. According to the Identity Theft Resource Center, the organization tracking them, more attacks were expected, at a growth rate of 37% per year.

Sadly, they were right. As a matter of fact, their prediction turned out to be an underestimation. By the end of the year, they had recorded a drastic upturn: a 44.7% growth rate compared to 2016, undoubtedly an all-time high.

If you assume those must have been the hardest 12 months for cybersecurity, wait until we are done with 2018. According to the Information Security Forum (ISF), the data security organization that predicted the increase in data breaches in 2017, 2018 will be another painfully dire year. The number and impact of security attacks are expected to rise again over the coming months.

The year is also expected to be very thrilling for cloud computing, as more enterprises continue expanding their computing frameworks to the cloud. As a result, the volume of sensitive data in cloud servers is expected to expand at an exponential rate. And that translates to increased vulnerabilities and targets for cyber attackers.

But contrary to popular belief, the methods and scale of attack will not be changing drastically any time soon. IT professionals are already aware of 99% of the vulnerabilities that will continue to be exploited through 2020.

So to help you tighten your defenses in the cloud, here are the top 10 threats we expect through 2018.


  1. Data Leak

The average cost of a data breach, going by figures published by the Ponemon Institute, currently stands at $3.62 million. Hackers continue to target cloud servers they think hold valuable information. And unfortunately, many of them might turn out to be lucky thanks to vulnerabilities as simple as private data shared on public domains.

In addition to defining and implementing strict data policies, organizations should invest in data security tech like firewalls plus network management solutions. Most importantly, they should only leverage proven cloud solutions with state-of-the-art security features.
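As a minimal illustration of that kind of policy enforcement, the sketch below scans a list of share records and flags sensitive files exposed through public links. The `shares` structure and its fields are hypothetical stand-ins for whatever your cloud platform's admin API actually returns:

```python
# Hypothetical sketch: flag share records that expose sensitive files publicly.
# The `shares` record layout is an assumption for illustration, not a real API.

def find_public_shares(shares):
    """Return shares that are both publicly visible and marked sensitive."""
    return [s for s in shares if s.get("visibility") == "public" and s.get("sensitive")]

shares = [
    {"path": "/finance/q4-report.xlsx", "visibility": "public", "sensitive": True},
    {"path": "/marketing/logo.png", "visibility": "public", "sensitive": False},
    {"path": "/hr/salaries.csv", "visibility": "internal", "sensitive": True},
]

for s in find_public_shares(shares):
    print("WARNING: sensitive file shared publicly:", s["path"])
```

A real audit job would pull the share list from the provider's API on a schedule and feed the warnings into an alerting pipeline.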


  2. Data Loss

A data leak might be unfortunate, but not as much as data loss. While the former mostly occurs when your cloud server is successfully infiltrated, the latter is mostly caused by natural and man-made disasters. Just when you think you have all your enterprise data intact, it can vanish completely after physical destruction of the servers.

It’s difficult to predict natural disasters. So, to avoid going out of business due to data loss, implement a multi-layered backup system that runs consistently in real time.
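One layer of such a system can be as simple as copying each artifact to several independent destinations and verifying every copy by checksum. The sketch below uses only the Python standard library, with temporary directories standing in for local and offsite storage layers:

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup(source: Path, targets) -> bool:
    """Copy `source` into every target directory and verify each copy by checksum."""
    expected = sha256(source)
    for target in targets:
        target.mkdir(parents=True, exist_ok=True)
        copy = target / source.name
        shutil.copy2(source, copy)
        if sha256(copy) != expected:  # a corrupted copy means the layer failed
            return False
    return True

# Demo: temporary directories stand in for independent backup layers.
root = Path(tempfile.mkdtemp())
src = root / "db-dump.sql"
src.write_text("INSERT INTO accounts VALUES (1);")
ok = backup(src, [root / "layer1", root / "layer2"])
```

In production, the targets would be genuinely independent media (a second region, an object store, tape), and the job would run continuously rather than once.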


  3. Insider Attacks

Netwrix conducted an IT Risks Survey and established that many enterprises still have difficulty gaining comprehensive visibility into their IT systems. They consequently remain vulnerable to data security threats emanating from both authorized and unauthorized users. Such an attack could be particularly detrimental, since users can easily access even the most sensitive information.

Organizations should, therefore, implement strict user policies plus effective administrative measures to track and maintain visibility into all user activities.
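At the application level, visibility starts with an audit trail. The hedged Python sketch below shows one common pattern: a decorator that records who performed which action before the action runs. The in-memory `audit_log` list is a stand-in for a real append-only store:

```python
import functools
import logging

logging.basicConfig(level=logging.INFO)
audit_log = []  # in practice this would be an append-only, tamper-evident store

def audited(action):
    """Decorator that records which user performed which action, for later review."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(user, *args, **kwargs):
            audit_log.append({"user": user, "action": action})
            logging.info("%s performed %s", user, action)
            return fn(user, *args, **kwargs)
        return inner
    return wrap

@audited("download_file")
def download_file(user, path):
    # Hypothetical file operation; only the audit pattern matters here.
    return f"{user} downloaded {path}"

download_file("alice", "/finance/q4.xlsx")
```

Because the decorator runs before the wrapped function, even a failed or aborted action leaves a trace, which is usually what reviewers want.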


  4. Crime-as-a-Service

Cybercrime has developed to a level that malicious individuals can now hire hackers to target organizations. The ISF predicts an escalation of this in 2018, as hackers continue to access infiltration tools through the web, and criminal organizations develop complex hierarchies.

Since this mostly targets intellectual property and trade secrets, enterprises should encrypt data both at rest and during transmission.
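For the in-transit half of that advice, most languages make strict TLS a few lines of configuration. The Python sketch below builds a client-side TLS context with certificate validation, hostname checking, and legacy protocol versions disabled; encryption at rest would typically be handled by a dedicated library or by the storage service itself:

```python
import ssl

# A client-side TLS context with safe defaults: certificate validation and
# hostname checking are on, and protocol versions older than TLS 1.2 refused.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Wrapping a socket with this context would fail loudly on an invalid
# certificate instead of silently transmitting data an eavesdropper
# could intercept.
```

The point of failing loudly is that a misconfigured endpoint surfaces as an error during development rather than as a silent plaintext channel in production.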


  5. Human Error

The human factor continues to be the weakest element in cloud security. Your organization’s cloud users might, for instance, mistakenly share that extremely sensitive information you’ve been trying to secure from hackers. Unfortunately, this risk multiplies with every user added to the network.

In addition to strict user privilege management, organizations should invest in IT training to educate employees about cloud use, potential threats, and data handling.


  6. AI Weaponization

Researchers and information security professionals have been leveraging neural networks, machine-learning techniques, and other artificial intelligence tools to assess attacks and develop corresponding data security models. The downside is that hackers will also use the same tools to analyze cloud vulnerabilities and launch systematic attacks.

Since this threat is increasingly dynamic, it requires an equally multilayered system of data security strategies to prevent attacks from multiple vantage points.


  7. IoT Challenge

Enterprises are increasingly capitalizing on the cloud to facilitate remote file sharing and access. But this introduces the threat of BYOD devices, which can serve as entry points for malware.

CIOs should, therefore, prioritize not only server security but also device security. All devices allowed to access enterprise networks should be thoroughly scanned and adequately tracked.


  8. Account Hijacking

If perpetrators figure out user credentials, they could easily gain access to the corresponding cloud account, hijack it, then manipulate data, eavesdrop on ongoing activities, and tamper with business processes.

In addition to closely protecting user credentials, accounts should come with multi-factor authentication and the ability to regain control in the event of a hijack.
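Time-based one-time passwords (TOTP, RFC 6238) are the most common second factor. As an illustration of how little machinery is involved, the sketch below implements HOTP/TOTP with only the Python standard library; a production system should of course use a maintained library rather than hand-rolled code:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret, counter, digits=6):
    """HOTP (RFC 4226): HMAC-SHA1 of the counter, dynamically truncated."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret, for_time=None, step=30, digits=6):
    """TOTP (RFC 6238): HOTP keyed to the current 30-second time window."""
    t = time.time() if for_time is None else for_time
    return hotp(secret, int(t // step), digits)

# RFC 6238 test vector: ASCII secret, timestamp 59s falls in time window 1.
code = totp(b"12345678901234567890", for_time=59, digits=8)
```

With the RFC 6238 test secret `12345678901234567890` and a timestamp of 59 seconds, this reproduces the published 8-digit test value 94287082.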


  9. Denial Of Service

By forcing cloud services to consume excessive system resources like network bandwidth, disk space, or processor time, attackers continue to lock legitimate users out of server access.

An adequately updated antivirus and intrusion detection system should be able to pick up such an attempt, while a firewall can block off the subsequent malicious traffic.
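Before traffic ever reaches the antivirus or firewall layer, a rate limiter can absorb much of this abuse. The sketch below shows a classic token-bucket limiter in plain Python; the rate and capacity values are illustrative only:

```python
import time

class TokenBucket:
    """Per-client rate limiter: allow `rate` requests per second on average,
    with bursts of up to `capacity` requests."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at bucket capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # client is over its budget: drop or throttle the request

bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(15)]  # a burst of 15 rapid requests
```

With a burst of 15 near-instant requests, the first 10 pass (the bucket's capacity) and the rest are rejected until tokens refill.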


  10. Insecure APIs

Cloud services continue to provide access to third-party software and APIs, which facilitate collaboration and improve service delivery. But some of these APIs come with vulnerabilities that hackers can exploit to access the primary data.

This requires CIOs to comprehensively review and vet all third-party services before proceeding with subscriptions.


All factors considered, none of these prevention measures will be effective on a cloud service that’s poorly secured. So get in touch with us today to learn more about the world’s most secure Enterprise File Sharing Solution.



Author: Davis Porter

GDPR Presents Opportunities for MSPs

In today’s digital world, the issue of data privacy is provoking constant debates, with large corporations and even governments being criticized for invasions of privacy. According to online statistics firm Statista, only about a third of internet users in the United States are concerned about how their personal data is shared. However, that number is likely to rise as privacy compliance becomes a ubiquitous business concern, due to the growing number of regulations formulated to curb the unauthorized access and use of personally identifiable information. One such piece of legislation stands apart: no other measures up to the inherent global impact of the EU’s General Data Protection Regulation (GDPR).

Gartner’s prediction that more than half of the companies governed by the GDPR will not be fully compliant by the end of 2018 has come to fruition. With less than a month to go, a survey of 400 companies conducted by CompTIA found that 52 percent were still assessing how the GDPR applies to their business. The research also showed that only 13 percent were confident of being fully compliant. The GDPR will without a doubt be a disruptive force in the global marketplace that cannot be ignored. This presents prodigious business opportunities for MSPs to leverage their experience in network security offerings and analytics solutions, along with their own experience implementing strategies around this new development.

1. An Opportunity to Become GDPR Compliant

As an MSP, it makes sense to protect your business from any reputational and financial consequences by becoming GDPR compliant. It is said that charity starts at home; it would therefore be incongruous for an MSP that is yet to achieve full GDPR compliance to offer guidance on the same. The experience you gain on your journey to compliance will be of great value to both current and potential customers.

2. An Opportunity to Engage and Educate Your Clients

Most non-European businesses are yet to establish whether the GDPR will apply to them. And for those that are aware, their MSP will likely be the first place they turn to for help, whether it’s to set up reporting tools, work on data encryption, conduct audits, or implement new data management practices. MSPs should ensure that their clients fully understand the extent and impact of the regulations, and prepare them for the GDPR. Since they are already aware of their clients’ internal practices and processes, managed service providers are better suited to architect solutions that incorporate GDPR compliance and governance.

MSPs will have to re-onboard clients to make sure their prescribed SaaS offering will meet GDPR requirements. Gather resources and links that can help educate your clients. The use of informative marketing campaigns, or a resource center on your site will help create channels for dialogue – which may subsequently lead to new business projects.

3. An Opportunity to Understand Your Clients’ Data

Data is a crucial asset; however, most MSPs know very little about the data their clients possess. The only way an MSP can offer guidance and services related to the GDPR is by understanding what data its clients have and where that data resides. MSPs should be ready to go the extra mile, beyond protecting business applications, to protecting personal data. The only way to accomplish this is by analyzing your clients’ existing data. Through this process, you will be able to identify any security gaps and create customized security offerings to fill them. Additionally, this data discovery will allow you to adjust your pricing accordingly, push your customers towards more secure technologies, or sell additional services that mitigate the risks their current business systems present.
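Data discovery usually starts with simple pattern scans. The hedged sketch below looks only for email addresses; real discovery tooling covers many more identifier types (names, phone numbers, national IDs) and scans whole file stores rather than single strings:

```python
import re

# Minimal data-discovery sketch: scan text for one common class of personal
# data (email addresses). A real tool would match many more patterns and
# walk entire file shares.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def find_pii(text):
    """Return the email-address-like strings found in `text`."""
    return EMAIL.findall(text)

sample = "Contact jane.doe@example.com or +1-555-0100 for the invoice."
hits = find_pii(sample)
```

Even this crude pass is enough to locate files that obviously hold personal data and therefore need GDPR-grade handling.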

4. An Opportunity to Offer Compliance and Security Related Services

MSPs tend to act as virtual CIOs for their customers. In most cases, the line between packaged service and free consultation tends to get blurred somewhere along the line. GDPR guidance could easily follow the same track, unless the value you offer is presented as a bundle that can be allotted a price tag. Compliance and security services are a potential gold mine for service providers who have acquired the management expertise to satisfy and simplify the complexities associated with the General Data Protection Regulation. Since having a designated Data Protection Officer (DPO) is a mandatory requirement under the GDPR regardless of the size of the company, MSPs can use that as an opportunity to establish a DPO-as-a-service model geared towards SMEs that may lack the resources to recruit costly in-house compliance staff.

5. An Opportunity to Expose Your Brand

Marketing a compliance culture with transparency builds greater relevance and trust among current and potential customers. Companies looking to achieve full GDPR compliance are likely to align themselves with a service provider that has a demonstrated track record. Publicly documenting your GDPR compliance milestones on blogs, social media and your website confirms your familiarity with the subject. Once achieved, full GDPR compliance will act as a quality standard that can be placed on marketing channels to attract and reassure prospective clients.

In Closing

As the weight of the General Data Protection Regulation continues to be felt around the globe, sagacious MSPs will have an opportunity to help their customers prepare, and to gain incremental revenue while supporting the European Union’s effort to create a digitally secure global marketplace. Despite the current rush to beat the May 25th deadline, compliance isn’t a one-off activity. Companies will always have a budget for comprehensive strategies aimed at achieving and maintaining privacy compliance.

Image courtesy of freepik



Author: Gabriel Lando

10 Data Storage and Backup Trends 2018



Only a handful of things can match data storage when it comes to making ripples in the tech industry. With every new gadget comes improved data storage. Storage essentially drives information processing, because processors can only work on data held in their respective storage repositories.

Increased data storage has increasingly triggered new developments in data handling and management policies. In the past, being able to remotely access, sync and manipulate data was only a concept. Then came cloud technology and it revolutionized the whole business landscape. It has since grown from barely a boardroom suggestion in 2009 to a critical resource for the bulk of enterprises.

Currently, business and IT executives are shifting from perceiving cloud storage as just a tool; they are now leveraging it to achieve organizational goals. And to effectually facilitate this, service providers have not only diversified but also holistically integrated PaaS and SaaS with IaaS.

These advancements in cloud tech have directly and indirectly influenced growth in other data storage technologies, as the volume and complexity of data increases at a steady rate. 2018 is expected to be quite fascinating since all industries are now beneficiaries of the resultant data storage trends.

Organizations are already warming up to this, as close to half of them will be increasing their IT budgets for the next 12 months.

So, what data storage and backup trends should we expect in 2018?

  1. Multi-cloud Storage

In 2016, a study by VMTurbo revealed that 57% of enterprises were yet to deploy a multi-cloud strategy. Their databases were essentially single-faceted, with organizations that had already migrated to the cloud leveraging either public or private cloud services.


Overall perception has substantially shifted since then, and organizations are now capitalizing on private cloud data storage for sensitive data while keeping some of their data in public cloud servers. 2018 is expected to experience a proliferation of this hybrid approach, as organizations also continue leveraging SaaS, PaaS and DraaS. This will see 70% of enterprises come on board by 2019.


  2. Software Defined Storage

Service providers are already integrating software-defined storage features in their data storage solutions, and we expect to see this trend picking up further through 2018.


SDS, as it’s popularly known, bridges the gap between current storage needs and legacy infrastructure. This fact alone, according to IDC, will see the market continue to grow at a rate of 13.5% from 2017, through 2018, all the way to 2021. This translates to an approximate market value of $16.2 billion by the time you are crossing over to 2022.



  3. Artificial Intelligence

Instead of implementing one-size-fits-all solutions, service providers are now using a more strategic approach to cater to varying data storage needs. System administrators will continue leveraging the power of artificial intelligence to align data to database capabilities, assess metadata across the organization’s storage infrastructure, and refine management policies. With time, this will lead to optimal performance and resource savings, thanks to on-demand infrastructure use.


  4. Flash

Flash storage is another revolutionary technology that has drastically changed data storage. Its ripple effects will continue in 2018 as SSD technology advances. The industry will see the production of even smaller flash drives with much larger storage capacities. This will boost storage efficiency, speed, energy savings, and overall performance.


The tech behind flash storage will also facilitate policy-based provisioning, storage automation, and integrated data protection.


  5. Cloud Spending

Increased adoption of the cloud means organizations will dig deeper into their pockets to acquire additional data storage resources. Typical enterprise cloud spending is currently growing at an average rate of 16%, four and a half times the rate of growth in overall IT spending witnessed since 2009. The next 12 months will see a bump, as cloud spending accelerates to about six times the rate of IT spending growth, a pace expected to hold through 2020.


Compared to other resources, the cloud will take up about half the IT budget in a typical organization. The result is a market that will have grown to $390 billion by 2020.


  6. Data Intelligence

For a long time, data storage was only that: holding data in repositories that displayed nothing more than file size and location. Enterprises were basically in the dark about what data they held and how it was being used.


2018 marks a revolutionary stage in data storage through the increased adoption of metadata management solutions. System administrators can now leverage data intelligence to track their files, then subsequently view how, where, and when data was accessed or modified. This is proving especially important in achieving full data control and equitable resource distribution.
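The raw material for that kind of intelligence is filesystem metadata. As a minimal sketch, the Python snippet below collects the size and access/modification timestamps that a metadata management layer would aggregate and index across a whole storage estate:

```python
import os
import tempfile
from datetime import datetime, timezone

def describe(path):
    """Collect basic file metadata that a data-intelligence layer builds on."""
    st = os.stat(path)
    return {
        "path": path,
        "size_bytes": st.st_size,
        "modified": datetime.fromtimestamp(st.st_mtime, tz=timezone.utc).isoformat(),
        "accessed": datetime.fromtimestamp(st.st_atime, tz=timezone.utc).isoformat(),
    }

# Demo on a temporary file standing in for a stored data asset.
with tempfile.NamedTemporaryFile(delete=False, suffix=".csv") as f:
    f.write(b"id,amount\n1,100\n")
info = describe(f.name)
```

Real metadata platforms add who accessed the file and from where, which is exactly the audit dimension plain filesystems lack.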


  7. Data Legislation

As service providers grow, the data footprint is expanding to multiple locations worldwide with varying compliance levels and legislation. So far, we’ve seen the development of new laws like the European General Data Protection Regulation, and NIST SP 800-171 for American defense contractors.


Due to such trends, it’s becoming critically important for enterprises to understand exactly where their data is stored. This helps them establish the type of data to store, and how to manage it to avoid collisions with the law.


  8. NVMe

While flash will continue influencing data storage, the real impact will come from new developments triggered by the flash revolution. One of the most prominent in 2018 is Non-Volatile Memory Express, commonly referred to as NVMe. It will continue being adopted as an alternative to SCSI-based interfaces in order to capitalize on improved solid-state technologies.



  9. Hyperconverged Infrastructures (HCI)

As a driver for software-defined storage, the Internet of Things will keep featuring prominently over the next 12 months. 48% of enterprises are already leveraging it, and 43% are planning to join the bandwagon in 2018.


If combined with data compiled from other systems, IoT could overwhelm data storage. So, to solve this problem, organizations are taking advantage of HCI to consolidate all their data in one place.


  10. Cloud Storage Capacity

Data centers will keep expanding in 2018 as service providers improve their features and overall cloud storage capacities. The subsequent data footprint will be further boosted with the emergence of new vendors eager to get a piece of the lucrative data storage industry. Consequently, competition is expected to increase, and place additional pressure on the industry players to further regulate the prices of their provisions.




Author: Davis Porter

Hybrid Cloud Risks That IT Managers Can’t Take Their Eyes Off

It’s been more than a decade since cloud computing brought businesses close to the idea of affordable computing, storage, and application resources. Soon enough, the cloud universe underwent a bifurcation, driven by the public and private cloud. As companies began to understand the pros and cons of both approaches to cloud computing, another term became the buzzword: the hybrid cloud, an arrangement where enterprises strike a good balance between public and private cloud.




Risks Associated With Hybrid Cloud

Whereas the hybrid cloud approach helps companies achieve the perfect balance of application availability, security, and the resultant employee productivity, it also presents certain management challenges. Most of these challenges/risks are directly or indirectly related to the KPIs of cloud-based applications, from a productivity, accessibility, and security standpoint. Here’s a guide to help you understand and manage these risks.


Too Many Decision Makers

When an enterprise makes a conscious choice to go for a hybrid arrangement of public, private, and on-premise systems, one of the biggest risks is having too many decision makers influencing the choice of cloud tools. Chaos ensues particularly when high-ranking end users and business team leaders push for certain cloud tools without letting IT analyze them properly.

When an enterprise is stuck with many cloud solutions that don’t exactly integrate with each other, there’s hardly a single team or person responsible for the hodgepodge that exists in the name of hybrid cloud. The performance management and overall coordination of such a cloud ecosystem often become too difficult to manage for the enterprise.




Underestimating the Data Stewardship Responsibilities

Here’s a fact – the level of control over data governance, security, and privacy that an enterprise’s IT team can exercise with an on-premise solution can’t be matched by a cloud-based solution. While working out the right mix of on-premise, public cloud, and private cloud solutions, enterprises must not lose sight of the level of control they need over their data. Even the leading cloud services vendors don’t take 100% responsibility for your enterprise data; there’s a lot that you need to be accountable for. And unless your hybrid cloud setup addresses this reality, there’s trouble brewing close by.


Improper Choices of Cloud Management Tool

To manage everything about your hybrid cloud infrastructure, you’ll need a sophisticated cloud management solution. Now, in a hybrid cloud environment, there is a lot of communication between public cloud and private cloud infrastructures. So, the tool your company purchases must be able to manage this communication while managing the security considerations alongside.

There are other issues too. If the public and private cloud vendors are different, the complexity increases. The solution – you either use your in-house IT or a vendor to do the interfacing or choose a tool that inherently supports a wide range of APIs from different cloud service vendors. For obvious reasons, a wrong choice of cloud management tool could mean a lot of problems in the long run. To manage this, some companies even go for cloud management tools for public and private cloud, from the same vendor. However, such a lock-in is inherently risky.



Mismanagement of Identity Management Solutions

Identity management tools are crucial components of enterprise IT security. When a company transitions to a hybrid cloud, the identity management solution invariably has to be extended from the private cloud to the public cloud components.

Because of this, there are some critical questions to be addressed related to identity management.

Does the company choose different identity management tools for its private and public cloud components?

Does the company keep the same identity management tool?

If so, what are the risks of the public cloud vendor’s employees being able to use usernames and passwords to access information from the private component?

However, this is more of a risk assessment issue than anything else. As long as an enterprise remains conscious about this choice, the risks are manageable.


Lack of Understanding of Trust Requirements

All applications used in your business, along with all the peripheral tools used, have their corresponding trust requirements. These trust requirements are governed by the legal, regulatory, and contractual agreements your company has with your clients.

Enterprises can conveniently meet all these trust requirements by using private cloud solutions, wherein they have sufficient control over the nitty gritty of the technology. For applications that don’t require complex trust requirement compliance, it’s normal enough for companies to manage things via public cloud solutions.

However, too many companies fail to estimate the current and future trust requirements of applications, and bear the brunt later on, when these requirements become obvious. By correctly mapping applications to the right cloud computing methodology, enterprises can prevent expensive and embarrassing trust requirement compliance issues. It also helps them identify the applications that need sophisticated access control and authentication management. Hence, this becomes a matter of avoiding security breaches, as well as avoiding unnecessary recurring costs for enterprises.


Inadequate Diligence in Vendors’ Disaster Recovery Practices

In a hybrid cloud ecosystem, there are different vendors and their different databases in play. Questions to be asked:

Is there complete failover between the data centers?

Do all vendors own the different data centers involved?

Among all the different disaster recovery and failover SLAs you establish with vendors, are they logically in sync with each other?

Note: After all the research and diligence, your company needs to be reasonably convinced that business continuity will be ensured in the case of a service disruption.


Concluding Remarks

‘Best of all worlds’ is what every enterprise wants. From a cloud computing perspective, this translates into what’s popularly called a hybrid cloud. Indeed, from a control and cost perspective, it offers the best of all worlds. However, this often also means the coming together of individual infrastructure risks, as well as the ones caused by their integration. Some of these risks are covered in this guide; keep them in mind while planning the hybrid transformation for your enterprise.




Author: Rahul Sharma

Adopting Privacy by Design to Meet GDPR Compliance

The proliferation of social networking and collaboration tools has ushered in a new era of the remote enterprise workforce; however, these tools have also made organizational boundaries non-static, making it increasingly difficult to safeguard the confidential and personal data of business partners, employees, and customers. In these politically uncertain times, defending privacy is paramount to the success of every enterprise. The threats and risks to data are no longer theoretical; they are apparent and menacing. Tech decision makers have to step in front of the problem and respond to the challenge. Adopting the privacy by design framework is a surefire way of protecting all users from attacks on their privacy and safety.

The bedrock of privacy by design (PbD) is the anticipation, management, and prevention of privacy issues across the entire life cycle of a process or system. According to the PbD philosophy, the ideal way to mitigate privacy risks is not to create them in the first place. Its architect, Dr. Ann Cavoukian, devised the framework to deal with the rampant issue of developers applying privacy fixes only after the completion of a project. The privacy by design framework has been around since the 1990s, but it is yet to become mainstream. That will soon change. The EU’s data protection overhaul, the GDPR, which comes into effect in May 2018, demands privacy by design as well as data protection by default across all applications and uses. This means that any organization that serves EU residents has to adhere to the newly set data protection standards, regardless of whether the organization itself is located within the European Union. The GDPR has made a risk-based approach to pinpointing digital vulnerabilities and eliminating privacy gaps a requirement.

Privacy by Default

Article 25 of the General Data Protection Regulation systematizes both the concept of privacy by design and that of privacy by default. Under the ‘privacy by design’ requirement, organizations will have to set up compliant procedures and policies as fundamental components in the design and maintenance of information systems and modes of operation. In practice, privacy by design measures may include pseudonymization or other technologies capable of enhancing privacy.

Article 25 states that a data controller has to implement suitable technical and organizational measures, both at the time the means of processing are determined and at the time the data is actually processed, in order to guarantee that data protection principles like data minimization are met.
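Pseudonymization, one of the measures Article 25 explicitly contemplates, can be sketched as replacing a direct identifier with a keyed hash. The snippet below is illustrative only: the key value is a placeholder, and a real deployment would keep the key in a vault, stored separately from the pseudonymized data.

```python
import hashlib
import hmac

# Placeholder key for illustration; in production it lives in a secrets
# manager, separate from the pseudonymized records, and is rotated.
SECRET_KEY = b"rotate-and-store-me-in-a-vault"

def pseudonymize(identifier):
    """Replace a direct identifier with a keyed (HMAC-SHA256) hash.
    Unlike a plain hash, the secret key defeats simple dictionary attacks."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

record = {"email": "jane.doe@example.com", "purchase": "subscription"}
safe_record = {"email": pseudonymize(record["email"]), "purchase": record["purchase"]}
```

Because the mapping is deterministic for a given key, analytics can still join records on the pseudonym without ever seeing the underlying identifier.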

Simply put, privacy by default means that the strictest privacy settings should apply by default the moment a service is released to the public, without requiring any manual input from the user. Additionally, any personal data provided by the user to facilitate the optimal use of a product must only be kept for the amount of time needed to offer said service or product. The example commonly given is the creation of a social media profile: the default settings should be the most privacy-friendly. Details such as name and email address would be considered essential information, but not age or location; additionally, all profiles should be set to private by default.
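The social media example translates directly into code: the most restrictive values are the defaults, so an account the user never configures is still protected. A hedged sketch, with field names invented for illustration:

```python
from dataclasses import dataclass

# Privacy-by-default sketch: the restrictive settings ARE the defaults,
# so a user who never opens the settings screen is still protected.
@dataclass
class ProfileSettings:
    visibility: str = "private"          # not "public"
    share_location: bool = False
    share_age: bool = False
    searchable_by_email: bool = False
    essential_fields: tuple = ("name", "email")  # collected; nothing more

profile = ProfileSettings()  # a brand-new account, untouched by the user
```

Any loosening of these settings then has to be an explicit, auditable act by the user rather than something the service assumes.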

Privacy Impact Assessment (PIA)

Privacy Impact Assessments are an intrinsic part of the privacy by design approach. A PIA highlights what personally identifiable information is collected, and further explains how that data is maintained, how it will be shared, and how it will be protected. Organizations should conduct a PIA to assess legislative authority and to pinpoint and mitigate privacy risks before sharing any personal information. Not only will the PIA aid in the design of more efficient and effective processes for handling personal data, it can also reduce the associated costs and reputational damage that could potentially accompany a breach of data protection regulations and laws.

The ideal time to complete a Privacy Impact Assessment is at the design stage of a new process or system; it should then be revisited as legal obligations and program requirements change. Under Article 35 of the GDPR, data protection impact assessments (DPIAs) are mandatory for companies with processes and technologies that are likely to result in a high risk to the privacy rights of end users.

The Seven Foundational Principles of Privacy by Design

The main objective of privacy by design is to ensure privacy and control over personal data. Organizations can gain a competitive advantage by practicing the seven foundational principles. These principles can be applied to all the varying types of personal data; the rigor of the privacy measures typically corresponds to the sensitivity of the data.

I. Proactive not Reactive; Preventative not Remedial – Be prepared for, pinpoint, and avert privacy issues before they occur. Privacy risks should never materialize on your watch, get ahead of invasive events before the fact, not afterward.
II. Privacy as the default setting – The end user should never take any additional action to secure their privacy. Personal data is automatically protected in all business practices or IT systems right off the bat.
III. Privacy embedded into design – Privacy is not an after thought, it should instead be part and parcel of the design as a core function of the process or system.
IV. Full functionality (positive-sum, not zero sum) – PbD eliminates the need to make trade-offs, and instead seeks to meet the needs of all legitimate objectives and interests in a positive-sum manner; circumventing all dichotomies.
V. End-to-end lifecycle protection – Adequate data minimization, retention, and deletion processes should be fully integrated into the process or system before any personal data is collected.
VI. Transparency and visibility – Regardless of the technology or business practice involved, the stated privacy standards must be visible, transparent, and open to providers and users alike; they should also be documented and independently verifiable.
VII. Keep it user-centric – Respect the privacy of your users and customers by offering granular privacy options, strong privacy defaults, timely and detailed information notices, and empowering, user-friendly choices.
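Principle II is the easiest to make concrete in code. A minimal sketch, assuming hypothetical setting names: every field defaults to its most protective value, so a user who takes no action at all is still protected.

```python
from dataclasses import dataclass

@dataclass
class UserPrivacySettings:
    """Privacy-as-the-default-setting sketch: the most protective value is
    always the default, so sharing is strictly opt-in (field names invented)."""
    profile_public: bool = False      # opt in to visibility, never opt out
    share_analytics: bool = False
    marketing_emails: bool = False
    data_retention_days: int = 30     # minimal retention by default

# A brand-new user who configures nothing is protected out of the box.
settings = UserPrivacySettings()
print(settings.profile_public)   # False
print(settings.share_analytics)  # False
```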

In Closing

The General Data Protection Regulation makes privacy by design and privacy by default legal requirements in the European Union. So if you do business in the EU or process any personal data belonging to EU residents, you will have to implement internal processes and procedures to address the set privacy requirements. The vast majority of organizations already prioritize security as part of their processes; however, becoming fully compliant with the privacy by design and privacy by default requirements may demand additional steps. This means implementing a privacy impact assessment template that can be populated every time a new system is procured, implemented, or designed. Organizations should also revisit their data collection forms to make sure that only essential data is being collected. Lastly, it would be prudent to set up automated deletion processes for specific data, implementing technical measures to guarantee that personal data is flagged for deletion once it is no longer required. FileCloud checks all the boxes when it comes to the seven principles of privacy by design and offers granular features that will set you on the path to full GDPR compliance. Click here for more information.
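The automated deletion process mentioned above can be sketched in a few lines. This is illustrative only; the record layout and retention periods are invented for the example, and a production system would delete or anonymize the flagged records through its actual data store.

```python
from datetime import datetime, timedelta

# Hypothetical inventory: (record_id, collected_at, retention_days)
records = [
    ("r1", datetime(2018, 1, 1), 90),
    ("r2", datetime(2018, 6, 1), 365),
]

def flag_for_deletion(records, now):
    """Return the ids of records whose retention window has elapsed,
    implementing the 'delete when no longer required' measure."""
    return [rid for rid, collected, days in records
            if now - collected > timedelta(days=days)]

# r1 was collected on 2018-01-01 with a 90-day retention, so by July it is due.
print(flag_for_deletion(records, datetime(2018, 7, 1)))  # ['r1']
```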

Author: Gabriel Lando


Technical Data Under ITAR


The International Traffic in Arms Regulations (ITAR) are controls established by the U.S. State Department to regulate the export and temporary import of defense articles. While most defense contractors comprehend ITAR's implications for physical objects, its application to data remains unclear to many. The first step to properly identifying technical data and how it is controlled for export purposes is a concise understanding of what technical data is and what it encompasses.

Technical data refers to the specific information required for the development, production, and subsequent use of defense articles.

  • Development – includes all information that is created or gathered before production, such as layouts, pilot production schemes, testing and assembly prototypes, design research, integration design, configuration design, design concepts, design analysis, and other forms of design data.
  • Production – comprises all information generated or gathered during the production stages, such as engineering, manufacture, assembly, integration, testing, inspection, and quality assurance data.
  • Use – encompasses any information that relates to the installation, operation, maintenance, testing, or repair of defense articles.

Technical data also refers to classified data that relates to defense services and defense articles.

Implications of Cloud Computing on Technical Data

The cloud facilitates access to information while expanding the delivery of services. ITAR, on the other hand, aims to restrict the flow of information while limiting the provision of goods and services. The contrast between the two creates unique compliance challenges for defense contractors who have operations in multiple countries and wish to adopt cloud computing. Some organizations have opted to avoid the cloud altogether and fall back on maintaining separate systems in order to meet ITAR requirements, which tends to be extremely inefficient and costly. To fully understand the possible implications of cloud computing for export-controlled data, you must first understand what constitutes an export of technical data.

I. What is an Export?

In global trade, the term "export" typically conjures images of large shipping crates being loaded onto ships or wheeled into a transoceanic cargo plane. However, U.S. export control laws are not limited to the movement of hardware across borders; the regulations also extend to specific technical data. The type of control applied depends on the export control jurisdiction and classification. The Export Administration Regulations (EAR) define an export as the shipment or transmission of items out of the United States, or the release of software or technology to a foreign national within the U.S. The ITAR definition of export is analogous.

Technical data is regulated for reasons of foreign policy, non-proliferation, and national security; current law stipulates that technical data must be stored in the U.S. and that only authorized U.S. persons may have access to it. The existing definition of export was drafted before cloud computing was in the picture; therefore, the exact application of the term "export" in this space remains unclear.

II. When Does an Export Occur?

When it comes to export control, transmitting data to a cloud platform for storage or manipulation is conceptually similar to carrying a hard copy of the data to another country or sending it through the mail. Transmitting data to the cloud for backup or processing mainly involves copying the data to a remote server. If that server is located outside the United States, then uploading export-controlled technical data to it will be deemed an export, as if it had been printed on paper and carried outside the country. This creates an appreciable challenge since, with the cloud, the end user is not necessarily privy to the location of the data, and the locations of cloud servers are subject to change.

It is important to note that export-controlled data does not have to leave the U.S. to be considered exported. Under ITAR, technical data must not be disclosed to non-U.S. persons without authorization, regardless of where those persons are located. Non-U.S. persons include any individual who is not a lawful permanent resident of the United States. When technology subject to ITAR is uploaded to a cloud server and a user from another country accesses it, an export has occurred, even if the provider has ensured that all servers are located within the U.S. and the data never left the United States.
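The rule described above reduces to a simple two-condition check. The sketch below is an illustration of that logic only (function and parameter names are invented), not a substitute for legal review:

```python
def itar_export_occurs(server_in_us: bool, accessor_is_us_person: bool) -> bool:
    """An 'export' of ITAR-controlled technical data occurs if the data is
    stored outside the U.S., OR if it is accessed by a non-U.S. person,
    no matter where the server physically sits."""
    return (not server_in_us) or (not accessor_is_us_person)

# Data on a U.S.-based server, accessed by a foreign national: still an export.
print(itar_export_occurs(server_in_us=True, accessor_is_us_person=False))  # True
# Data on a U.S.-based server, accessed only by U.S. persons: no export.
print(itar_export_occurs(server_in_us=True, accessor_is_us_person=True))   # False
```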

III. Who is the Exporter?

Users of cloud services interact with the cloud in many ways; in most cases, the operational specifics are intentionally abstracted away by the service provider. Information about where computations occur may not be made available to the end user. However, in the United States, the cloud service provider is generally not considered the exporter of the data that its subscribers upload to its servers. Although the State Department has not issued a formal directive on the matter, U.S. subscribers that upload technical data onto the hardware of a cloud service provider will be considered the exporters of that data in the event of foreign disclosures. Accordingly, if ITAR-controlled technical data is divulged to a non-U.S. IT administrator of the cloud service provider, it is the subscriber to the service, not the service provider, that is deemed the exporter.

In Closing

The cloud has reshaped the landscape of government, business, and consumer information technology by delivering enhanced flexibility and better cost efficiencies for a vast variety of services. But the nature of cloud computing increases the chances of inadvertent export control violations. When it comes to ITAR-controlled technical data, users are vulnerable to unexpected and complex export requirements and, in the event of non-compliance, to drastic potential criminal and civil penalties, including weighty fines and possibly jail time. With that in mind, the next logical suggestion might be to forget cloud file sharing and sync altogether; however, that does not have to be the case. The Bureau of Industry and Security published a rule in the Federal Register that establishes a "carve-out" for the transmission of regulated data within a cloud service infrastructure, provided the data is encrypted. Encryption, coupled with a set of best practices, can enable you to adopt the cloud while remaining ITAR compliant.
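As a rough sketch of the encrypt-before-upload idea, the snippet below uses the third-party `cryptography` package (an assumption; `pip install cryptography`). It illustrates the concept only: a real ITAR program must use encryption that meets the specific standards in the published rule, and key management is the hard part in practice.

```python
from cryptography.fernet import Fernet

# Generate and hold the key locally; the key itself must never be
# uploaded alongside the data or disclosed to unauthorized persons.
key = Fernet.generate_key()
f = Fernet(key)

technical_data = b"controlled drawing, revision A"
ciphertext = f.encrypt(technical_data)   # this is what gets sent to the cloud

# Only an authorized key holder can recover the plaintext.
assert f.decrypt(ciphertext) == technical_data
```

The design point: if the provider's servers and administrators only ever see ciphertext, transmission through the cloud infrastructure can fall under the carve-out rather than constituting a disclosure.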




Author: Gabriel Lando



Cloud Security Threats That Will Keep CISOs Busy in 2018


As cloud computing continues to strengthen its hold over the enterprise IT services market, concerns about organizational readiness to address the growing security challenges keep escalating. Invariably, the shared and on-demand nature of cloud services gives way to security risks. Whether it's a general expansion of the exposed threat surface or very specific cloud computing-related security issues, 2018 will require that enterprises use the services of their CISOs to manage the growing risks. In this guide, we've covered the most pressing of these concerns so you can understand them and plan your enterprise's cloud security strategy around them.

Lack of Understanding of Shared Security Responsibilities

One of the major problems that hits CISOs hard is the realization that their cloud service provider is not 100% responsible for the complete security of the workload. Enterprises assume that since their workloads are managed in the cloud, they can simply forget about securing them. The truth, however, is that cloud service providers are not responsible for securing the workload beyond what the contract specifies. Data retention, backup, security, and resilience all fall within the enterprise's responsibility, not the vendor's. CISOs would do well to understand the cloud service vendor's model of shared security responsibility. Almost always, companies need to implement additional security measures to secure their cloud data.

Insiders with Malicious Intents

All it takes is a disgruntled employee to bring down the IT systems of an enterprise; that’s sad but true. Because the average enterprise has more than a few cloud computing service vendors, it also means your employees have that many cloud-based applications to manage their work around. Single sign-on is a practical option for companies. However, it also means that malicious insiders can use their position to access and mess up applications.

To make sure that their cloud apps remain secure, enterprises need to invest in access and identity management processes and capabilities. Also, they need to work with cloud vendors to implement behavior analysis based alert mechanisms. These mechanisms can identify suspicious user behavior and trigger alerts, apart from blocking access upon detection.
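A behavior-analysis alert can be as simple as comparing current activity against a per-user baseline. The sketch below is illustrative only; the threshold multiplier and metric are invented, and real systems typically use richer signals (location, time of day, access patterns) and statistical models rather than a fixed multiplier.

```python
def suspicious(user_baseline_per_day: float, downloads_today: int,
               multiplier: float = 5.0) -> bool:
    """Flag a user whose download volume far exceeds their historical
    baseline; a hit would trigger an alert and possibly block access."""
    return downloads_today > user_baseline_per_day * multiplier

# A user who averages 10 downloads/day suddenly pulling 200 files is flagged.
print(suspicious(user_baseline_per_day=10, downloads_today=200))  # True
print(suspicious(user_baseline_per_day=10, downloads_today=30))   # False
```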

Failure in Due Diligence while Hiring and Onboarding New Cloud Service Vendors

More and more business applications are now being delivered via the cloud. This obviously means that IT managers will find themselves in boardrooms, being pitched dozens of cloud solutions.

Here, due diligence can make or break the enterprise-vendor relationship. CISOs have a very clear role to play and must be closely involved in the IT vendor onboarding process. Now is the perfect time to start building a thorough checklist of prerequisites that vendors must meet to qualify for your company's business. CISOs must also work with their counterparts at active vendors to ensure the right fit between systems on both sides.

A missed step in the due diligence before signing off a contract with a cloud vendor could come back to haunt your company very soon.

Human Errors

Though enterprises strive to make their IT and business applications immune to user errors, the risks remain real. Also, users of cloud-based applications are always on the radar of cybercriminals. CISOs have to ask themselves: are end users sufficiently protected against phishing and social engineering attacks?

To make sure that a naive employee doesn’t end up being the cause of an application outage, CISOs need to lead IT efforts towards improving the cybersecurity knowledge of end users.

Insecure Application Programming Interfaces

Application programming interfaces (APIs) are key enablers of the integration of cloud services with the on-premise and third-party applications that a business uses. In the recent past, there has been a lot of focus on delivering advanced APIs that let enterprises self-service requests. APIs are also the system components through which users manage monitoring, management, and provisioning.

In 2018, the range and capabilities of APIs are expected to expand, bringing more enterprise IT consultants and technicians within the purview of API-relevant user groups. This, however, must be done under the close oversight of the CISO or one of his/her close aides. The reason: APIs invariably contribute to the threat surface of enterprise cloud infrastructure. Companies need specific additional measures to prevent deliberate or accidental attempts to circumvent policies.

Account Hijacking

Though account hijacking is not something specifically associated with cloud computing, it’s certain that cloud computing does add a lot to the threat surface area. The reason is that cloud services are accessed via user accounts, and each new account becomes a risk variable in the cloud security equation. If hackers are able to hijack a user account, they can use the credentials to:

  • Record transaction information
  • Manipulate data
  • Eavesdrop on business communications
  • Redirect users to suspicious websites
  • Execute advanced phishing attacks on hundreds of owners of similar accounts
  • Access critical cloud computing settings and configurations
  • Block legitimate access requests
  • Return false information to data requests

Advanced Persistent Threats

Like parasites, some cyber attacks persist for long durations, attempting to infiltrate target systems and establish a stronghold within the IT processes and workloads of the victim. The worst part of APT attacks is that they stealthily adapt to evolving security measures and can alter their behavior accordingly. Once APTs become part of a system, they can move laterally and start stealing information from cloud workloads.

Concluding Remarks

As more data and more applications move to the cloud, the role of the enterprise CISO in ensuring security becomes crucial. 2018 will throw all kinds of security challenges at enterprises, particularly around cloud infrastructure. The threats mentioned in this guide are the ones that most warrant the CISO's attention.

How MSPs Leverage Infrastructure as a Service (IaaS)

Many enterprises rely on MSPs to manage their technology, and the deployment of cloud-based solutions with the help of a trusted managed service provider is rapidly becoming the norm. Enterprise architecture and innovation leaders can greatly benefit from high-quality managed services when implementing and operating IaaS solutions on Google Cloud Platform, Microsoft Azure, and Amazon Web Services. The opportunity for MSPs to support enterprises during and after their migration to the cloud is massive: analysts at 451 Research predict that cloud managed services will grow into a $43 billion market by 2018.

The arrival of massive-scale platforms built by Google, Microsoft, and Amazon has completely changed the enterprise infrastructure world. These platforms operate at a scale and efficiency that is virtually impossible to match. This, coupled with an incomparable geographic scope, has created a critical mass of customers and an ecosystem of partners. The only way for the reseller market to survive rapidly expanding cloud usage is to adapt. The market will be driven by its ability to meet the demands arising from the surging complexity of technology, the network-dependency of infrastructure and applications, and the practices of an ever-more mobile workforce.

The Enterprise Cloud is Hybrid

One of the main motivations to keep workloads on-premises is typically the inability to move them as-is, or simply enterprise liability. As a result, more enterprises are opting for hybrid deployments so they can enjoy the efficiency and cost-saving benefits of a public cloud coupled with the security and control of a private cloud. However, while buying instances on AWS or Azure is a simple task, the skills needed to build, deploy, and run an application there are much more complex, and various intricacies are bound to arise.

A recent survey commissioned by Microsoft revealed that 38 percent of people involved in recruiting professionals with cloud skills in the last 12 months found it difficult to find the right skills. The survey went on to state that even as the number of professionals with the right cloud-related skills continues to grow, demand for those skills will likely increase faster than the available supply. According to Gartner's Magic Quadrant for Cloud Infrastructure as a Service, 2016, most customers start off by selecting a cloud platform that suits their workload and then look for an MSP to manage it, as opposed to finding a "managed cloud" solution from an MSP that offers basic IaaS capabilities on its own platform. Customers also tend to extend existing managed services to include the management of a third-party cloud IaaS offering.

If these trends are anything to go by, enterprise IT decision makers first select their IaaS and only later realize their collective lack of skills to create a robust enterprise environment. The market is reflecting this change: in the 2017 Magic Quadrant for Public Cloud Infrastructure report, Gartner surmised that 75 percent of effective implementations will be fulfilled by innovative, highly skilled MSPs with a cloud-native, DevOps-centric service delivery approach. This small but growing group of managed service providers is filling a gap in a specific industry vertical.

Big Data Requires Big Performance

Information is power, and in this digital age, data is everywhere. The struggle over how best to leverage this data rages on, but IaaS is almost always part of the discussion. IBM recently reported that 2.5 million terabytes of data are produced daily. This inundating volume of information presents a unique opportunity to both small and large businesses equipped to take advantage of it. According to the International Data Corporation (IDC), the big data market is set to reach $48.6 billion by 2019, and a growing number of managed service providers are positioning themselves to net a notable portion of that revenue.

Analyzing large datasets demands more than simply placing a few extra servers and hard disk arrays in an organization's data center. A large majority of big data projects fail because on-premises technology is too arduous to deploy and optimize, and sizing is rarely accurate. Most enterprises, large and small alike, simply lack the time, budget, staff, or, bluntly, the interest to develop and support their own big data infrastructure. MSPs can position themselves as infrastructure partners who not only manage but also provide a high-performance, reliable foundation for short- and long-term big data needs.

Axiomatic Cost Benefits

With IaaS, MSPs instantly have the ability to provide enterprise-grade infrastructure without investing a ton of cash to deploy their own cloud. They also benefit from the latest hardware, maintenance, and updates while maintaining full control of server usage, so scalability will never be a concern. Removing or adding servers is a breeze, and you only pay for what you use. The most appealing thing about leveraging IaaS is that it provides all the flexibility you need to offer services that cater to your clients' specific needs. At the end of it all, your credibility as an MSP is boosted because you are capable of providing reliable service.

In Closing

For MSPs who wish to deliver cloud services but feel that public cloud services are too susceptible to pricing pressures and lack sufficient privacy and security controls, utilizing Infrastructure as a Service can shorten time to market and reduce the costs associated with implementing a private cloud solution. As a matter of fact, some IaaS providers currently sell their solutions both to end-user consumers and to managed service providers.

Author: Gabriel Lando

FileCloud Empowers Government Agencies with Customizable EFSS on AWS GovCloud (U.S.) Region

FileCloud, a cloud-agnostic Enterprise File Sharing and Sync platform, today announced availability on AWS GovCloud (U.S.) Region. FileCloud is one of the first full-featured enterprise file sharing and sync solutions available on AWS GovCloud (U.S.), offering advanced file sharing, synchronization across OSs and endpoint backup. With this new offering, customers will experience the control, flexibility and privacy of FileCloud, as well as the scalability, security and reliability of Amazon Web Services (AWS). This solution allows federal, state and city agencies to run their own customized file sharing, sync and backup solutions on AWS GovCloud (U.S.).

“Having FileCloud available on AWS GovCloud (U.S.) provides the control, flexibility, data separation and customization of FileCloud at the same time as the scalability and resiliency of AWS,” said Madhan Kanagavel, CEO of FileCloud. “With these solutions, government agencies can create their own enterprise file service platform that offers total control.”

Government agencies and defense contractors are required to adhere to strict government regulations, including the International Traffic in Arms Regulations (ITAR) and the Federal Risk and Authorization Management Program (FedRAMP). AWS GovCloud (U.S.) is designed specifically for government agencies to meet these requirements.

By using FileCloud and AWS GovCloud (U.S.), agencies can create their own branded file sharing, sync and backup solution, customized with their logo and running under their URL. FileCloud on AWS GovCloud offers the required compliance and reliability and delivers options that allow customers to pick tailored cloud solutions. FileCloud is a cloud-agnostic solution that works on-premises or on the cloud.

“FileCloud allows us to set up a secure file service, on servers that meet our clients’ security requirements,” said Ryan Stevenson, Designer at defense contractor McCormmick Stevenson. “The easy-to-use interfaces and extensive support resources allowed us to customize who can access what files, inside or outside our organization.”

Try FileCloud for free!

Top 10 Predictions in Content Collaboration for 2018

Collaboration within the workplace is not a new concept. However, it has become increasingly crucial in this mobile world as we become more connected across the globe. The proliferation of cloud computing has given rise to a new set of content collaboration tools such as Dropbox, FileCloud, and Box. These tools enable employees to collaborate effectively, leading to a more skilled, engaged, and educated workforce. Content collaboration solutions allow employees to easily share information with each other and work together on projects irrespective of geographic location, via a combination of networking capabilities, software solutions, and well-established collaborative processes. Content collaboration platforms are the evolution of Enterprise File Sharing and Sync (EFSS).
… You can read the full article at VMBlog.