Archive for the ‘Cloud Computing’ Category

Technical Data Under ITAR

 

The International Traffic in Arms Regulations (ITAR) are controls established by the U.S. State Department to regulate the temporary import and export of defense articles. While most defense contractors understand how ITAR applies to physical objects, its application to data remains unclear to many. The first step toward properly identifying technical data and how it is controlled for export purposes is a clear understanding of what technical data is and what it encompasses.

Technical data refers to the unique information required for the development, production and subsequent use of defense articles.

  • Development – includes all information created or gathered before production and may include, but is not limited to: layouts, pilot production schemes, testing and assembly prototypes, design research, integration design, configuration design, design concepts, design analysis, and other forms of design data.
  • Production – comprises all information generated or gathered during the production stages and may include, but is not limited to: engineering, manufacture, assembly, integration, testing, inspection, and quality assurance.
  • Use – encompasses any information that relates to the installation, operation, maintenance, testing, or repair of defense articles.

Technical data also refers to classified data that relates to defense services and defense articles.

Implications of Cloud Computing on Technical Data

The cloud facilitates access to information while expanding the delivery of services. ITAR, on the other hand, aims to restrict the flow of information while limiting the provision of services and goods. The contrast between the two creates unique compliance challenges for defense contractors who have operations in multiple countries and wish to adopt cloud computing. Some organizations have opted to avoid the cloud altogether and fall back to maintaining separate systems in order to meet ITAR requirements, which tends to be extremely inefficient and costly. To fully understand the possible implications of cloud computing for export-controlled data, you must first understand what constitutes an export when it comes to technical data.

I. What is an Export?

In global trade, the term export typically evokes large shipping crates being loaded onto ships or wheeled into a transoceanic cargo plane. However, U.S. export control laws are not limited to the movement of hardware across borders; the regulations also extend to specific technical data. The type of control extended depends on the export control jurisdiction and classification. The Export Administration Regulations (EAR) define an export as the shipment or transmission of items out of the United States, or the release of software or technology to a foreign national within the U.S. The ITAR definition of export is analogous.

Technical data is regulated for reasons of foreign policy, non-proliferation, and national security; the current law stipulates that technical data must be stored in the U.S. and that only authorized U.S. persons may have access to it. The existing definition of export was drafted at a time when cloud computing was not in the picture; therefore, the exact application of the term ‘export’ in this space remains unclear.

II. When Does an Export Occur?

When it comes to export control, transmitting data to a cloud platform for storage or manipulation is conceptually similar to carrying a hard copy of the data to another country or sending it through the mail. Transmitting data to the cloud for backup or processing mainly involves copying the data to a remote server. If the server is located outside the United States, then uploading export-controlled technical data to it will be deemed an export, just as if it had been printed on paper and carried outside the country. This creates an appreciable challenge since, with the cloud, the end user does not necessarily know the location of the data, and the locations of the cloud servers are subject to change.

It is important to note that export-controlled data does not have to leave the U.S. to be considered exported. Under ITAR, technical data must not be disclosed to non-U.S. persons, regardless of where they are located, without authorization. Non-U.S. persons include any individual who is not a U.S. citizen or lawful permanent resident of the United States. When technology subject to ITAR is uploaded to a cloud server and accessed by a user from another country, an export has occurred, regardless of whether the provider has ensured that all servers are located within the U.S., and even though the data never left the United States.
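
To make the two triggers concrete, the rules above can be sketched as a toy check. This is illustrative only (real export determinations require legal counsel), and the server-location and person-status inputs are assumptions supplied by the caller:

```python
def is_deemed_export(server_country: str, accessor_is_us_person: bool) -> bool:
    """Illustrative only: True if an upload or access event would likely
    count as an export under the two triggers discussed above."""
    if server_country != "US":
        # Trigger 1: the data physically leaves the United States.
        return True
    if not accessor_is_us_person:
        # Trigger 2: disclosure to a non-U.S. person, even on U.S. soil.
        return True
    return False
```

Note that the second branch fires even when the server sits in a U.S. data center, which is exactly the scenario many cloud subscribers overlook.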

III. Who is the Exporter?

Users of cloud services interact with the cloud in multifarious ways; in most cases, the operational specifics are intentionally abstracted away by the service provider. Information about where the computations occur may not be available to the end user. However, in the United States, the cloud service provider is generally not considered the exporter of the data its subscribers upload to its servers. Although the State Department has not issued a formal directive on the matter, U.S. subscribers that upload technical data onto the hardware of a cloud service provider will be considered the exporters of that data in the event of foreign disclosures. Accordingly, if ITAR-controlled technical data is divulged to a non-U.S. IT administrator of the cloud service provider, it is the subscriber to the service, and not the service provider, that is deemed the exporter.

In Closing

The cloud has reshaped the landscape of government, business, and consumer information technologies by delivering enhanced flexibility and better cost efficiencies for a vast variety of services. But the nature of cloud computing increases the chances of inadvertent export control violations. When it comes to ITAR-controlled technical data, users are vulnerable to unexpected and complex export requirements and, in the event of non-compliance, to drastic potential criminal and civil penalties, including weighty fines and possibly jail time. With that in mind, the next logical suggestion would be to forget cloud file sharing and sync altogether; however, that does not have to be the case. The Bureau of Industry and Security published a rule in the Federal Register that establishes a ‘carve-out’ for the transmission of regulated data within a cloud service infrastructure, provided the data is encrypted. Encryption, coupled with a set of best practices, can enable you to adopt the cloud while remaining ITAR compliant.
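
The shape of that workflow (encrypt before the data ever reaches the provider) can be sketched in a few lines. This is a toy illustration that builds a keystream from HMAC-SHA256 so the example stays self-contained; it is not production cryptography, and an actual ITAR deployment would use vetted, validated encryption modules and proper key management:

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Expand key+nonce into a keystream with HMAC-SHA256 in counter mode."""
    blocks = []
    counter = 0
    while sum(len(b) for b in blocks) < length:
        msg = nonce + counter.to_bytes(8, "big")
        blocks.append(hmac.new(key, msg, hashlib.sha256).digest())
        counter += 1
    return b"".join(blocks)[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with a fresh keystream; prepend the random nonce."""
    nonce = os.urandom(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    """Recover the plaintext by regenerating the same keystream."""
    nonce, ciphertext = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))
```

The point of the sketch is the order of operations: the key never leaves your control, so the ciphertext sitting on the provider’s servers is unintelligible to any administrator, foreign or otherwise.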

Author: Gabriel Lando


Cloud Security Threats That Will Keep CISOs Busy in 2018

 

As cloud computing continues to strengthen its hold over the enterprise IT services market, concerns around organizational readiness to address the growing security challenges keep escalating. Invariably, the shared and on-demand nature of cloud services gives way to security risks. Whether it’s a general expansion of the exposed threat surface or very specific cloud computing-related security issues, 2018 will definitely require that enterprises use the services of their CISOs to manage the growing risks. In this guide, we’ve covered the most pressing of these concerns for you to understand and plan your enterprise’s cloud security strategy around.

Lack of Understanding of Shared Security Responsibilities

One of the major problems that hits CISOs hard is the realization that their cloud service provider is not 100% responsible for the complete security of the workload. Enterprises believe that since their workloads are being managed in the cloud, they can simply forget about the security aspects. The truth, however, is that cloud service providers are not responsible for, or under any obligation to ensure, the security of the workload beyond what the contract specifies. Data retention, backup, security, and resilience all fall within the purview of the enterprise’s responsibility for cloud security, not the vendor’s. CISOs would do well to understand their cloud service vendor’s model of shared security responsibility. Almost always, companies need to implement extended and additional security measures to secure their cloud data.

Insiders with Malicious Intents

All it takes is a disgruntled employee to bring down the IT systems of an enterprise; that’s sad but true. Because the average enterprise has more than a few cloud computing service vendors, your employees have that many cloud-based applications to manage their work across. Single sign-on is a practical option for companies. However, it also means that malicious insiders can use their position to access and tamper with applications.

To make sure that their cloud apps remain secure, enterprises need to invest in access and identity management processes and capabilities. They also need to work with cloud vendors to implement behavior-analysis-based alert mechanisms, which can identify suspicious user behavior, trigger alerts, and even block access upon detection.
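
As a sketch of what such behavior-based alerting looks like, the following compares a login event against a per-user baseline of normal hours and locations. The baseline values and user name are hypothetical placeholders; a production system would learn these profiles statistically rather than hard-code them:

```python
from datetime import datetime

# Hypothetical baseline: typical working hours and known countries per user.
BASELINE = {
    "alice": {"hours": range(8, 19), "countries": {"US"}},
}

def suspicious(user: str, login_time: datetime, country: str) -> list:
    """Return the reasons this login deviates from the user's baseline."""
    profile = BASELINE.get(user)
    if profile is None:
        return ["unknown user"]
    reasons = []
    if login_time.hour not in profile["hours"]:
        reasons.append("login outside normal hours")
    if country not in profile["countries"]:
        reasons.append("login from unusual location")
    return reasons
```

An empty list means the login looks normal; anything else would feed an alert, and in stricter setups, an automatic access block.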

Failure in Due Diligence while Hiring and Onboarding New Cloud Service Vendors

More and more business applications are now being delivered via the cloud. This obviously means that IT managers will find themselves in boardrooms, being pitched dozens of cloud solutions.

Here, due diligence can be a deal maker or breaker as far as the success of the enterprise-vendor relationship goes. CISOs have a very clear role to play here and must be closely involved in the IT vendor onboarding process. Now is the perfect time to start building a thorough checklist of prerequisites that vendors must meet to qualify for your company’s business. CISOs must also work with their counterparts at active vendors to ensure the right fit between systems on both sides.
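
One lightweight way to make such a checklist enforceable is to keep it as data and evaluate every candidate vendor against it before sign-off. The items below are hypothetical examples, not a complete due-diligence list:

```python
# Hypothetical prerequisites; a real list comes from the CISO's security,
# compliance, and integration requirements.
CHECKLIST = [
    "SOC 2 Type II report available",
    "data encrypted at rest and in transit",
    "documented breach-notification SLA",
    "supports SSO / SAML integration",
]

def unmet_prerequisites(vendor_claims: set) -> list:
    """Return the checklist items the vendor has not demonstrated."""
    return [item for item in CHECKLIST if item not in vendor_claims]
```

A vendor only qualifies when `unmet_prerequisites` comes back empty; anything left over is a documented gap to raise before the contract is signed.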

A missed step in the due diligence before signing off a contract with a cloud vendor could come back to haunt your company very soon.

Human Errors

Though enterprises strive to make their IT and business applications immune to user errors, the risks remain real. Moreover, users of cloud-based applications are always on the radar of cybercriminals. CISOs have to ask themselves – are the end users sufficiently protected against phishing and social engineering attacks?

To make sure that a naive employee doesn’t end up being the cause of an application outage, CISOs need to lead IT efforts towards improving the cybersecurity knowledge of end users.

Insecure Application Programming Interfaces

Application programming interfaces (APIs) are key enablers of integration between cloud services and the various on-premise and third-party applications a business uses. In the very recent past, there has been a lot of focus on delivering advanced APIs that let enterprises self-service their requests. APIs are also the system components through which users handle monitoring, management, and provisioning.

In 2018, the range and capabilities of APIs are expected to expand, bringing more enterprise IT consultants and technicians within the purview of API-relevant user groups. This, however, must be done with close oversight from the CISO or one of his/her close aides. The reason: APIs invariably contribute to the threat surface area of enterprise cloud infrastructure. Companies need specific additional measures to prevent deliberate or accidental attempts to circumvent policies.
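
One common safeguard for that API surface is requiring every call to carry a signature derived from a shared secret, so tampered or unauthenticated requests can be rejected. Below is a minimal sketch of HMAC-based request signing; the canonical-request format is an assumption for illustration, not any particular vendor’s scheme:

```python
import hashlib
import hmac

def sign_request(secret: bytes, method: str, path: str, body: bytes) -> str:
    """Compute an HMAC-SHA256 signature over a canonical request string."""
    message = method.encode() + b"\n" + path.encode() + b"\n" + body
    return hmac.new(secret, message, hashlib.sha256).hexdigest()

def verify_request(secret: bytes, method: str, path: str,
                   body: bytes, signature: str) -> bool:
    """Recompute the signature and compare in constant time."""
    expected = sign_request(secret, method, path, body)
    return hmac.compare_digest(expected, signature)
```

Because the body is part of the signed message, any modification in transit invalidates the signature, and `compare_digest` avoids leaking information through timing.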

Account Hijacking

Though account hijacking is not something specifically associated with cloud computing, it’s certain that cloud computing does add a lot to the threat surface area. The reason is that cloud services are accessed via user accounts, and each new account becomes a risk variable in the cloud security equation. If hackers are able to hijack a user account, they can use the credentials to:

  • Record transaction information
  • Manipulate data
  • Eavesdrop on business communications
  • Redirect users to suspicious websites
  • Execute advanced phishing attacks on hundreds of owners of similar accounts
  • Access critical cloud computing settings and configurations
  • Block legitimate access requests
  • Return false information to data requests
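
The standard mitigation is to make a stolen password insufficient on its own, for example with time-based one-time passwords (TOTP, RFC 6238). The algorithm is small enough to sketch with the standard library; a real deployment would use an audited implementation and secure secret provisioning:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, at=None, digits=6, step=30) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    counter = int(time.time() if at is None else at) // step
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

With the RFC 6238 test secret `b"12345678901234567890"` at time 59, this produces the published vector `94287082` (8 digits), so a hijacked password alone no longer unlocks the account.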

Advanced Persistent Threats

Like parasites, some cyber attacks persist for long durations, attempting to infiltrate target systems and establish a stronghold within the IT processes and workloads of the victim systems. The worst part of APT attacks is that they stealthily grow aware of evolving security measures and can alter their responses accordingly. Once APTs become part of a system, they can move laterally and start stealing information from cloud workloads.

Concluding Remarks

As more data and more applications move to the cloud, the role of the enterprise CISO in ensuring security becomes crucial. 2018 will throw all kinds of security challenges at enterprises, particularly around cloud infrastructure. The threats mentioned in this guide are the ones that warrant the CISO’s attention.

How MSPs Leverage Infrastructure as a Service (IaaS)

Many enterprises rely on MSPs to manage their technology, and the deployment of cloud-based solutions with the help of a trusted managed service provider is rapidly becoming the norm. Enterprise architecture and innovation leaders can greatly benefit from utilizing high-quality managed services when implementing and operating IaaS solutions on Google Cloud Platform, Microsoft Azure, and Amazon Web Services. The size of the opportunity for MSPs to support enterprises during and after their migration to the cloud is massive. The analysts at 451 Research predict that cloud managed services will grow to a $43 billion market by 2018.

The arrival of massive-scale platforms built by Google, Microsoft, and Amazon has completely changed the enterprise infrastructure world. These platforms now operate at a scale and efficiency that is virtually impossible to match. This, coupled with an incomparable geographic scope, has led to the creation of a critical mass of customers and an ecosystem of partners. The only way for the reseller market to survive the rapidly expanding cloud usage is to adapt. The market will be driven by its ability to meet the demands that arise from the surging complexity of technology, the network-dependency of infrastructure and applications, and the practices of an ever-more mobile workforce.

The Enterprise Cloud is Hybrid

One of the main motivations to keep workloads on-premises is typically the inability to move them as-is, or simply enterprise liability. As a result, more enterprises are opting for a hybrid deployment so that they can enjoy the efficiency and cost-saving benefits of a public cloud coupled with the security and control that come with a private cloud. However, while buying instances on AWS or Azure is a simple task, the skills needed to build, deploy, and run an application there are much more complex, and various intricacies are bound to arise.

A recent survey commissioned by Microsoft revealed that 38 percent of people involved in recruiting professionals with cloud skills in the last 12 months found it difficult to find the right skills. The survey went on to state that even as the number of professionals with the right cloud-related skills continues to grow, demand for those skills will likely increase faster than the available supply. According to Gartner’s Magic Quadrant for Cloud Infrastructure as a Service, 2016, most customers start by selecting a cloud platform that suits their workload and then look for an MSP to manage it, as opposed to buying a ‘managed-cloud’ solution from an MSP that offers basic IaaS capabilities on its own platform. Customers also tend to extend existing managed services to include the management of a third-party cloud IaaS offering.

If these trends are anything to go by, we can conclude that enterprise IT decision makers first select their IaaS, and only later realize their collective lack of skills to create a robust enterprise environment. And the market is reflecting this change. In the 2017 Magic Quadrant for Public Cloud Infrastructure report, Gartner surmised that 75 percent of effective implementations will be fulfilled by innovative, highly skilled MSPs with a cloud-native, DevOps-centric service delivery approach. This small but growing group of managed service providers is filling a gap in a specific industry vertical.

Big Data Requires Big Performance

Information is power, and in this digital age, data is everywhere. The debate over how best to leverage this data rages on – but IaaS is almost always part of the discussion. IBM recently reported that 2.5 million terabytes of data are produced daily. This inundating volume of information presents a unique opportunity for both small and large businesses equipped to take advantage of it. According to the International Data Corporation (IDC), the big data market is set to reach $48.6 billion by 2019. And a growing number of managed service providers are positioning themselves to net a notable portion of that revenue.

Analyzing large datasets demands more than simply placing a few extra servers and hard disk arrays in an organization’s data center. A large majority of big data projects fail because on-premises technology is too arduous to deploy and optimize, and sizing is rarely accurate. Most enterprises, large and small alike, simply lack the time, budget, staff, or, bluntly, the interest to develop and support their own big data infrastructure. MSPs can position themselves as infrastructure partners who not only manage but also provide a high-performance, reliable foundation for short- and long-term big data needs.

Axiomatic Cost Benefits

With IaaS, MSPs instantly have the ability to provide enterprise-grade infrastructure without investing a ton of cash to deploy their own cloud. They also benefit from the latest hardware maintenance and updates while maintaining full control of server usage – so scalability will never be a concern. Removing or adding servers is a breeze, and you only pay for what you use. The most appealing thing about leveraging IaaS is that it provides all the flexibility you require to offer services that cater to each client’s specific needs. At the end of it all, your credibility as an MSP is boosted because you are capable of providing reliable service.
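
The pay-for-what-you-use arithmetic is easy to demonstrate. The rates below are made-up placeholders, not real provider pricing; the point is that owned capacity must be sized for peak load while IaaS billing tracks actual usage:

```python
# Hypothetical rates for illustration only.
HOURLY_RATE = 0.10          # assumed IaaS price per server-hour
OWNED_SERVER_MONTHLY = 400  # assumed all-in monthly cost of one owned server

def iaas_monthly_cost(server_hours: float) -> float:
    """IaaS bill: you pay only for the hours actually consumed."""
    return server_hours * HOURLY_RATE

def owned_monthly_cost(peak_servers: int) -> float:
    """Owned capacity: provisioned for peak load, paid for whether used or not."""
    return peak_servers * OWNED_SERVER_MONTHLY
```

Under these assumed rates, a bursty workload that averages 2,000 server-hours a month costs far less on IaaS than keeping five servers sized for its peak.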

In Closing

For MSPs who wish to deliver cloud services but feel that public cloud services are too susceptible to pricing pressures and lack sufficient privacy and security controls, utilizing Infrastructure as a Service can ease the time to market and reduce the costs associated with implementing a private cloud solution. In fact, some IaaS providers currently sell their solutions both to end-user consumers and to managed service providers.

Author: Gabriel Lando

FileCloud Empowers Government Agencies with Customizable EFSS on AWS GovCloud (U.S.) Region

FileCloud, a cloud-agnostic Enterprise File Sharing and Sync platform, today announced availability on AWS GovCloud (U.S.) Region. FileCloud is one of the first full-featured enterprise file sharing and sync solutions available on AWS GovCloud (U.S.), offering advanced file sharing, synchronization across OSs and endpoint backup. With this new offering, customers will experience the control, flexibility and privacy of FileCloud, as well as the scalability, security and reliability of Amazon Web Services (AWS). This solution allows federal, state and city agencies to run their own customized file sharing, sync and backup solutions on AWS GovCloud (U.S.).

“Having FileCloud available on AWS GovCloud (U.S.) provides the control, flexibility, data separation and customization of FileCloud at the same time as the scalability and resiliency of AWS,” said Madhan Kanagavel, CEO of FileCloud. “With these solutions, government agencies can create their own enterprise file service platform that offers total control.”

Government agencies and defense contractors are required to adhere to strict government regulations, including the International Traffic in Arms Regulations (ITAR) and the Federal Risk and Authorization Management Program (FedRAMP). AWS GovCloud (U.S.) is designed specifically for government agencies to meet these requirements.

By using FileCloud and AWS GovCloud (U.S.), agencies can create their own branded file sharing, sync and backup solution, customized with their logo and running under their URL. FileCloud on AWS GovCloud offers the required compliance and reliability and delivers options that allow customers to pick tailored cloud solutions. FileCloud is a cloud-agnostic solution that works on-premises or on the cloud.

“FileCloud allows us to set up a secure file service, on servers that meet our clients’ security requirements,” said Ryan Stevenson, Designer at defense contractor McCormmick Stevenson. “The easy-to-use interfaces and extensive support resources allowed us to customize who can access what files, inside or outside our organization.”

Try FileCloud for free!

Top 10 Predictions in Content Collaboration for 2018

Collaboration within the workplace is not a new concept. However, it has become increasingly crucial in this mobile world as we become more connected across the globe. The proliferation of cloud computing has given rise to a new set of content collaboration tools such as Dropbox, FileCloud, and Box. These tools enable employees to collaborate effectively, subsequently leading to a more skilled, engaged, and educated workforce. Content collaboration solutions allow employees within the organization to easily share information with each other and work together on projects irrespective of geographic location, via a combination of networking capabilities, software solutions, and well-established collaborative processes. Content collaboration platforms are the evolution of Enterprise File Sharing and Sync (EFSS).
… You can read the full article at VMBlog.

The Intelligent Cloud : Artificial intelligence (AI) Meets Cloud Computing


If you thought that mobile communications and the Internet have drastically changed the world, just wait: the coming years will prove to be even more disruptive and mind-blowing. Over the last few years, cloud computing has been lauded as the next big disruption in technology, and true to form, it has become a mainstream element of modern software solutions, just as common as databases or websites. But is there a next phase for cloud computing? Is it an intelligent cloud?

Artificial intelligence (AI) is the type of technology with the capacity not only to enhance current cloud platform incumbents but also to power an entirely new generation of cloud computing technologies. AI is moving beyond simple chat applications like scheduling support and customer service to impact the enterprise in more profound ways, as automation and intelligent systems develop to serve critical enterprise functions. AI is bound to become ubiquitous in every industry where decision-making is being fundamentally transformed by ‘thinking machines’. The need for smarter, faster decision making and the management of big data are the driving factors behind the trend.

Remember Moore’s Law? In 1965, Intel co-founder Gordon Moore observed that the number of transistors per square inch on integrated circuits had doubled each year since their invention. For the next 50 years, Moore’s Law held. In the process, sectors like robotics and biotechnology saw remarkable innovation, because machines and computing power became faster and smaller over time as the transistors on integrated circuits became more efficient. Now, something even more extraordinary is happening. Accelerating technologies such as big data and artificial intelligence are converging to trigger the next major wave of change. This ‘digital transformation’ will reshape every aspect of the enterprise, including cloud computing.

Artificial intelligence (AI) is expected to burgeon in the enterprise in 2017. Several IT players, including today’s top IT companies, have heavily invested in the space with plans to increase efforts in the foreseeable future.

Despite the fact that AI has been around since the 1960s, advances in networking and graphics processing units, along with demand for big data, have put it back at the forefront of several companies’ minds and strategies. Given the recent explosion of data from the Internet of Things (IoT) and applications, and the need for quicker, real-time decision making, AI is well on its way to becoming a key differentiator and requirement for major cloud providers.

AI-First Enterprises

In a market that has for the longest time been dominated by four major companies – IBM, Amazon, Microsoft, and Google – an AI-first approach has the potential to disrupt the current dynamic.

“I think we will evolve in computing from a mobile-first to an AI-first world.”

– Sundar Pichai, Chief Executive of Google

The consumer world is not new to AI-based systems; products like Siri, Cortana, and Alexa have been making our lives easier for a while now. However, enterprise applications of AI are completely different. An AI-first enterprise approach should allow business leaders and data professionals to collect, organize, secure, and govern data efficiently so they can gain the insights they require to become a cognitive business. To maintain a competitive advantage, businesses today have to derive insights from data; however, acquiring those insights is complex and requires the work of skilled data scientists. The ability to make predictions for strategic and tactical purposes has eluded enterprises due to prohibitive resource requirements.

Cloud computing solves the two largest hurdles for AI in the enterprise: abundant, low-cost computing and a means to leverage large volumes of data.
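
As a toy illustration of the kind of insight extraction involved, here is an ordinary least-squares trend fit in pure Python. Real enterprise AI runs far richer models, but the cloud applies the same pattern at scale: cheap compute turned loose on accumulated data to surface a predictive relationship:

```python
def fit_trend(xs, ys):
    """Ordinary least-squares fit of y = a*x + b over paired observations."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept follows from the means.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b
```

Fitting hypothetical quarterly figures such as `fit_trend([0, 1, 2, 3], [1, 3, 5, 7])` recovers the underlying trend (slope 2, intercept 1), the kind of signal an AIaaS platform extracts automatically from much larger datasets.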

AI-as-a-service?

Today, this new breed of Platform as a Service, AI-as-a-Service (AIaaS), can be applied to all the data that enterprises have been collecting. Major cloud providers are making AI more accessible ‘as a service’ via open source platforms. For enterprises with an array of complex issues to solve, the need for disparate platforms working together can’t be ignored. This is why making machine learning and other variations of AI applications and technology available via open source is critical to the enterprise. By leveraging AI-as-a-service, businesses can innovate solutions to an almost unlimited range of problems.

As machine learning becomes more popular as a service, organizations will have to decide the level at which they want to be involved. While the power of cognitive intelligence is undeniably high, wanting to use it and being able to use it are two completely different things. For this reason, most companies will opt to use a PaaS vendor to manage their entire cycle of data intelligence as opposed to an in-house attempt, allowing them to focus on powering and developing their applications. When looking for an AI provider, you have to ask the right questions. The ideal vendor should be in a position to elucidate both how they handle data and how they intend to solve your specific enterprise problem.

There are multiple digital trends that have the potential to be disruptive; the only way to guarantee smarter business processes, more agility, and increased productivity is by planning ahead for the change and impact that is coming. The main differentiating factor between competing vendors in this space will be how the technology is applied to improve business processes and strategies.

Author: Gabriel Lando

Image Courtesy: pexels.com

Data ownership in the cloud – How does it affect you?

The future of the cloud seems bright; Cisco predicts that by 2018, 59% of cloud workloads will be created from Software as a Service (SaaS). While these statistics are optimistic, we cannot ignore a few concerns that stifle cloud adoption efforts, such as data ownership.

Most people would be inclined to say that they still own their data in the cloud. While they may be right in some sense, this is not always the case. For instance, consider Facebook, which many people use as cloud storage for their photos. According to the Facebook end-user agreement, the company stores data for as long as it is necessary, which might not be as long as users want. This, sadly, means that users lose data ownership. Worse still, the servers are located in different locations, in and out of the United States, subjecting the data to different laws.

According to Dan Gray, as discussed in ‘Data Ownership in the Cloud,’ the actual ownership of data in the cloud may depend on the nature of the data and where it was created. He distinguishes between data created by a user before uploading to the cloud and data created on the cloud platform itself. Data created prior to cloud upload may be subject to copyright laws depending on the provider, while data created on the platform could have complicated ownership.

In addition to cloud provider policies, certain Acts of Congress, although created to enhance data security while still upholding national security, have shown how data ownership issues affect businesses. Two of these, the Stored Communications Act (SCA) and the Patriot Act, show the challenges of cloud data ownership and privacy, with regard to government access to information stored in the cloud.

The Stored Communications Act (SCA)

Usually, when data resides in a cloud provider’s infrastructure, user ownership rights cannot be guaranteed. And even when users are assured that they own their data, it does not necessarily mean that the information stored there is private. For example, United States law, through the Stored Communications Act (SCA), gives the government the right to seize data stored by an American company even if it is hosted elsewhere. This interpretation saw Microsoft and other technology giants take the government to court, claiming that it was illegal to use the SCA to obtain a search warrant to peruse and seize data stored beyond the territorial boundaries of the United States.

Microsoft suffered a blow when a district court judge in New York ruled that U.S. government search powers extend to data stored on foreign servers. Fortunately, these companies got a reprieve in mid-2016, when the Second Circuit ruled that a federal court may not issue a criminal warrant ordering a U.S. cloud provider to produce data held on servers in Ireland. It is, however, important to note that this ruling focused only on whether Congress intended the SCA to apply to data held beyond U.S. territory, and did not touch on issues of Irish data privacy law.

The Patriot Act

The Patriot Act was put in place in 2001 as an effort by the George W. Bush administration to fight terrorism. The act allowed the Federal Bureau of Investigation (FBI) to search telephone, e-mail, and financial records without a court order, and expanded law enforcement agencies’ access to business records, among other provisions. Although many provisions of the Act were set to sunset four years later, the contrary happened. Fast-forward to 2011: President Barack Obama signed a four-year extension of three key provisions of the Act, which expanded the discovery mechanisms law enforcement could use to gain third-party access. This development caused an international uproar, especially from the European Union, prompting the Obama administration to hold a press conference to quell the concerns.

The situation was aggravated when a Microsoft UK director admitted that the Patriot Act could reach EU-based data, further disclosing that no cloud service was safe from the Act and that the company could be forced to hand over data to the U.S. government. While these provisions expired on June 1, 2015, for lack of congressional approval to renew them, the government found a way to renew them through the USA Freedom Act.

The two acts show us that data stored in the cloud, especially the public cloud, is in practice controlled by the cloud providers. This is why the laws compel cloud providers, and not cloud users, to produce this information.

What To Do In Light Of These Regulations

Even though the courts have ruled that the SCA cannot be used to obtain warrants for data stored abroad, and the USA Freedom Act is regarded by some as a better version of the Patriot Act, cloud users still need to find a way to avoid such compulsions.

One idea is to escape the grasp of these laws altogether, which is unfortunately impractical. To completely outrun the government, you would have to make sure that neither you nor the cloud service you use has operations in the United States. This is a great disadvantage, because most globally competitive cloud providers fall within United States jurisdiction. Even if you are lucky enough to find a suitable provider, it may still be subject to a Mutual Legal Assistance Treaty (MLAT) request. Simply put, there is no easy way out.

Instead, understand the risks and let your clients know about them. For example, if the Patriot Act extension attempts had been successful, financial institutions would have been obliged to share information with law enforcement agencies on suspicion of terrorist activities. In such a case, a good financial institution would warn its clients of these risks beforehand. Alternatively, you can find a way of storing data in-house, forcing the authorities to go through you and not the cloud provider.

Conclusion

Truthfully, data ownership in the cloud is a complicated issue. Determined by both government and company policies, data ownership in the cloud is not always retained. Fortunately, depending on data policies and how they categorize data in the cloud, a user could be granted full ownership. If that doesn’t happen, prepare for instances of third-party access and infringement of privacy, and rethink your business strategy accordingly. In short, as a cloud services client, pay attention to the contract you sign with your provider and understand the laws under which the provider operates.

Author: Davis Porter

HIPAA Prescribed Safeguards for File Sharing

health care data governance

The Health Insurance Portability and Accountability Act (HIPAA) sets standards for protecting sensitive patient data in the cloud. Any company that deals with PHI (protected health information) needs to ensure that all of the required network, physical, and process safety measures are properly followed. If you want to learn more about the requirements of HIPAA, click here.

This includes covered entities (CEs), meaning anyone who provides treatment, payment, or operations in health care, and business associates (BAs) with access to patient information stored in the cloud or who provide support in treatment, payment, or operations. Subcontractors and business associates of business associates need to be in compliance too.

Who needs HIPAA?

The HIPAA Privacy Rule addresses the saving, sharing, and accessing of individuals’ personal and medical data stored in the cloud, while the Security Rule more specifically outlines national security standards to protect health data that is received, maintained, transmitted, or created electronically, also known as e-PHI (electronic protected health information).

Technical and physical safeguards

If you host data with a HIPAA-compliant hosting provider, it needs to have particular administrative, technical, and physical safeguards in place, as per the U.S. Department of Health and Human Services (HHS). The technical and physical safeguards most important for services provided by hosts are listed below:

  • Physical safeguards include limited facility access and control with authorized access procedures. All entities that need to be HIPAA compliant must have policies regarding the use and access of workstations and electronic media. This includes removing, transferring, reusing, and disposing of e-PHI and electronic media.
  • Technical safeguards should allow only authorized users to access e-PHI. Access control includes the use of unique user IDs, emergency access procedures, automatic logoff, and encryption and decryption.
  • Tracking logs or audit reports need to be implemented to keep a record of activity on software and hardware. These are very useful for pinpointing the source or cause of security violations.
  • Technical policies need to cover integrity controls, and measures should be in place to confirm that e-PHI has not been destroyed or altered. Offsite backup and IT disaster recovery are very important to ensure that any electronic media errors or failures can be remedied quickly, and that patients’ health information can be recovered accurately and intact.
  • Transmission or network security is the last safeguard required of HIPAA-compliant hosts, protecting against unauthorized access or use of e-PHI. This concerns all methods of transmitting data, whether over the internet, by email, or on private networks, such as a private cloud.
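As a minimal sketch of the audit-trail safeguard above, an application handling e-PHI might emit one structured record per access attempt. The field names here are illustrative, not a prescribed HIPAA schema:

```python
import json
from datetime import datetime, timezone

def audit_record(user_id: str, action: str, resource: str, success: bool) -> str:
    """Build one JSON audit-log entry for an e-PHI access attempt."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,      # unique user ID, per the access-control safeguard
        "action": action,        # e.g. "read", "update", "delete"
        "resource": resource,    # which e-PHI record was touched
        "success": success,      # failed attempts matter when tracing violations
    }
    return json.dumps(entry)

print(audit_record("dr.smith", "read", "patient/12345/chart", True))
```

Appending such records to tamper-evident, append-only storage is what makes them useful later, when pinpointing the source of a security violation.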

A supplemental act passed in 2009, the HITECH (Health Information Technology for Economic and Clinical Health) Act, supports the enforcement of HIPAA requirements by increasing the penalties imposed on organizations that violate the HIPAA Privacy or Security Rules. The HITECH Act was created in response to the development of health technology and the increased storage, transmission, and use of electronic health information.

HIPAA has driven a number of healthcare providers to search for solutions that can help them secure cloud data. Medical information is very private, and regulation keeps getting tighter, which means enforcement is also getting tighter. A number of healthcare providers have chosen to move their whole EHRs onto a HIPAA-compliant platform such as FileCloud in order to reduce their expenses and become more interoperable across various devices in a safe, HIPAA-compliant fashion.

 

Author: Rahul Sharma

images courtesy: freedigitalphotos.net/ Stuart Miles

Comparison Between FTP And The Cloud 

What Is FTP?

File Transfer Protocol (FTP) is a standard network protocol that is used to transfer computer files between a server and client on a computer network. Since it is built on a client-server model architecture, FTP uses separate control and data connections between the client and the server.

Usually, FTP users authenticate themselves with a clear-text sign-in protocol, which consists of a username and password. If the server is configured to allow it, users can also connect anonymously.
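A minimal sketch using Python’s standard `ftplib` shows the clear-text sign-in and the separate data connection in action (the host, credentials, and file paths are hypothetical):

```python
from ftplib import FTP

def fetch_file(host: str, user: str, passwd: str, remote: str, local: str) -> None:
    """Download one file over FTP. The username and password travel in
    clear text, which is part of why plain FTP is considered insecure."""
    ftp = FTP(host)                      # opens the control connection
    ftp.login(user=user, passwd=passwd)  # clear-text sign-in
    with open(local, "wb") as f:
        # RETR streams the file over a separate data connection,
        # reflecting FTP's two-channel, client-server design.
        ftp.retrbinary(f"RETR {remote}", f.write)
    ftp.quit()

# Anonymous access, where the server allows it, conventionally uses the
# user name "anonymous":
# fetch_file("ftp.example.com", "anonymous", "guest@example.com", "readme.txt", "readme.txt")
```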

What Is The Cloud?

Generally, the cloud can be defined as computing over the internet. We can also define it as the practice of using remote servers hosted on the internet to store, manage, and process data, as opposed to using local servers.

To access the cloud, a user only needs a computer with a web browser and, of course, an internet connection. Better yet, the cloud allows you to remotely access all your data and software from anywhere, using any device, without having to store anything on your computer.

The cloud and FTP are both methods of file sharing, but they have significant similarities and differences, as discussed below:

Similarities Between FTP and the Cloud

Both are online file transfer or sharing modes, with cloud computing offering more advanced and secure features. In fact, FTP can be considered an integral part of cloud computing, because cloud services also offer the option to upload and share your files from anywhere, without having to store those files on your own computer.

Both FTP and cloud services come at a cost, with prices varying depending on factors such as the servers involved. Free FTP and cloud services are also available.

Differences Between FTP and the Cloud

  • Mode of access

To use FTP, you require an FTP client to access your files and folders. On the other hand, all you need is a web browser or an application to access files and software in the cloud.

  • Cost

Setting up your own FTP server can cost around $5,000, exclusive of maintenance charges. By contrast, following reports that cloud computing prices for enterprises have dropped by two-thirds since 2014, users can enjoy cloud services for as little as $0.12 per hour.

  • Security

FTP is considered insecure, especially because it offers no traceability: there is no way to see who accessed what information. This loophole has made it easy for cybercriminals to hack into FTP servers, retrieve shared information, and leave without a trace.

Cloud services have addressed security in many ways, such as the ability to track and report access to your files. Additionally, you can create backups in case of data loss.

Why The Cloud Is A Better Option

  • More secure. You can store your confidential information with reduced risk of hacking incidents. No wonder more than 50% of businesses say that their organization currently transfers sensitive and confidential information to the cloud.
  • It is cheaper yet has more advanced features
  • Remote accessibility
  • Cloud is cost-effective for your business. It has been said 80% of cloud adopters saw improvements within 6 months of moving to the cloud. In fact, 82% of companies have reportedly saved money by moving to the cloud.

5 Reasons FileCloud Is The Best Option

FileCloud is a private cloud hosting solution, meaning that it resides on your company’s intranet or in your hosted data center and is protected behind a firewall. With this on-premises Enterprise File Sharing and Sync (EFSS) solution, your servers are safely run by your own trusted administrators, using your own infrastructure.

Here are 5 reasons why you should install FileCloud:

  1. Secure

FileCloud is governed by your corporate IT security policies, ensuring that you receive the highest possible level of protection. The biggest advantage of a private cloud is how easily it can be made compliant with Sarbanes-Oxley, PCI, and HIPAA.

While only 26% of cloud services meet the EU data residency rule, for example, FileCloud makes compliance easier by giving cloud control to your internal IT department.

  2. You Can Easily Share Files

As a cloud service, FileCloud allows you to map and sync files, which can then be accessed as though they were on your local drive. FileCloud has an intuitive UI that enables you to share your documents with your clients and edit them in real time, saving you the hassle of sending emails back and forth.

  3. You Can Easily Access Your Files

You don’t have to always travel with your laptop in order to access and edit your documents. This is because FileCloud developers have enabled the service to allow you to remotely access your files from any location, using different types of devices. Additionally, the developers have made sure that you can sync your documents and access them offline.

Better yet, you can access these documents through several avenues, including WebDAV, desktop sync, the virtual drive, and the mobile app.
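As an illustration of the WebDAV avenue, a folder listing is just an HTTP `PROPFIND` request, which Python’s standard library can issue. The endpoint URL and credentials below are hypothetical; check your FileCloud server’s documentation for the actual WebDAV path:

```python
import urllib.request

def webdav_list(base_url: str, user: str, password: str) -> str:
    """Return the raw XML multistatus listing of a WebDAV folder."""
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, base_url, user, password)
    opener = urllib.request.build_opener(urllib.request.HTTPBasicAuthHandler(mgr))
    # "Depth: 1" asks for the folder itself plus its immediate children.
    req = urllib.request.Request(base_url, method="PROPFIND", headers={"Depth": "1"})
    with opener.open(req) as resp:
        return resp.read().decode()

# Example call against a hypothetical endpoint:
# xml = webdav_list("https://files.example.com/webdav/", "alice", "secret")
```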

  4. FileCloud Offers the Best Mobile Connectivity

With an average rating of 4+ stars across platforms such as Android, iOS, and Windows Phone, the FileCloud app allows you to work effortlessly on your documents from your mobile phone, regardless of the size of the document.

  5. FileCloud Is Cheaper

Compared to other cloud services such as Dropbox, FileCloud can save your business up to $42,000 annually. For example, a company that implements FileCloud to serve 500 users will save $42,000 annually, while a company that serves 25 users will save $2,000 annually, making FileCloud 3 to 4 times cheaper than Dropbox.

Final Word

While FTP is designed to help you share files across computers, eliminating the need to carry devices like flash drives around, the cloud offers a better solution. It is more advanced, primarily designed with your file security in mind and ensures that your documents are safe, with very minimal risk of data breach.

If you are looking for a secure FTP replacement, click here to learn more about FileCloud – a modern alternative to FTP. Make your shift from FTP to the cloud seamless by installing FileCloud today. Develop your content in one location and edit it on the go from any device, anywhere in the world.

Author: Davis Porter

Why Do Universities Pick FileCloud for File Sharing and Sync?

university cloud file sharing

The cloud has graduated from a trend to be analyzed to a reality that must be embraced. Many organizations have begun implementing these new technologies to reduce costs through lower infrastructure, time, and administration overheads and improved machine utilization. We have already seen how multiple sectors can benefit from adopting cloud-based enterprise solutions, including financial services, the public sector, and charities, among others. So are institutions of higher education ready to embrace the cloud? Is this a key vertical missing out on the IT infrastructure cost reductions cloud computing can bring?

The basic idea behind cloud computing is combining computing resources, be they applications, storage, servers, or networks, in a shared pool. It is one of the newer innovations in the field of IT that can benefit enterprises as well as other organizations.

Higher Learning: Reach for the Clouds

Higher education undoubtedly plays a crucial role in molding the structure of our societies. Only competent higher education institutions can produce responsible individuals who will be instrumental in ensuring that society grows as it should. The only way to guarantee that these venerable institutions stay on par with the latest global and social trends is to constantly update their technologies, ensuring that they develop alongside the latest technological advances around the globe.

Cloud computing has completely altered the way institutions do business and serve customers. For higher education, it grants the ability to serve not just administrators and educators, but also students, who have their own technological devices, needs, and expectations. By eliminating the usual IT constraints (such as limited maintenance resources, costly infrastructure improvements, and incompatibilities between tools and systems), cloud computing frees institutions of higher learning to shift their focus from maintaining IT infrastructure to addressing student and staff needs.

Consumer grade services such as Google Apps, Google Mail or Dropbox are growing in acceptance despite being considered insecure, because they are cost saving alternatives. However, choosing a public cloud offering can create risk around security, interoperability, performance, and privacy. Universities should consider on-premise cloud deployment models to realize lower costs, scale on demand, and rapid platform deployment while minimizing risk.

The State of Higher Education in the Cloud

While administrators approached cloud computing cautiously in the past, the trend has since changed. More institutions of higher learning are adopting cloud services to address varying challenges. According to MeriTalk, the cloud market in higher education is worth $4.4 billion. This is further reflected in the findings below:

  • >20%: Percentage of higher education institutions using cloud-based platforms for core enterprise applications
    Source: Ovum
  • 31%: Of colleges that adopt cloud-based solutions do so for storage; 29% for messaging and video conferencing; 25% to allocate computing power
    Source: CDW
  • 36%: Percentage of universities that have uploaded at least 1TB of data to the cloud; 12% have uploaded between 10TB to 100TB of data
    Source: Xsede
  • 33%: Average percentage of budget that higher education IT teams expect to spend on the cloud in four years (as of 2013)
    Source: CDW

Like enterprises that are rapidly adopting cloud services, universities are in search of improved and more cost-effective ways of implementing IT services, without the frustrating cost of upgrades and maintenance. Like enterprises, universities want to realize the full potential of their data to make informed strategic decisions. And like enterprises, universities want to respond more quickly to new opportunities, without taking several months to implement business-critical software applications.

The Unique Needs of Higher Education

Despite the similarities, institutions of higher learning have a unique mission and culture that affects how decisions regarding cloud computing are made.

  • Student Centric. The focus of every higher learning institution is the students. In this digital age of mobile devices, students are likely to bring their own devices, along with expectations about when and how they wish to use them. The IT department is tasked with providing improved interoperability between student and campus platforms. To maintain high levels of communication and collaboration, students must have round-the-clock access to secure, reliable networks, and the ability to create, deliver, and share content campus-wide on any number of devices.
  • Complex finance models. Most universities have precarious and complex financial models, composed of varying combinations of research grants, public funding, investments, philanthropy and tuition.
  • Participatory decision-making model. Another important piece of the puzzle is the governance of higher education, modeled on a participatory culture that typically precludes, or at least complicates, top-down decision-making. Reaching a consensus takes time, especially when stakeholders are dealing with decisions about where to allocate limited resources. In such an environment, the need to completely understand the advantages that cloud-based service models can provide becomes apparent.

Developing a cloud strategy for higher education requires drawing certain principles from the business community. However, creating a durable and effective digital strategy also requires building a framework around the needs of your unique stakeholders, from faculty to students and alumni to board members; engaging every stakeholder early in the process; and creating a strategy that addresses the IT challenges specific to higher education and your campus.

Private clouds are built for the exclusive use of one organization, offering the utmost control over quality of service, security and data.

The University of South Carolina Chooses FileCloud

The director of IT services at the University of South Carolina was in search of a file sharing solution with the following requirements.

  • Store large training videos and share with athletes and coaches
  • Large storage at affordable price
  • On-premise cloud to protect intellectual property and prevent IP from reaching the public cloud
  • Mobile apps to support BYOD (bring your own device)
  • Granular controls to provide various levels of permission

After trying FileCloud for 15 days, they decided it was the best solution for the university. Read the full case study here.

Why FileCloud?

  • Reliable Remote Access

In this digital age of consumer technologies, social applications, and smartphones, expectations from all constituents (staff, faculty, and students) have never been higher. The need for quick, informed, 24/7 service has become apparent. Students are bringing multiple devices to campus, and in such an environment, quick access to content and to collaboration tools that augment research, teaching, and learning is of great importance.

FileCloud enables all constituents to remotely access data from anywhere, using any mobile device. Real-time sync across network folders allows for remote access, and with comprehensive support for WebDAV, mobile apps, a virtual drive, desktop sync, and web access, end users get a seamless experience regardless of the device they are using. FileCloud clients are available for iOS, Android, BlackBerry, and Windows. Additionally, remote files can be opened and edited via third-party applications on your device, and stored files can be shared via link or email attachment.

  • Expanded Capabilities

FileCloud enables even smaller institutions with limited resources and leanly-staffed IT departments to leverage the resources of the cloud; providing more security for critical systems and more robust disaster recovery and business continuity. With features like automated server backup, high availability and multi-tenant setups, FileCloud can easily be scaled to large deployments. Transitioning to FileCloud gives IT staff a framework for security programs, disaster recovery and core business continuity.

  • Cost Containment

FileCloud offers simple pricing options. Adopting a cloud strategy does not automatically mean dramatically reduced IT costs; however, with proper management, costs can be contained over time. With unlimited client accounts for external contractors and vendors, you only pay for employee accounts.

Click here to find out why FileCloud is the leading private file access, sync and sharing solution for institutions of higher learning.

Conclusion

The question of whether institutions of higher learning should adopt the cloud has already been answered: by students, staff, and faculty, who have already started using cloud services in one form or another. IT teams in higher education should stop asking if and start asking when and how they can make the transition. The successful adoption of cloud services hinges on finding a partner who understands higher education and the sea of benefits the cloud can deliver.

Learn more about FileCloud, a leading enterprise file sharing and file sync software.