Archive for the ‘Private Cloud’ Category

Best Alternatives to OwnCloud in 2020

One of the latest enterprise IT trends is private cloud storage: a self-hosted platform that allows companies to enjoy all the features they can get from a traditional cloud storage service like Dropbox. However, everything is controlled in-house, which means you have control over the security of your data. This is particularly important today, as there have been several security breach incidents involving high-profile cloud storage platforms.




Whether you invest in hardware to build a server or you rent server space, you’ll need software to create your private cloud storage platform. OwnCloud is arguably the number one software for self-hosted cloud storage. This open-source software is used by over 50 million people around the globe. OwnCloud provides everything you need to make the most of your private cloud storage platform, including file and folder sharing, sync, access on all devices, editing documents from the web, support for third-party storage platforms, and much more.

While OwnCloud is a good option for companies looking to create a private cloud storage platform, it has its downsides. For example, you cannot run OwnCloud on Windows servers. This is a major issue, since many companies use Windows Server instead of Linux. With that said, we’ll explore some of the top alternatives to OwnCloud in this post. Many of these platforms are a better option than OwnCloud because they provide all the same features and more. They are also more affordable.


FileCloud is one of the best alternatives to OwnCloud. Not only does it come with all the standard features that you will get from OwnCloud, but FileCloud also supports both Windows and Linux servers. Apart from support for Windows, there are many other things you can enjoy from FileCloud, which OwnCloud does not support. This includes multi-tenancy hosting, the ability to run applications on Docker, local storage support, endpoint backup, support for Microsoft Office Add-on (you can edit documents on the cloud with Office Online Server), free unlimited users on the cloud, built-in ransomware protection, and so much more.

One of the main perks of choosing FileCloud is its simple user interface. If you are worried that the less tech-savvy people in your organization may struggle to get the hang of the UX of your private cloud storage software, then choose FileCloud. FileCloud is designed with a focus on security and accessibility. There are many security and governance compliance tools at your disposal on this platform. In addition, you’d spend less when you use FileCloud compared to OwnCloud. While you need to pay around $3,600 annually for 50 users on OwnCloud, the rate for the same number of users is just $2,500 on FileCloud.

Recently, FileCloud also launched a Community Edition in which five users get the full power of FileCloud, along with an unlimited number of limited user accounts for free. The Community Edition costs only $10 per year (all proceeds are donated to charity).


NextCloud is the most popular open-source software for private cloud storage, and the number one option many people consider apart from OwnCloud. NextCloud was launched in 2016 by Frank Karlitschek, the founder of OwnCloud, after he left the company. As a result, NextCloud has many of the same features you will find in OwnCloud. However, it gives you a bit more, such as online document editing, file locking to prevent anyone from editing a file, and audio/video chat.

NextCloud prioritizes security. The software comes with a host of powerful tools that allow you to take control of the safety of your data. Whether you want to use NextCloud at home or for your business, there is a version of this software for you, optimized to ensure that you get the best out of your cloud storage platform. There are three different subscription packages for NextCloud, and the features you get vary depending on which option you choose. The Standard plan for 50 users costs $3,700 annually, and the Premium plan for 50 users costs $5,400 annually.

Also check our detailed feature comparison – OwnCloud vs NextCloud.


Seafile is open-source private cloud storage software that many people rate above OwnCloud. One of the great features of Seafile is that it is truly open source; you can find its source code on GitHub. Seafile has the same features you will find in other self-hosted cloud storage software. However, it also gives you more, such as the ability to sync and encrypt each library separately, resume interrupted transfers, create password-protected links to share with clients, receive real-time notifications, access files offline, and edit documents with a WYSIWYG Markdown editor.

Seafile is designed to support productivity in organizations. The Community Edition of Seafile is free. However, the features are limited. The Professional Edition of this software costs around $2,400 for 50 users.


Pydio is another excellent alternative to OwnCloud. This self-hosted cloud storage software used to be known as AjaXplorer. Pydio is one of the largest open-source projects in the world, and there’s a lot in store for you to explore. Since rebranding as Pydio, there have been attempts to revamp the software, which has not gone down well with some users, who complained about bugs. Nonetheless, Pydio is one of the best private cloud storage solutions out there. Pydio supports storage and syncing with different devices, and you can choose which files to sync. What’s more, Pydio allows you to share files publicly online with people who don’t have access to the platform. The software is compatible with Mac, Windows, and Linux.

Pydio has some great security provisions, including EncFS encryption (for data stored on your server) and SSL encryption (for data in transit). Compared to OwnCloud, Pydio is a better option for people who work with large files. Another major advantage of this software is its flexibility. You can customize it based on your requirements. Pydio provides GDPR-compliant logs and other tools to ensure you can meet governance requirements.

The basic version of Pydio is free with limited features and is recommended for home users. The annual subscription rate for up to 50 users is $2,750.

What are the Differences Between On-Premise, Hybrid and Online File Sharing?

File sharing is vital for every organization. In the digital age, there are different options open to organizations – on-premise file sharing, online or cloud file sharing, and hybrid file sharing. The question is which of these options is most advantageous for your organization.


On-Premise File Sharing

As you can probably deduce from its name, on-premise file sharing involves hosting files on an organization’s own IT infrastructure. This method of file sharing was very popular before the advent of cloud computing, and it is still widely used today.

With on-premise file sharing, apps and files are shared between computers through the local area network (LAN). Your IT team takes full responsibility for the network’s security as well as its performance. On-premise file sharing is quite straightforward. Some companies choose this option because of the sensitive nature of the information that they deal with, for example, financial firms. While on-premise file hosting and sharing can be safe, it is not entirely immune to risks. This is why it is essential to use the latest security protocols including encryption as well as antivirus and anti-ransomware protection.

Advantages of On-Premise File Hosting and Sharing

  1. You have complete control over your data without leaving it in the hands of a third-party.
  2. File sharing is faster and does not depend on an internet connection.
  3. Some apps work better on-premise than over the cloud.
  4. On-premise file hosting and sharing provide a lot of customization options.
  5. If your server is not connected to the internet, the chances of a data breach are drastically reduced.
  6. On-premise file storage and sharing make it easier to fulfill regulatory requirements in some sectors.

Disadvantages of On-Premise File Hosting and Sharing

  1. It can be expensive to get the necessary equipment to create a data center, which is why this is not the best option for some startups. Also, you have to maintain an IT team and need physical space for your data center, which adds up to more cost.
  2. On-premise file hosting and sharing make you more vulnerable to data loss if something happens to your data center.

Online File Sharing

Online file sharing is the trend today. Chances are that you are depending on cloud storage one way or another. Cloud file sharing involves hosting and sharing your files on a remote server. You can easily access your data through any device connected to the internet.

Online file sharing is a relatively inexpensive option as the duty of securing and managing the IT infrastructure is in the hands of your cloud service provider. There are many cloud platforms available today. Despite recent security incidents, cloud storage can be very safe if the proper security measures are deployed. For example, data must be encrypted while in storage and during transfer. It should be noted that cloud file sharing is not only limited to documents but also applications. You can access an application over the cloud without having to install it on your device.

Advantages of Online File Hosting and Sharing

  1. It makes your work more flexible. With online file hosting and storage, you can work remotely and hire people from different parts of the world.
  2. It is not costly to set up and operate. You do not have to maintain a large IT department or spend on equipment for an in-house data center.
  3. You can access your data using any device.
  4. Cloud platforms can be challenging to hack.
  5. You can easily backup data.

Disadvantages of Online File Hosting and Sharing

  1. If you do not choose a cloud platform that prioritizes security, your data may be at risk.
  2. You need an internet connection to access your data.

Hybrid File Sharing

Hybrid file sharing combines features of on-premise and online file sharing. This means you’d still maintain your in-house IT infrastructure. However, you also use cloud storage. This may seem redundant, but there are many advantages of hybrid file storage and sharing. For example, with cloud sharing, your workers operating remotely can easily access files.

On the other hand, workers operating within your office premises can send and receive files faster using LAN connection. The cloud platform could also serve as a backup for your data. Alternatively, you can choose to store sensitive data on-premises and other files on the cloud.

Essentially, hybrid file sharing gives you the best of both worlds. This option is usually reserved for large organizations that have the necessary resources.

Advantages of Hybrid File Hosting and Sharing

  1. It improves flexibility in organizations, which translates to better efficiency.
  2. It provides strong protection against data loss.
  3. Hybrid file sharing gives you more options to manage the security of your files.

Disadvantages of Hybrid File Hosting and Sharing

  1. It is a costly option. The cost of setting up and maintaining an in-house data center as well as a cloud storage account can be enormous for some organizations.

Which Option is Best?

There is no best option when selecting how to manage data in your organization. As you can see, each option has its advantages and disadvantages. The method to choose depends on your needs. Some organizations may be best suited for on-premises file sharing, while others may do best with cloud sharing or hybrid cloud sharing. The first step to determine which option to choose is to consult an IT expert to carry out an assessment of your organization’s data management needs.

On the surface, hybrid file sharing may be the best option. Most modern organizations tend to go for hybrid file sharing because it makes it easier to hire offshore experts in addition to all the other benefits it provides. However, the cost involved makes this unsuitable for small startups.

Whether you choose on-premise, hybrid or online file sharing, FileCloud is the best partner to help you make the most of your data management system. We provide support all the way – from implementation to usage. This includes encryption, two-factor authentication, data loss prevention, and much more.

Author : Rahul Sharma

Why US Government Organizations Should Move to Private Cloud



Since its inception, cloud computing has transformed the business landscape in unforeseen ways. While the private sector has been capitalizing on the multiple benefits of cloud computing for a while now, government organizations have also started to aggressively embrace the cloud. As it stands, the IT environment of most government organizations is typified by poor asset utilization, duplicative processes, fragmented demand for resources, poorly managed environments, and prolonged delays in getting things done. The end result is an inefficient system that has a negative impact on an organization’s ability to serve the American public. The innovation, agility, and cost benefits of a private cloud computing model can significantly enhance government service delivery. For government organizations, a move to the cloud directly translates to public value by improving operational efficiency and response time to constituent needs.

The Cloud First Initiative

In February 2011, the first Federal CIO, Vivek Kundra, announced the Cloud First policy. It was presented as a crucial aspect of government reform efforts to achieve operational efficiencies by cutting waste and helping government agencies deliver constituent services in a more streamlined and faster way. Up to 2014, the adoption rate was slow; a 2014 report by the U.S. Government Accountability Office showed that only 2% of IT spending went toward cloud computing that year. However, the tide has shifted in recent years. Agencies across the federal government have espoused cloud computing solutions and architectures to facilitate services to constituents and reduce reliance on large-scale, traditional IT infrastructure investments.

Currently, AWS reports that GovCloud has grown 221% year-over-year since it was launched in 2011. Microsoft also claims that Microsoft Cloud for Government, which includes Office 365 Government, Dynamics CRM, and Azure Government, has attracted over 5.2 million users. Despite its palpable success, Cloud First has had its share of critics, who have blamed the perceived slow adoption on the lack of federal technical experience in cloud deployments. Below are some compelling reasons why government agencies should adopt a private cloud computing model.

I. Reduced Infrastructure Costs

By consolidating server footprints via virtualization and cloud efforts, government agencies can significantly reduce the cost of IT ownership. Agencies that operate in-house IT gear have to deal with data center security on top of hardware, software, and network maintenance. These are all resource-intensive workloads that cloud vendors handle on behalf of their clients. The minute an agency offloads all of it, it frees itself up to focus on the particular capabilities and features it has to offer. Private cloud computing solutions are typically bundled with asset management, threat and fraud prevention and detection, and monitoring programs. Adopting a private cloud model enables government agencies to become agile and responsive to changing business conditions.

II. Big Data Consensus

IDC reports that approximately 2.5 exabytes of data are produced daily. Government agencies have a ton of data, and having a human look at all of it is virtually impossible. The old model of data distribution greatly diminishes that data’s value to end-users, and ultimately to the taxpayer. A private cloud computing model is the answer to big data analysis. Tools that utilize artificial intelligence, machine learning, and natural language processing can quickly and accurately examine terabytes of data for anomalies and patterns, helping federal officials make informed decisions. Additionally, once data has been made available via the cloud, it is readily accessible, meaning resource requests that previously took months to process can be handled in a short time.

III. Data Sovereignty and Regional Concerns

When it comes to the cloud, ownership of data assets raises questions. Erosion of information asset ownership is undoubtedly a potential concern when resources are moved to any external system, public cloud included. There is an inherent difference between being responsible for data as a custodian and having complete ownership of it. Although legal data ownership stays with the originating data owner, a potential area of concern with a public cloud deployment is that the cloud vendor may acquire both roles. The EU has been at the forefront of clearing up the confusion, and on May 25, 2018, it will introduce a regulation that establishes new rules to help its citizens retain full control over personal data.

Another area of concern is the complex legal, technical, and governance issues that surround hosting government data in varying jurisdictions. Governments are known to like concrete borders, but the cloud is global; it transcends physical spaces and borders. Since the services exist globally, and users can interact and share data remotely, which states or municipalities are responsible for the data? Whose laws apply, or don’t apply, to any given exchange?

US government agencies have to adopt cloud strategies aimed at retaining sovereignty over government data. For any government agency seeking flexible and scalable data center solutions, a private cloud deployment can tie together a range of integrated, end-to-end solutions that leverage cloud capabilities. With a private cloud, the complexity of legal and government regulations is taken out of the equation. The data is maintained by the government agency’s employees and is made available via internally managed technology platforms or SaaS solutions like FileCloud. The ownership or jurisdiction of the data is no longer in question.

IV. You Deployed it, Now Secure It

Security is typically the top concern for federal IT managers when it comes to migrating applications and data to the cloud. Governments understand that information is power and data is a crucial asset. Federal agencies hold some of the globe’s largest data repositories, ranging from tax, employment, and weather data to agriculture and surveillance data. A recent study by MeriTalk revealed that only one in five of the federal IT professionals surveyed believe that the security offered by cloud providers is sufficient for federal data. However, the same study also concluded that 64 percent of federal IT managers are more likely to place their cloud-based applications in a private cloud.

Why private cloud? Control. A private cloud deployment meets strict security needs with more resource control and data isolation. Government organizations have to send and receive sensitive information while ensuring it’s only accessible to authorized users. Additionally, they have to maintain control over each user’s read and write rights to that data. Public cloud solutions simply don’t fit the bill for most government agencies, because the deployed applications and data have to remain completely under agency control. Private cloud solutions enable government agencies to leverage their existing security infrastructure while staying in control of their data. Since the deployment functions within your existing framework, the need to reinvent government processes or security policies is eliminated.

FedRAMP (the Federal Risk and Authorization Management Program) standardizes security services and streamlines assessments so that any cloud vendor being considered by federal agencies is only evaluated once at the federal level. Safeguarding the security and integrity of data falls upon individual government organizations. A private cloud model gives organizations better performance and security control over the physical infrastructure that underlies their virtual servers.

V. Cross Agency Collaboration

Government agencies require a digital terrain through which to collaborate comfortably and confidently, irrespective of agency or department. For example, different agencies may need to share compliance data, regulatory documents, case information, or disaster response plans. For optimal collaboration, these resources have to be accessible to workers within their respective organizations, to outside contractors, and, when needed, to the general public. Government agencies can leverage the security infrastructures and on-premises directories of a private cloud, ensuring that sensitive data remains within the control of the organization and that only authorized persons have access to it. A private gov-cloud allows government organizations to collaborate both internally and across extended ecosystems in a compliant, secure, and auditable manner.

VI. Citizen Service Delivery

Most local, state, and federal government agencies offer a variety of citizen services. Cloud computing helps in the delivery of those services and subsequently improves the lives of citizens at all those levels. For example, enabling constituents to monitor water and energy consumption encourages them to be more vigilant about their usage. Quick and transparent access to service requests, such as loans and applications, improves awareness and inclusion. A private cloud computing model is an ideal way of empowering and informing citizens.

In Closing

Cloud computing represents an amazing opportunity to drastically change how government organizations manage, process, and share information. Addressing all the challenges associated with cloud adoption can seem daunting, especially if a government organization lacks expertise in cloud migration and deployment. Nevertheless, it’s clear that government agencies wish to maintain high standards of privacy, security, and cost management in their pursuit of transforming operations into a flexible, dynamic environment. The ideal solution for them is a private cloud.

Author: Gabriel Lando


Backup Mistakes That Companies Continue to Commit



Imagine a situation where you wake up, reach your office, and witness chaos because your business applications are no longer working. And that’s because your business data no longer exists! Information about thousands of customers, products, sales orders, inventory plans, pricing sheets, contracts, and a lot more is no longer accessible. What do you do? Well, if your enterprise has been following data backup best practices, you’ll just smile and check the progress of the data restoration. Alas, problems await, because your people might have committed one of the common yet costly mistakes of data backup. Read on to find out.


Fixation on the Act of Backup

Sounds weird, but that’s what most enterprises do. Data engineers, security experts, and project managers are all so focused on the act of backup that they lose track of the eventual goals of the activity. Recovery time objectives (RTO) and recovery point objectives (RPO) should govern every step in the data backup process. Instead, companies focus only on ensuring that data from every important source is included in the backup.

Nobody, however, pays much heed to backup testing, which is one of the key aspects of making your data backup process foolproof. Instead, companies face a need for data restoration, only to realize that the backup file is corrupt, missing, or not compliant with the prerequisites of the restoration tool.

The solution: make rigorous backup testing a key element of your backup process. There are tools that execute backup tests in tandem with your data backup. If you don’t wish to invest in such tools yet, make sure you conduct backup testing at least twice a year.
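A basic form of backup testing is verifying that a backup copy still matches its source. As a minimal sketch (the function names are our own, and a byte-for-byte checksum check is only one part of real testing, which should also include end-to-end restore drills), such a check might look like this:

```python
import hashlib
from pathlib import Path

def file_checksum(path: Path) -> str:
    """Compute a SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(source: Path, backup: Path) -> bool:
    """Return True if the backup matches the source byte-for-byte."""
    return file_checksum(source) == file_checksum(backup)
```

A check like this catches silent corruption of backup files, but it does not replace periodically restoring a backup into a test environment.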

Not Adopting Data Backup Technologies

What used to be a tedious and strenuous task for administrators and security experts a few years back can now be easily automated using data backup tools. These tools are much more reliable than manual backup operations. What’s more, there will not be the dreaded problems such as those associated with data formats, etc., when the time for restore arrives.

Data backup tools offer scheduled backups, simultaneous testing, and execution of backup and restore in sync with your RTO and RPO goals. Of course, businesses must understand the data backup tools available in the market before choosing one.


Unclear Business Requirements (In Terms Of Data Backup And Restore)

Take it from us; one size won’t fit all organizations or processes when it comes to data backups, whether manual or controlled via a tool. Project managers must understand the business requirements around data to be able to plan their data backup projects well. The backbone of a successful data backup process and plan is a document called a recovery catalog. This document captures all necessary details, such as:

  • The different formats of data owned by the business
  • The time for which every backup needs to be available for possible restore operations (RPO)
  • The priority of different data blocks from a recovery perspective (RTO)

The recovery catalog will go a long way in helping you identify the tools you need to successfully manage data backup and recovery. It will also help you design better processes and improve existing ones across the entire lifecycle of data backup.

Right Requirement, Wrong Tool

Your CIO’s expectations of your team are governed by the business’s expectations of the entire IT department. There’s nothing wrong with the expectations and requirements; it’s possible, however, that the tools you have are not well suited to fulfilling them.

For instance, in an IT ecosystem heavily reliant on virtualization, the virtualization tools already have built-in cloning capabilities. However, these backups can take up disk space almost equal to the entire environment. If you need to change your VMs often, your storage will soon be exhausted as you keep making new copies of updated environments.

If you have clarity on the most important business applications, it becomes easier to work with IT vendors and shortlist data backup tools that can easily integrate with these applications. This could be a massive boost to your enterprise’s data backup capabilities.

Failure to Estimate Future Storage Needs

No doubt, the costs of data storage are on their way down, and chances are they’ll continue to fall. However, almost every business buys storage based only on its estimate of what’s needed for primary data. It’s commonplace for companies to completely ignore the fact that their data backups will also need space. This is why it’s so important to estimate data storage requirements after accounting for your data backup objectives. While doing a manual backup, for instance, if the executors realize that there’s not much space to play with, it’s natural for them to leave out important data. Also account for the possibility of increased backup frequencies in the near future.
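To make this concrete, here is a rough back-of-the-envelope sketch of projecting backup storage needs. The growth rate and retention figures below are made up for illustration, and compression and incremental backup schemes are deliberately ignored:

```python
def backup_storage_needed(current_gb: float, annual_growth: float,
                          retained_fulls: int, years: int = 1) -> float:
    """Estimate backup storage (GB) after `years` of data growth,
    keeping `retained_fulls` full backup copies. Ignores compression
    and incremental/differential schemes, so it errs on the high side."""
    projected = current_gb * (1 + annual_growth) ** years
    return projected * retained_fulls

# 500 GB of data growing 30% a year, with 4 retained weekly full backups:
print(round(backup_storage_needed(500, 0.30, 4), 1))  # 2600.0
```

Even a crude estimate like this makes it clear that backup space can dwarf the primary data footprint once retention is accounted for.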

Not Balancing Costs of Backup with Suitability of Media

It’s a tough decision, really, to choose between tape and disk for data storage. While tapes are inexpensive, plentiful, and pretty durable from a maintenance perspective, you can’t really store essential systems data and business-critical applications’ data on tape, because the backups are slow. Estimate the cost of time lost to slow tape backups while deciding on your storage media options. Often, the best option is to store old and secondary data on tape and use disks to store more important data. That way, you can execute a data restoration and complete it sooner than if you depended purely on tape media.
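One rough way to quantify that speed trade-off is to estimate transfer time from sustained throughput. The throughput figures below are purely illustrative; real numbers depend on your specific hardware and network:

```python
def transfer_hours(data_gb: float, throughput_mb_s: float) -> float:
    """Hours needed to write or restore `data_gb` of data at a
    sustained throughput of `throughput_mb_s` MB/s (overheads ignored)."""
    return data_gb * 1024 / throughput_mb_s / 3600

# Restoring 2 TB at an assumed 160 MB/s (tape-like) vs 500 MB/s (disk array):
print(round(transfer_hours(2000, 160), 1))  # 3.6
print(round(transfer_hours(2000, 500), 1))  # 1.1
```

Multiplying the hours saved per restore by the cost of downtime gives a concrete basis for comparing media options.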

Concluding Remarks

There’s a lot that can go wrong with data backups. You could lose your backed-up data, run out of space for it, realize the data backup files are corrupted when you try to restore them, and in general, fail to meet the RTO and RPO goals. To do better, understand what leads to these mistakes, and invest time and money in careful planning to stay secure.


Author: Rahul Sharma

Personal Data Breach Response Under GDPR


Data security is at the heart of the upcoming General Data Protection Regulation (GDPR). It sets strict obligations on data controllers and processors in matters pertaining to data security, while concurrently providing guidance on best data security practices. And for the first time, the GDPR will introduce specific breach notification guidelines. With only a few months to go until the new regulations come into effect, businesses should begin focusing on data security; not just because of the costs and reputational damage a personal data breach can lead to, but also because under the GDPR, a new breach notification regime will mandate the reporting of certain data breaches to affected individuals and data protection authorities.

What Constitutes a Personal Data Breach Under GDPR?

The GDPR defines a personal data breach as a security breach that leads to the unlawful or accidental loss, destruction, alteration, or unauthorized disclosure of personal data stored, processed, or transmitted. A personal data breach is by all means a security incident; however, not all security incidents fall under the same strict reporting regulations as a personal data breach. Despite the broad definition, this distinction is not unusual in data security laws that require breach reporting. HIPAA, for example, makes the same distinction at the federal level for medical data. It aims to prevent data protection regulators from being overwhelmed with breach reports.

By limiting breach notifications to personal data (EU speak for personally identifiable information, or PII), incidents that solely involve the loss of company data or intellectual property will not have to be reported. The threshold for establishing whether an incident has to be reported to a data protection authority depends on the risk it poses to the individuals involved. High-risk situations are those that can potentially lead to significant detriment; for example, financial loss, discrimination, damage to reputation, or any other significant social or economic disadvantage.

…it should be quickly established whether a personal data breach has occurred and to promptly notify the supervisory authority and the data subject.

– Recital 87, GDPR

If an organization is uncertain about who has been affected, the data protection authority can advise and, in certain situations, instruct it to immediately contact the affected individuals if the security breach is deemed high risk.

What Does The GDPR Require You to Do?

Under the GDPR, the roles and responsibilities of processors and data controllers have been separated. Controllers are obliged to engage only processors capable of providing sufficient assurances that they will implement appropriate organizational and technical measures to protect the rights of data subjects. In the event of a data breach that affects the rights and freedoms of those data subjects, the organization should report it without undue delay and, where practicable, within 72 hours of becoming aware of it.

The data processor is mandated to notify the controller the moment a breach is discovered, but has no other reporting or notification obligation under the GDPR. However, the 72-hour deadline begins the moment the processor becomes aware of the data breach, not when the controller is notified of the breach. A breach notification to a data protection authority has to at least:

  1. Have a description of the nature of the breach, which includes the categories and number of data subjects affected.
  2. Contain the data protection officer’s (DPO) contact information.
  3. Have a description of the possible ramifications of the breach.
  4. Have a description of steps the controller will take to mitigate the effect of the breach.
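The four required elements above could be captured in a simple record; a minimal sketch with hypothetical field names (the GDPR prescribes the content of a notification, not its format):

```python
from dataclasses import dataclass

# Hypothetical structure mirroring the four required elements of a
# breach notification under Article 33; the field names are illustrative,
# not prescribed by the regulation.
@dataclass
class BreachNotification:
    nature_of_breach: str      # categories and approximate number of data subjects affected
    dpo_contact: str           # data protection officer's contact information
    likely_consequences: str   # possible ramifications of the breach
    mitigation_measures: str   # steps the controller will take to mitigate the effect
    complete: bool = False     # the information may be provided in phases

notification = BreachNotification(
    nature_of_breach="Unauthorized disclosure of ~1,200 customer email addresses",
    dpo_contact="dpo@example.com",
    likely_consequences="Possible phishing attempts against affected customers",
    mitigation_measures="Credentials rotated; affected mailboxes monitored",
)
print(notification.complete)  # False until all details have been supplied
```

Because the information can be provided in phases, a flag like `complete` lets the controller track when the final details have been submitted.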

The information can be provided in phases if it is not available all at once.
If the controller determines that the personal data breach can potentially put the rights and freedoms of individuals at risk, it has to communicate any information regarding the breach to the data subjects without undue delay. The communication should plainly and clearly describe the nature of the personal data breach and at least:

  1. Contain the DPO’s contact details or a relevant contact point.
  2. Have a description of the possible ramifications of the breach.
  3. Have a description of measures proposed or taken to mitigate or address the effects of the breach.

The only exception is if the personal data has been encrypted and the decryption key has not been compromised; in that case, there is no need for the controller to notify the data subjects.

The ideal way for companies to handle this GDPR obligation is not only to minimize breaches, but also to establish policies that facilitate risk assessment and demonstrate compliance.

The GDPR stipulates that records must be kept of all personal data breaches, regardless of whether a breach needs to be reported. Said records have to contain the details of the breach, any consequences and effects, and the follow-up actions taken to remedy the situation.

Should Ransomware Attacks Be Reported?

Ransomware typically involves the ‘hijacking’ of corporate data via encryption; payment is then demanded in order to decrypt the ransomed data. Under the GDPR, a ransomware attack may be categorized as a security incident, but it does not necessarily cross the threshold of a personal data breach. A ransomware attack would only be considered a personal data breach if there is a backup but the outage directly impacts users’ rights and freedoms, or if there is no backup at all. Ideally, a ransomware attack where the ransomed data can be quickly recovered does not have to be reported.

What Are the Consequences of Non-Compliance?

A failure to comply with the GDPR’s breach reporting requirements will not only result in negative PR, constant scrutiny, and possibly loss of business, but will also attract an administrative fine of up to €10 million or up to two percent of the total global annual turnover of the preceding financial year. Additionally, failure to notify the supervising authority may be indicative of systematic security failures. That would constitute an additional breach of the GDPR and attract further fines. The GDPR does provide a list of factors the supervising authority should consider when imposing fines; chief among them is the degree of cooperation by the data controller with the data protection authority.

In Closing

Data breach notification laws have already been firmly established in the U.S. These laws are designed to push organizations to improve their efforts in the detection and deterrence of data breaches. The regulators’ intention is not to punish but to establish a trustful business environment by equipping organizations to deal with security issues.

Author: Gabriel Lando

image courtesy of freepik

FileCloud Unveils Enterprise Edition, Deploys Secure Collaboration and Storage for Large Organizations

  • Supports increased need for robust cybersecurity protection and compliance measures
  • Purpose built for organizations with 1000+ users
  • Allows businesses to keep their data on infrastructure of choice including public, private and hybrid clouds

FileCloud, a cloud-agnostic Enterprise File Sharing and Sync (EFSS) platform, today announced the release of FileCloud Enterprise Edition. Designed to enable IT administration, management and compliance across enterprise-level systems with over 1000 users, FileCloud Enterprise Edition simplifies data security in an increasingly cloud-based business environment.

“Managing secure collaborations across enterprise environments is critical and keeps many CIOs awake at night,” said Madhan Kanagavel, CEO of FileCloud. “With just a few clicks, FileCloud’s Enterprise Edition helps IT administrators configure user settings, integrate branch office file servers, manage policies and deploy apps across any large organization. Innovations like these are the reasons why enterprises prefer FileCloud over other consumer-oriented collaboration solutions.”

FileCloud solves the challenge of losing control over intellectual property and information assets. Unlike other centralized file Software-as-a-Service (SaaS) offerings, FileCloud gives businesses complete control over their data by allowing them to keep it on any infrastructure of their choice, including public, private and hybrid clouds. Mobile apps increase productivity and flexibility without the need to worry about data integrity. Shared documents synchronize and can be locked or shared with expiration dates, and users can remotely access the same drives that are available in the office.

Enterprise Edition features and services include:

  • Mass Deployment: Deploy and configure a fleet of end user computing devices (desktops, mobile devices, and file servers) in a few clicks from a centralized management dashboard.
  • Remote Health Monitoring: Includes real-time monitoring of employee devices and actions with a detailed audit trail and delegation capabilities to prevent data loss and detect security threats.
  • Compliance: Enforce policies and regulatory requirements (supports GDPR, HIPAA and FINRA compliance) across employees. Also offers federated search and eDiscovery capabilities to find sensitive data across the user base.
  • Professional Services: Offers a wide range of technical assistance in implementing large deployments successfully, including deep technical help in designing high availability, branch office integrations, clustering and multi-cloud deployments. Services can also help in configuring Single Sign-On and integrating with other systems like Active Directory.

Penta, a global IT services company with offices in Switzerland, UAE, and Japan, mainly servicing financial institutions, has deployed FileCloud to solve a number of business challenges. “One of our biggest challenges as an IT service company is to set up and manage corporate file sharing securely across hundreds of file servers, computers, and mobile phones,” said Shadi Jaber, IT Manager at Penta. “FileCloud has the right features and toolset that makes this easy.”

Many large organizations including Fiserv, NASA, Swiss Federal Institute of Intellectual Property, and the City of San Diego use FileCloud for enterprise file sharing and collaboration. Try for free.

Data Security Questions Every Enterprise Should Ask

Over the past decade, cloud computing has transitioned from being a buzzword to becoming a staple technology for most enterprises, mainly driven by the cloud’s accessibility, superior flexibility, and capacity compared to mainstream computing and storage techniques. However, just like mainstream data sharing and storage methods, cloud computing has its fair share of data security issues. Mitigating data security risks is essential to creating a level of comfort among CIOs about migrating data and applications to the cloud. The decision to transition to the cloud has to depend on how sensitive the data is and on the security guarantees the cloud vendor provides.

Is your data safe in the hands of a cloud service provider?

In today’s exceedingly mobile world, enterprises are heavily relying on cloud vendors, and allowing remote access to more devices than ever before. The end result is a complex network that requires higher levels of security. The only way organizations can maintain the availability, integrity, and confidentiality of these different applications and datasets is by ensuring their security controls and detection-based tools have been updated to work with the cloud computing model. Whenever data is stored in the cloud, the main point of focus is typically the security of the cloud provider and hosting facility. However, this focus is usually at the expense of how the data itself is handled. This begs the question, do you trust the cloud vendor’s technology? Do you trust their employees? Do you trust their safeguards? Are you completely sure that if their back was against the wall they would not sell or compromise any of your data?

The fact of the matter remains that, once you move your data to a public cloud platform, you can no longer exercise your own security controls. Outsourcing also introduces a costly threat to intellectual property in the form of digital information like engineering drawings, source code, etc. An organization has to give its cloud service provider access to important IP assets, which are vital to the organization’s core business. Exposing invaluable information to third parties presents a significant security risk. In most cases, migrating to the cloud means you have no option but to trust the vigilance, knowledge, and judgment of your chosen vendor.

As cloud-based solutions like Dropbox and Google Drive become more popular within the business setting, enterprises have to come to grips with the fact that issues like loss of control over confidential data are a looming security threat. Even though cloud vendors implement several security measures to isolate tenant environments, the organization still loses some level of IT control, which equates to risk, as sensitive data and applications no longer reside within a private, physically isolated data center. Is the business value worth the risk?

Why is Metadata Security Important?

In a nutshell, metadata is data about data. The bigger question is whether or not metadata is personally identifiable. If enough of it is linked together, a detailed profile of an individual or organization can be created; enough to personally identify them. Most IT security experts agree that metadata typically contains sensitive information, hidden from obvious view but easily extractable. Metadata poses a significant data leak risk because employees are often not even aware of its existence. Whenever a request is made to store or retrieve data from a cloud storage server, the request and subsequent response contain metadata about both the request and the data itself. Since the organization has little to no control over this metadata, there is no way to guarantee its security.
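As a small illustration of how much metadata quietly accompanies even an ordinary file, the Python standard library can read it back from any path. This sketch shows filesystem metadata (size, timestamps, owner), a simpler cousin of the request metadata a cloud storage server records:

```python
import os
import tempfile

# Create an ordinary file and inspect the metadata the filesystem keeps
# alongside it -- none of which the author typed into the file itself.
with tempfile.NamedTemporaryFile(delete=False, suffix=".txt") as f:
    f.write(b"quarterly forecast")
    path = f.name

info = os.stat(path)
print(info.st_size)    # file size in bytes
print(info.st_mtime)   # last-modified timestamp
print(info.st_uid)     # numeric owner id (meaningful on POSIX systems)
os.remove(path)
```

Timestamps and ownership alone can reveal who worked on what, and when, even if the file contents stay encrypted.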

What Happens in the event of a data breach?

As cloud adoption rates increase, cloud providers are increasingly becoming attractive targets for cybercriminals because of the huge amounts of data stored on their servers. Access to unencrypted metadata alone can be enough to count as a full-fledged breach. The severity of a data breach depends on the sensitivity of the data being exposed. Breaches that involve trade secrets, health information, and intellectual property are usually the most dire. It is worth noting that cloud vendors are not subject to the same data breach disclosure laws as federal agencies, banks, and other entities. So if a breach does occur, it may never be publicized or associated with the vendor.

Despite numerous efforts from public cloud providers to implement stringent security measures to curb the risk of data breaches; the burden of responsibility for data security ultimately falls on the organization and a breach will have critical financial and legal consequences.

Who Controls Your Data?

Ensuring that the data and applications residing in the cloud are kept safe is becoming more crucial as high-value data, mission-critical applications, and intellectual property are transferred to the cloud. Despite the fact that cloud computing, in general, can be perceived as less secure, the fear of cloud security is situational. The real conundrum shouldn’t be whether or not to migrate to the cloud, but which cloud to migrate to. From a security standpoint, most cloud service providers are not ready. Using unsecured cloud vendors can expose sensitive corporate data without your organization even realizing it. Enterprises commercially and legally have to maintain control over their data, while customers and employees need to be able to freely collaborate, share, and sync the files they require. The solution is simple: a private cloud.

Private Cloud Offers a Better Alternative

A private cloud computing model facilitates control and collaboration while protecting confidential data from unauthorized access. IT stakeholders need to have a detailed understanding of where and how data is being stored and transferred. With a self-hosted cloud deployment for critical data, you have maximum control, integration, and configuration of all the layers of security.

  • Flexible Infrastructure

A cloud deployment is considered private when it is hosted on the organization’s own servers. However, that does not necessarily mean the servers are hosted on-premises. By going the self-hosted route, companies are able to choose whether they want to house their files on-premises or in a remote data center. While on-premises infrastructure has the added advantage of more control and ownership, you will also be responsible for capacity planning. Given the costs associated with operating a data center and the redundancy required to operate at 100 percent network and power uptime, organizations can opt to leverage a hosted private cloud in the form of Infrastructure as a Service (IaaS) or Platform as a Service (PaaS).

This model allows the organization to have a scalable, isolated computing environment that has been custom-designed to meet its specific workload requirements, in the jurisdiction of its choice. A good example is AWS’ VPC, which provides cloud hosting capabilities with enterprise-grade IT infrastructure through a virtualized network of interconnected virtual servers. AWS GovCloud also allows US government agencies to host private clouds in secure regions operated by U.S. citizens, accessible only to vetted U.S. entities.

In a nutshell, a private cloud allows organizations to develop a flexible infrastructure to deliver applications while retaining control and managing the risk of the services delivered to business partners, users, and customers.

  • Maximum Control

A private cloud deployment gives you control over security, privacy, and compliance. You can manage all your applications, IT services, and the infrastructure in one place using powerful tools like application and performance monitoring, VM templates, and automated self-service deployment. Since you have the control from the ground up, you will not be forced to adjust your security processes to meet those of the cloud; instead, you will bend the cloud to your will. A self-hosted cloud lets you leverage your current security infrastructure and procedures and easily integrates with existing tools. It simply works within your set framework; and when your data requirements scale, you will have the ability to scale with them.

The physical location of the data-center plays a crucial role in cloud adoption. A private cloud creates the opportunity to choose the region data will be stored. By having control over your selection of hosting provider/ data center, you know precisely where your servers are located, and under which nation’s data laws they are governed. Organizations may be obliged or simply prefer, to store data in a jurisdiction or country that is not offered by a public cloud provider.

In Closing

A private cloud expands visibility into workloads and cloud operations, enabling IT administrators to design data storage, hardware, and networks in a way that guarantees the security of data and associated metadata. When IT is fully aware of where the data is located and who has access to it at any given moment in time, the risks of compliance violations, data security vulnerabilities, and data leakage are greatly reduced.

Author: Gabriel Lando

Data Owner Responsibilities When Migrating to an EFSS

While it is easy to conclude that all data belongs to your organization, complications arise when the person accountable for data ownership has to be identified. Even though the IT department spearheads the processing, storing, and backing up of data, among other functions, the fact is that it does not own business data. Worse still, outsourced service providers do not own this data any more than the IT department does.

Who Owns Data? What are Data Owner’s Responsibilities?

In the cloud environment, a data owner is a business user who understands the business impact of a security breach that would lead to loss of data, integrity, and confidentiality. This responsibility makes the data owner very conscious of decisions made to mitigate and prevent such security incidents.

When migrating to an EFSS, business data owners should do the following:

Classify Data

Data classification has been extensively cited as a remedy for data breaches. In essence, data classification helps to significantly reduce insider threats, which are reported to cause 43% of data breaches. Beyond malicious employees, data breaches also result from human error. Additionally, the growing data volume experienced by businesses makes it difficult to track data; hence, it is challenging to know where data is stored, who accesses it, and what they do with this information. By making sure that only authorized employees access certain information, the probability of a data breach is likely to be reduced.

Clearly, the need for data classification has never been more evident. To properly classify data, a few measures should be taken by a business.

  • Have detailed “acceptable use” policies. All employees should internalize and sign these documents, which are then reviewed annually or as needed.
  • Make use of data classification technologies. When you train employees using a data classification technology, they will better understand the sensitivity of the data they are creating, storing, and sharing. Consequently, they will treat this data with the highest level of confidentiality and caution.
  • Understand industry regulations to classify data accordingly.
  • Once data is properly classified, apply appropriate access controls and continuously yet randomly monitor data activity to nab suspicious activities as soon as they are carried out.
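The final step above, applying access controls that match the classification, can be sketched as a simple clearance check. The labels and roles here are illustrative, not drawn from any particular product or standard:

```python
# Hypothetical classification levels, ordered from least to most sensitive.
CLEARANCE = {"public": 0, "internal": 1, "confidential": 2, "restricted": 3}

# Hypothetical mapping of roles to the highest label they may read.
ROLE_CLEARANCE = {
    "contractor": "public",
    "employee": "internal",
    "finance": "confidential",
    "executive": "restricted",
}

def can_access(role: str, data_label: str) -> bool:
    """Allow access only when the role's clearance meets the data's label."""
    return CLEARANCE[ROLE_CLEARANCE[role]] >= CLEARANCE[data_label]

print(can_access("employee", "confidential"))  # False
print(can_access("finance", "internal"))       # True
```

A real deployment would attach the label as file metadata and enforce the check in the sharing layer, logging every denied attempt for the random monitoring described above.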

Monitor Data Life Cycle Activities

When migrating to an EFSS, issues such as data retention and disposal should constantly be monitored by a business data owner. Simply put, how long will the EFSS solution retain your data and how long will it take to dispose of your data completely after you have deleted it? What happens to your data once your contract with the provider ends?

Before a business data owner looks at an EFSS provider’s life cycle, he needs to understand the typical seven phases of the data life cycle: data capture, data maintenance, data synthesis, data usage, data publication, data archival, and data purging. At each phase, how safe is the data? Who has access to it, and how long is it retained in the EFSS?

When this data is stored, is it used and accessed by third parties, who, sadly, cause 63% of all data breaches? Is the EFSS data retention and disposal policy compliant with the law? For example, the Health Insurance Portability and Accountability Act (HIPAA) stipulates retention requirements for health records, while organizations that accept credit cards must adhere to the Payment Card Industry Data Security Standard (PCI DSS) data retention and disposal policy.

Understand Enterprise File Sync-and-Share (EFSS) Deployment Models, As a Way of Assessing Risks

Despite the existence of extensive advice on the best EFSS solutions that exist, business data owners need to gain some technical knowledge. How many EFSS deployment models do you know, for example? Since this is a pretty broad topic, we will briefly discuss three models.

Public Cloud EFSS

In addition to being fast and easy to set up, a public cloud could be cheaper in terms of both infrastructure and storage costs. However, public cloud EFSS might not be the best regarding data protection and security, leaving your company exposed and vulnerable to regulatory non-compliance. It is, therefore, important to analyze the security measures the public cloud has to offer before settling.

Private Cloud EFSS

Although a private cloud is believed to be more expensive than a public cloud, the cost of ownership depends largely on the vendor and infrastructure choice (for example, FileCloud offers the lowest cost of ownership across public and private clouds). Private cloud EFSS is worthwhile for the services and security offered. With an adoption rate of 77%, private cloud solutions such as FileCloud are better options. This is attributed to the flexibility and control over where data is stored. Consequently, users can choose which regulations to comply with and have better control over a breach, because the IT department can access all the files and monitor, protect, and salvage them, as opposed to a public cloud.

Hybrid Cloud EFSS

According to RightScale’s “Cloud Computing Trends: 2016 State of the Cloud Survey,” the hybrid cloud EFSS adoption rate is 71%. This success is believed to be the result of the ability to harness the positive attributes of both a public and a private cloud at once, because, in a hybrid environment, some components typically run on premises while others run in the cloud. One great example of a hybrid model is an EFSS application that runs as Software as a Service (SaaS) while data is stored on premises or at the discretion of the user company.

Closing remarks

It is the responsibility of a business data owner to ascertain that data will be kept safe and confidential before migrating to any EFSS solution. This person needs to be savvy with the advantages a chosen EFSS model offers, compliance with industry regulations, proper access and identity management, understand the EFSS data life cycle processes, and ensure that security measures such as data encryption and authentication processes are in place.

 Author: Davis Porter


Launching FileCloud 12 – Branch Office File Sharing, Full Text Search, Mobile Offline Sync and much more …

We are happy to announce that FileCloud 12.0 is now generally available. Like a box of chocolates, FileCloud 12 brings pleasant surprises for every one of our customers and target market segments. It is probably one of the best releases delivered by our engineering team. We have addressed hard engineering problems in the following areas: remote branch office file sharing, full-text search, and mobile offline sync.

It is our goal to deliver a truly innovative EFSS solution that addresses practical pain points in managing and sharing enterprise information. With FileCloud 12.0, we have taken a great leap forward to achieve that goal.

Here is the summary of major capabilities offered by FileCloud 12.

FileCloud ServerLink – Remote Office and Branch Office File Sharing

FileCloud ServerLink is an industry-first, true remote office and branch office solution that addresses the latency and high-availability requirements of organizations with sites in multiple countries and remote locations. ServerLink is a FileCloud add-on that seamlessly replicates one FileCloud site to another site in a different location.

FileCloud ServerLink, Branch Office Access

The following are some practical scenarios where ServerLink can help greatly in sharing organizational information:

  1. Companies in the construction sector often have sites in remote locations. Using ServerLink, they can replicate the files from HQ to local job sites to avoid latency and give faster access to employees who work in remote locations.
  2. Multinational companies and organizations who have offices in different countries and continents can deploy ServerLink to their branch office to reduce latency and get faster access to files.
  3. Oil and Gas companies typically deal with large data files and have job sites in remote locations. They can benefit greatly from deploying ServerLink to their job sites.
  4. Media firms with offices across countries can deploy ServerLink in each of their branch offices to get faster access to large media files.

The above-mentioned scenarios are just the tip of the iceberg. Every market segment can get true value by deploying ServerLink. ServerLink is currently a beta feature and will be priced separately from FileCloud.

ServerLink-Branch Office File Sharing

Full Text Content Search

FileCloud 12.0 brings full-text search to our customers. Customers can now search files not only by file name and extension but also by the content inside the document. Full-text search is supported on both managed and network shares. Content search is supported for the following file formats: txt, pdf, doc, docx, xlsx, ppt and pptx. In future releases, we will further augment the file search capabilities to support data leak prevention, e-discovery, and federated search.
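Conceptually, content search looks at what is inside a file rather than just its name. The toy sketch below (not how FileCloud implements it) scans plain-text files for a term; production engines use inverted indexes and format-specific text extractors for formats like pdf and docx:

```python
import os
import tempfile

def search_content(root, term):
    """Naive content search: return paths of .txt files containing the term."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith(".txt"):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="ignore") as fh:
                if term.lower() in fh.read().lower():
                    hits.append(path)
    return hits

# Build a tiny corpus to search over.
root = tempfile.mkdtemp()
with open(os.path.join(root, "notes.txt"), "w") as fh:
    fh.write("Quarterly revenue forecast")
with open(os.path.join(root, "todo.txt"), "w") as fh:
    fh.write("Ship release notes")

print(search_content(root, "revenue"))  # finds notes.txt only
```

The linear scan here is O(total bytes) per query, which is exactly why real full-text search builds an index once and consults it on every query instead.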


Office add-in for Word, Excel and Powerpoint

We are launching a new FileCloud Office add-in that enables users to open, edit, and save files to the FileCloud server directly from the Microsoft Word, PowerPoint, and Excel apps on a PC. Users can now work directly from their favorite Office apps and save files to FileCloud. Moving forward, we will continue to provide first-class integration with the Microsoft Office productivity suite on both desktop and mobile.



Mobile Offline Sync

FileCloud 12 brings mobile offline sync to iOS devices. Users can now sync folders (two-way or one-way) directly from the FileCloud server. It is great for users who work at remote job sites, or for traveling salespeople who can update documents on their iPads without requiring an internet connection. Mobile offline sync will also come to Android devices in the near future.


Mobile 2FA support (Google Authenticator)

From FileCloud 12.0 onwards, we also support the Google Authenticator app for two-factor authentication. Users can download the Google Authenticator mobile application from their respective app stores. The user sets up Google Authenticator once and subsequently provides the code generated by the app in order to log in.
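Google Authenticator implements the TOTP algorithm from RFC 6238: an HMAC over a 30-second time counter, truncated to a short numeric code. A minimal standard-library sketch (the Base32 secret below is the RFC 6238 test key, not a real credential):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Compute an RFC 6238 TOTP code (HMAC-SHA1, 30-second window)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if for_time is None else for_time) // step)
    msg = struct.pack(">Q", counter)   # 8-byte big-endian time counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F         # dynamic truncation per RFC 4226
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Base32 encoding of the RFC 6238 test key "12345678901234567890"
secret = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(secret, for_time=59))  # "287082" (RFC 6238 SHA-1 test vector, truncated to 6 digits)
```

At login, the server computes `totp(secret)` for the current window from the shared secret stored at enrollment and compares it with the submitted code, usually tolerating one window of clock drift.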

FileCloud 2FA Google Authenticator

FileCloud 12 also brings 2FA support to FileCloud admin accounts.

Faster access to Network shares protected by NTFS file permissions

One of FileCloud’s unique selling propositions is that it supports your existing network shares and NTFS file permissions seamlessly. With FileCloud 12, we have further improved the performance of listing network shares with a large number of files. This will require updating the web server components. When it comes to enabling web, desktop, and mobile access to your network shares, we can confidently say FileCloud is the best solution on the market.

Furthermore, FileCloud brings hundreds of features, incremental improvements, and bug fixes. For the complete list of improvements, please check the FileCloud 12 release notes here.

FileCloud is currently used by thousands of organizations across 55 countries, including the world’s leading space agency, 5 of the top 500 legal firms, the largest title insurance company, America’s leading home builder, and the third-largest poultry producer. We are incredibly thankful to our customers for their complete faith and trust in our offerings.

Innovation never stops; it is a continuous process. Our strategy is to out-innovate our competitors and lead the Enterprise File Sharing and Sync market segment. We promise our customers that we will focus intensely on creating the best, most innovative Enterprise File Sharing and Sync solution in the market.

Alternative to WatchDox – Why FileCloud is better for Business File Sharing?


FileCloud competes with WatchDox for business in the Enterprise File Sync and Share (EFSS) space. Before we get into the details, I believe an ideal EFSS system should work across all the popular desktop OSes (Windows, Mac, and Linux) and offer native mobile applications for iOS, Android, Blackberry, and Windows Phone. In addition, the system should offer all the basics expected of EFSS: unlimited file versioning, remote wipe, audit logs, a desktop sync client, a desktop map drive, and user management.

The feature comparisons are as follows:

Features compared (FileCloud vs. WatchDox):

  • On Premise
  • File Sharing
  • Access and Monitoring Controls
  • Secure Access
  • Document Preview
  • Document Edit
  • Outlook Integration
  • Role Based Administration
  • Data Loss Prevention
  • Endpoint Backup
  • Amazon S3/OpenStack Support
  • Public File Sharing
  • Customization, Branding
  • SAML Integration
  • NTFS Support
  • Active Directory/LDAP Support
  • API Support
  • Application Integration via API
  • Large File Support
  • Network Share Support (WatchDox: requires an additional product)
  • Mobile Device Management
  • Desktop Sync (FileCloud: Windows, Mac, Linux; WatchDox: Windows, Mac)
  • Native Mobile Apps (FileCloud: iOS, Android, Windows Phone; WatchDox: iOS, Android)
  • Encryption at Rest
  • Two-Factor Authentication
  • File Locking
  • Pricing for 20 users/year (FileCloud: $999; WatchDox: $3,600)

From the outside looking in, the offerings all look similar. However, the approaches differ completely in satisfying enterprises’ primary need of easy access to their files without compromising privacy, security, and control. The fundamental areas of difference are as follows:

Feature benefits of FileCloud over WatchDox

Unified Device Management Console – FileCloud’s unified device management console provides simplified management of the mobile devices enabled to access enterprise data, irrespective of whether a device is enterprise-owned or employee-owned, and regardless of mobile platform or device type. Manage and control thousands of iOS and Android devices in FileCloud’s secure, browser-based dashboard. FileCloud’s administrator console is intuitive and requires no training or dedicated staff. FileCloud’s MDM works on any vendor’s network, even if the managed devices are on the road, at a café, or used at home.

Amazon S3/OpenStack Support – Enterprises wanting to use Amazon S3 or OpenStack storage can easily set it up with FileCloud. This feature not only gives enterprises the flexibility to switch storage but also makes the switch very easy.

Embedded File Upload Website Form – FileCloud’s Embedded File Upload Website Form enables users to embed a small FileCloud interface onto any website, blog, social networking service, intranet, or any public URL that supports HTML embed code. Using the Embedded File Upload Website Form, you can easily allow file uploads to a specific folder within your account. This feature is similar to File Drop Box that allows your customers or associates to send any type of file without requiring them to log in or to create an account.

Multi-Tenancy Support – The multi-tenancy feature allows Managed Service Providers (MSPs) to serve multiple customers using a single instance of FileCloud. The key value proposition of FileCloud’s multi-tenant architecture is that data separation among tenants is maintained while providing multi-tenancy. Moreover, every tenant has the flexibility of customized branding.

NTFS Shares Support – Many organizations use NTFS permissions to manage and control access to internal file shares. It is very hard to duplicate these access permissions in other systems and keep them in sync. FileCloud enables access to internal file shares via web and mobile while honoring the existing NTFS file permissions. This functionality is a great time saver for system administrators and provides a single point of management.


Based on our experience, enterprises that look for an EFSS solution want two main things: one, easy integration with their existing storage system without any disruption to access permissions or network home folders; two, the ability to easily expand integration into highly available storage systems such as OpenStack or Amazon S3.

WatchDox neither provides OpenStack/Amazon S3 storage integration support nor NTFS share support. On the other hand, FileCloud provides easy integration support into Amazon S3/OpenStack and honors NTFS permissions on local storage.

With FileCloud, enterprises get one simple solution with all features bundled. For the same 20-user package, the cost is $999/year, almost one-fourth the cost of WatchDox.

Here’s a comprehensive comparison that shows why FileCloud stands out as the best EFSS solution.

Try FileCloud For Free & Receive 5% Discount

Take a tour of FileCloud