Archive for the ‘Cloud Computing’ Category

The Intelligent Cloud : Artificial intelligence (AI) Meets Cloud Computing


If you thought that mobile communications and the Internet have drastically changed the world, just wait. The coming years will prove to be even more disruptive and mind-blowing. Over the last few years, cloud computing has been lauded as the next big disruption in technology, and true to form it has become a mainstream element of modern software solutions, just as common as databases or websites. But is there a next phase for cloud computing? Is it the intelligent cloud?

Artificial intelligence (AI) is the type of technology with the capacity not only to enhance the current cloud platform incumbents but also to power an entirely new generation of cloud computing technologies. AI is moving beyond simple chat applications, like scheduling support and customer service, to impact the enterprise in more profound ways, as automation and intelligent systems develop to serve critical enterprise functions. AI is bound to become ubiquitous in every industry where decision-making is being fundamentally transformed by ‘Thinking Machines’. The need for smarter, faster decision making and for managing big data is the driving factor behind the trend.

Remember Moore’s Law? In 1965, Intel co-founder Gordon Moore observed that the number of transistors per square inch on integrated circuits had doubled every year since their invention. For the next 50 years, Moore’s Law held. In the process, sectors like robotics and biotechnology saw remarkable innovation, because the machines and computing power they depended on became faster and smaller as the transistors on integrated circuits became more efficient. Now, something even more extraordinary is happening. Accelerating technologies such as big data and artificial intelligence are converging to trigger the next major wave of change. This ‘digital transformation’ will reshape every aspect of the enterprise, including cloud computing.

Artificial intelligence (AI) is expected to burgeon in the enterprise in 2017. Several IT players, including today’s top IT companies, have heavily invested in the space with plans to increase efforts in the foreseeable future.

Although AI has been around since the 1950s, advances in networking and graphics processing units, along with the demand for big data, have put it back at the forefront of many companies’ minds and strategies. Given the recent explosion of data from the Internet of Things (IoT) and applications, and the need for quicker, real-time decision making, AI is well on its way to becoming a key differentiator and requirement for major cloud providers.

AI-First Enterprises

In a market that has long been dominated by four major companies – IBM, Amazon, Microsoft, and Google – an AI-first approach has the potential to disrupt the current dynamic.

“I think we will evolve in computing from a mobile-first to an AI-first world.”

– Sundar Pichai, Chief Executive of Google

The consumer world is no stranger to AI-based systems; products like Siri, Cortana, and Alexa have been making our lives easier for a while now. However, the enterprise applications of AI are completely different. An AI-first enterprise approach should be designed to allow business leaders and data professionals to collect, organize, secure, and govern data efficiently, so they can gain the insights they require to become a cognitive business. To maintain a competitive advantage, businesses today have to extract insights from data; however, acquiring those insights is complex and requires skilled data scientists. The ability to make predictions for strategic and tactical purposes has eluded most enterprises due to prohibitive resource requirements.

Cloud computing solves the two largest hurdles for AI in the enterprise: abundant, low-cost computing and a means to leverage large volumes of data.

AI-as-a-service?

Today, this new breed of platform as a service, AI-as-a-Service (AIaaS), can be applied to all the data that enterprises have been collecting. Major cloud providers are making AI more accessible “as-a-service” via open source platforms. For enterprises with an array of complex issues to solve, the need for disparate platforms that work together can’t be ignored. This is why making machine learning and other variations of AI applications and technology available via open source is critical to the enterprise. By leveraging AI-as-a-service, businesses can build solutions to a virtually unlimited range of problems.
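To make the idea concrete, the sketch below shows what consuming a cloud-hosted model “as a service” typically looks like from application code. The endpoint URL, API key, and response fields are hypothetical placeholders, not any specific vendor’s API.

```python
import requests

# Hypothetical AI-as-a-service endpoint; real vendors expose
# similar REST APIs for hosted model inference.
API_URL = "https://api.example-ai-cloud.com/v1/sentiment"
API_KEY = "your-api-key"  # issued by the (hypothetical) provider

def classify_sentiment(text: str) -> dict:
    """Send text to a hosted model and return its prediction."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"text": text},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"label": "positive", "score": 0.97}

if __name__ == "__main__":
    print(classify_sentiment("The new cloud platform cut our costs in half."))
```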

As machine learning becomes more popular as a service, organizations will have to decide how involved they want to be. While the power of cognitive intelligence is undeniable, wanting to use it and being able to use it are two completely different things. For this reason, most companies will opt for a PaaS vendor to manage their entire data intelligence cycle rather than attempt it in-house, allowing them to focus on powering and developing their applications. When looking for an AI provider, ask the right questions: the ideal vendor should be able to explain both how they handle data and how they intend to solve your specific enterprise problem.

There are multiple digital trends that have the potential to be disruptive; the only way to guarantee smarter business processes, more agility, and increased productivity is by planning ahead for the change and impact that is coming. The main differentiating factor between competing vendors in this space will be how the technology is applied to improve business processes and strategies.

Author: Gabriel Lando

Image Courtesy: pexels.com

Data ownership in the cloud – How does it affect you?

The future of the cloud seems bright. Cisco predicts that by 2018, 59% of cloud workloads will be created from Software as a Service (SaaS). While these statistics are optimistic, we cannot ignore a few concerns that stifle cloud adoption efforts, such as data ownership.

Most people would be inclined to say that they still own their data in the cloud. While they may be right in some sense, this is not always the case. For instance, consider Facebook, which many people use as cloud storage for their photos. According to the Facebook end-user agreement, the company stores data for as long as it is necessary, which might not be as long as users want. This, sadly, means that users lose data ownership. Worse still, the servers are located in different places, in and out of the United States, subjecting data to different laws.

According to Dan Gray, as discussed in ‘Data Ownership In The Cloud,’ the actual ownership of data in the cloud may depend on the nature of the data and where it was created. He distinguishes between data created by a user before uploading to the cloud and data created on the cloud platform itself: data created prior to upload may be subject to copyright laws depending on the provider, while data created on the platform can have complicated ownership.

In addition to cloud provider policies, certain Acts of Congress, although created to enhance data security while still upholding national security, have shown how data ownership issues affect businesses. Two of these, the Stored Communications Act (SCA) and the Patriot Act, illustrate the challenges of cloud data ownership and privacy with regard to government access to information stored in the cloud.

The Stored Communications Act (SCA)

Usually, when data resides in a cloud provider’s infrastructure, user ownership rights cannot be guaranteed. And even when users are assured that they own their data, it does not necessarily mean that the information stored there is private. For example, U.S. law, through the Stored Communications Act (SCA), gives the government the right to seize data stored by an American company even if it is hosted elsewhere. This interpretation saw Microsoft and other technology giants take the government to court, claiming that it was illegal to use the SCA to obtain a search warrant to peruse and seize data stored beyond the territorial boundaries of the United States.

Microsoft suffered a blow when a district court judge in New York ruled that U.S. government search powers extend to data stored on foreign servers. Fortunately, these companies got a reprieve in mid-2016, when the Second Circuit ruled that a federal court may not issue a criminal warrant ordering a U.S. cloud provider to produce data held on servers in Ireland. It is, however, important to note that this ruling only addressed whether Congress intended the SCA to apply to data held beyond U.S. territory, and did not touch on Irish data privacy law.

The Patriot Act

The Patriot Act was enacted in 2001 as an effort by the George W. Bush administration to fight terrorism. The act allowed the Federal Bureau of Investigation (FBI) to search telephone, e-mail, and financial records without a court order, and expanded law enforcement agencies’ access to business records, among other provisions. Although many provisions of the Act were set to sunset four years later, the contrary happened. Fast-forward to 2011: President Barack Obama signed a four-year extension of three key provisions of the Act, which expanded the discovery mechanisms law enforcement could use to gain third-party access. These developments caused an international uproar, especially from the European Union, prompting the Obama administration to hold a press conference to quell the concerns.

The situation was aggravated when a Microsoft UK director admitted that the Patriot Act could reach EU-based data, further disclosing that no cloud service was safe from the Act and that the company could be forced to hand over data to the U.S. government. While these provisions expired on June 1, 2015, for lack of congressional approval to renew them, the government found a way to renew them through the USA Freedom Act.

The two Acts show us that data in the cloud, especially the public cloud, is often effectively controlled by the cloud providers. This is why the laws compel cloud providers, and not cloud users, to produce this information.

What To Do In Light Of These Regulations

Even though courts have ruled that the SCA cannot be used to obtain warrants for data stored abroad, and the USA Freedom Act is purported by some to be a better version of the Patriot Act, we cannot ignore the need for cloud users to find a way to avoid such compulsions.

One idea would be to escape the grasp of these laws entirely, which is unfortunately impractical. To completely outrun the government, you would have to ensure that neither you nor the cloud service you use has operations in the United States. This is a great disadvantage, because most globally competitive cloud providers fall within United States jurisdiction. Even if you are lucky enough to find a suitable provider, it may still be subject to a Mutual Legal Assistance Treaty (MLAT) request. Simply put, there is no easy way out.

Instead, understand the risks and let your clients know about them. For example, if the Patriot Act extension attempts were successful, financial institutions would be obliged to share information with law enforcement agencies on suspicion of terrorist activities. In such a case, a good financial institution would warn its clients of these risks beforehand. Alternatively, you can find a way of storing data in-house, forcing the authorities to go through you rather than the cloud provider.

Conclusion

Truthfully, data ownership in the cloud is a complicated issue. Determined by both government and company policies, ownership of data in the cloud is not always retained. Fortunately, depending on data policies and how a provider categorizes data in the cloud, a user could be granted full ownership. Where that does not happen, prepare for instances of third-party access and incomplete privacy, and rethink your business strategy accordingly. In short, as a cloud services client, pay close attention to the contract you sign with your provider and understand the laws under which the provider operates.

Author: Davis Porter

HIPAA Prescribed Safeguards for File Sharing

health care data governance

The Health Insurance Portability and Accountability Act (HIPAA) sets standards for protecting sensitive patient data in the cloud. Any company that deals with PHI (protected health information) needs to ensure that all of the required network, physical, and process safety measures are properly followed. Click here to learn more about HIPAA’s requirements.

This includes covered entities (CEs), meaning anyone who provides treatment, operations, or payment in health care, and business associates (BAs), meaning anyone with access to patient information stored in the cloud or who provides support in payment, operations, or treatment. Subcontractors and associates of associates need to be in compliance too.

Who needs HIPAA?

HIPAA’s Privacy Rule addresses the saving, sharing, and accessing of personal and medical data stored in the cloud, while the Security Rule more specifically outlines national standards for protecting health data that is received, maintained, transmitted, or created electronically, also known as e-PHI (electronic protected health information).

Technical and physical safeguards

If you’re hosting data with a HIPAA-compliant hosting provider, they need to have particular administrative, technical, and physical safeguards in place, as required by the US Department of Health & Human Services (HHS). The technical and physical safeguards most relevant to hosting services are listed below:

  • Physical safeguards include limited facility access and control, with authorized access procedures. All compliant entities need policies about the use of and access to electronic media and workstations, covering the removal, transfer, reuse, and disposal of e-PHI and electronic media.
  • Technical safeguards should allow only authorized users to access e-PHI. Access control includes the use of unique user IDs, emergency access procedures, automatic logoff, and encryption and decryption (a minimal sketch of the encryption and audit-log safeguards follows this list).
  • Tracking logs or audit reports need to be implemented to keep a record of activity on hardware and software. These are very useful for pinpointing the source or cause of security violations.
  • Technical policies need to cover integrity controls, with measures in place to confirm that e-PHI has not been destroyed or altered. Offsite backup and IT disaster recovery are essential so that any electronic media errors or failures can be quickly remedied, and patient health information recovered accurately and intact.
  • Transmission or network security is the last safeguard required of HIPAA-compliant hosts, to protect against unauthorized access to or use of e-PHI. This covers all methods of transmitting data, whether over the internet, by email, or on private networks, such as a private cloud.
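To make the encryption and audit-log safeguards above concrete, here is a minimal Python sketch using the third-party cryptography package. It is an illustration only, with simplified record and log formats, not a certified HIPAA implementation.

```python
import logging
from cryptography.fernet import Fernet  # pip install cryptography

# Audit log: every access to e-PHI is recorded with a timestamp.
logging.basicConfig(filename="ephi_audit.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

key = Fernet.generate_key()  # in practice, store keys in a secure key vault
cipher = Fernet(key)

def store_record(user: str, record: bytes) -> bytes:
    """Encrypt a patient record at rest and log the write."""
    logging.info("user=%s action=store", user)
    return cipher.encrypt(record)

def read_record(user: str, blob: bytes) -> bytes:
    """Decrypt a record for an authorized user and log the access."""
    logging.info("user=%s action=read", user)
    return cipher.decrypt(blob)

encrypted = store_record("dr_smith", b"patient: Jane Doe, dob: 1980-01-01")
print(read_record("dr_smith", encrypted))
```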

A supplemental act passed in 2009, the HITECH (Health Information Technology for Economic and Clinical Health) Act, supported the enforcement of HIPAA requirements by increasing the penalties imposed on organizations that violate the HIPAA Privacy or Security Rules. The HITECH Act was created in response to the development of health technology and the increased storage, transmission, and use of electronic health information.

HIPAA has driven a number of healthcare providers to search for solutions that can help them secure cloud data. Medical information is very private, and regulation keeps getting tighter, which means enforcement is getting tighter too. A number of healthcare providers have chosen to move their whole EHRs onto a HIPAA-compliant platform such as FileCloud, in order to reduce their expenses and become more interoperable across devices in a safe, HIPAA-compliant fashion.

 

Author: Rahul Sharma

Image Courtesy: freedigitalphotos.net / Stuart Miles

Comparison Between FTP And The Cloud 

What Is FTP?

File Transfer Protocol (FTP) is a standard network protocol that is used to transfer computer files between a server and client on a computer network. Since it is built on a client-server model architecture, FTP uses separate control and data connections between the client and the server.

Usually, FTP users authenticate themselves with a clear-text sign-in protocol that comprises a username and password. If the server is configured to allow it, users can also connect anonymously.
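As an illustration, the following sketch uses Python’s standard ftplib to show the typical FTP workflow; the host name, credentials, and file name are placeholders.

```python
from ftplib import FTP

# Placeholder host and credentials; note that FTP sends these in
# clear text, which is one of the protocol's main security weaknesses.
with FTP("ftp.example.com") as ftp:
    ftp.login(user="alice", passwd="secret")
    with open("report.pdf", "rb") as f:
        ftp.storbinary("STOR report.pdf", f)  # upload a file
    print(ftp.nlst())                         # list the remote directory
```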

What Is The Cloud?

Generally, the cloud can be defined as computing over the internet: the practice of using remote servers hosted on the internet to store, manage, and process data, as opposed to using local servers.

To access the cloud, a user only needs a computer with a web browser and, of course, an internet connection. Better yet, the cloud lets you remotely access all your data and software from anywhere, using any device, without having to store anything on your own computer.

While the cloud and FTP are both methods of file sharing, they have significant similarities and differences, as discussed below:

Similarities Between FTP and the Cloud

Both are online file transfer and sharing methods, with cloud computing offering more advanced and secure features. In fact, FTP can be considered a precursor to cloud file sharing, since cloud services also let you upload and share your files from anywhere, with the option of keeping copies on your own computer.

Both FTP and cloud services carry costs, which vary with factors such as server requirements; free FTP and cloud services also exist.

Differences Between FTP and the Cloud

  • Mode of access

To use FTP, you require an FTP client to access your files and folders. With the cloud, on the other hand, all you need is a web browser or an application to access your files and software.

  • Cost

Setting up your own FTP server can cost around $5,000, exclusive of maintenance charges. By contrast, following reports that cloud computing prices for enterprises have dropped by two-thirds since 2014, users can enjoy cloud services for as little as $0.12 per hour.

  • Security

FTP is reportedly insecure, particularly because it offers no traceability: there is no way to see who accessed what information. This loophole has made it easy for cyber criminals to hack into FTP servers, retrieve shared information, and leave without a trace. Credentials are also sent in clear text, as noted above.

Cloud services have addressed security in many ways, such as the ability to track and report access to your files. Additionally, you can create backups to guard against data loss.

Why The Cloud Is A Better Option

  • More secure. You can store your confidential information with a reduced risk of hacking incidents. No wonder more than 50% of businesses say their organization currently transfers sensitive or confidential information to the cloud.
  • It is cheaper, yet has more advanced features.
  • Remote accessibility.
  • Cloud is cost-effective for your business. Reportedly, 80% of cloud adopters saw improvements within six months of moving to the cloud, and 82% of companies have saved money by making the move.

5 Reasons FileCloud Is The Best Option

FileCloud is a private cloud hosting solution, meaning it resides on your company’s intranet or in your hosted data center, protected behind a firewall. With this on-premises Enterprise File Sharing and Sync (EFSS) solution, your servers are safely run by your own trusted administrators, on your own infrastructure.

Here are 5 reasons why you should install FileCloud:

  1. Secure

FileCloud is governed by your corporate IT security policies, ensuring that you receive the highest possible protection. The biggest advantage of a private cloud is the ability and ease of staying compliant with regulations such as Sarbanes-Oxley, PCI, and HIPAA.

While only 26% of cloud services meet the EU data residency rule, for example, FileCloud makes compliance easier by giving cloud control to your internal IT department.

  2. You Can Easily Share Files

As a cloud service, FileCloud allows you to map and sync files, which can then be accessed as though they were on your local drive. FileCloud has an intuitive UI that enables you to share your documents with your clients and edit them in real time, saving you the hassle of sending emails back and forth.

  3. You Can Easily Access Your Files

You don’t always have to travel with your laptop to access and edit your documents. FileCloud allows you to remotely access your files from any location, using different types of devices, and to sync your documents so you can access them offline.

Better yet, you can access these documents through different avenues, including WebDAV, desktop sync, a virtual drive, and the mobile app.

  4. FileCloud Offers the Best Mobile Connectivity

With an average rating of 4+ stars across platforms such as Android, iOS, and Windows Phone, the FileCloud app allows you to work effortlessly on your documents from your mobile phone, regardless of the size of the document.

  5. FileCloud is Cheaper

Compared to other cloud services such as Dropbox, FileCloud can save your business up to $42,000 annually. For example, a company that implements FileCloud for 500 users will save $42,000 annually, while a company serving 25 users will save $2,000 annually, making FileCloud 3 to 4 times cheaper than Dropbox.

Final Word

While FTP is designed to help you share files across computers, eliminating the need to carry devices like flash drives around, the cloud offers a better solution: it is more advanced, designed primarily with your file security in mind, and ensures that your documents are safe, with minimal risk of a data breach.

If you are looking for a secure FTP replacement, click here to learn more about FileCloud – a modern alternative to FTP. Make your shift from FTP to the cloud seamless by installing FileCloud today. Develop your content in one location and edit it on the go from any device, anywhere in the world.

Author: Davis Porter

Why Universities Pick FileCloud for File Sharing and Sync?


The cloud has graduated from a trend to be analyzed to a reality that must be embraced. Many organizations have begun implementing these new technologies to reduce infrastructure costs, time, and administration, and to improve machine utilization. We have already seen how multiple sectors can benefit from adopting cloud-based enterprise solutions, including financial services, the public sector, and charities. So are institutions of higher education ready to embrace the cloud? Or is this a key vertical missing out on the IT infrastructure savings that cloud computing offers?

The basic idea behind cloud computing is combining computing resources – applications, storage, servers, or networks – into a shared pool. It is one of the most important innovations in IT, able to benefit enterprises and other organizations alike.

Higher Learning: Reach for the Clouds

Higher education undoubtedly plays a crucial role in molding the structure of our societies. Only competent higher education institutions can produce the responsible individuals who will be instrumental in ensuring that society grows as it should. The only way to guarantee that these venerable institutions stay on par with the latest global and social trends is to constantly update their technologies, so that they develop alongside the latest technological advances all over the globe.

Cloud computing has completely altered the way institutions do business and serve customers. For higher education, it grants the ability to serve not just administrators and educators, but also students, who have their own technological devices, needs, and expectations. By eliminating the usual IT constraints – limited maintenance resources, costly infrastructure improvements, and incompatibilities between tools and systems – cloud computing frees institutions of higher learning to shift their focus from maintaining IT infrastructure to addressing the needs of students and staff.

Consumer-grade services such as Google Apps, Google Mail, or Dropbox are growing in acceptance, despite being considered insecure, because they are cost-saving alternatives. However, choosing a public cloud offering can create risks around security, interoperability, performance, and privacy. Universities should consider on-premise cloud deployment models to realize lower costs, on-demand scaling, and rapid platform deployment while minimizing risk.

The State of Higher Education in the Cloud

While administrators approached cloud computing cautiously in the past, the trend has since changed. More institutions of higher learning are adopting cloud services to address varying challenges. According to MeriTalk, the cloud market in higher education is worth $4.4 billion. This is further reflected in the findings below:

  • >20%: Share of higher education institutions using cloud-based platforms for core enterprise applications
    Source: Ovum
  • 31%: Share of colleges adopting cloud-based solutions that do so for storage; 29% for messaging and video conferencing; 25% to allocate computing power
    Source: CDW
  • 36%: Share of universities that have uploaded at least 1TB of data to the cloud; 12% have uploaded between 10TB and 100TB of data
    Source: XSEDE
  • 33%: Average share of budget that higher education IT teams expect to spend on the cloud within four years (as of 2013)
    Source: CDW

Like enterprises that are rapidly adopting cloud services, universities are in search of improved and more cost-effective ways of implementing IT services, without the frustrating cost of upgrades and maintenance. Like enterprises, universities want to realize the full potential of their data to make informed strategic decisions. And like enterprises, universities want to respond more quickly to new opportunities, without taking several months to implement business-critical software applications.

The Unique Needs of Higher Education

Despite the similarities, institutions of higher learning have a unique mission and culture that affects how decisions regarding cloud computing are made.

  • Student centric. The focus of every higher learning institution is the students. In this digital age of mobile devices, students are likely to bring their own devices, along with expectations about when and how they wish to use them. The IT department is tasked with providing improved interoperability between student and campus platforms. To maintain high levels of communication and collaboration, students must have round-the-clock access to secure, reliable networks, and the ability to create, deliver, and share content campus-wide on any number of devices.
  • Complex finance models. Most universities have precarious and complex financial models, composed of varying combinations of research grants, public funding, investments, philanthropy, and tuition.
  • Participatory decision-making model. Another important piece of the puzzle is the governance of higher education, modeled on a participatory culture that typically precludes, or at least complicates, top-down decision making. Reaching consensus takes time, especially when stakeholders are dealing with decisions about where to allocate limited resources. In such an environment, the need to completely understand the advantages that cloud-based service models can provide becomes apparent.

Developing a cloud strategy for higher education requires drawing certain principles from the business community. However, creating a durable and effective digital strategy also requires building a framework around the needs of your unique stakeholders – from faculty to students, alumni to board members – engaging every stakeholder early in the process, and creating a strategy that addresses IT challenges specific to higher education and your campus.

Private clouds are built for the exclusive use of one organization, offering the utmost control over quality of service, security and data.

The University of South Carolina Chooses FileCloud

The director of IT services at the University of South Carolina was in search of a file sharing solution with the following requirements:

  • Store large training videos and share with athletes and coaches
  • Large storage at affordable price
  • On-premise cloud to protect intellectual property and prevent IP from leaking to the public cloud
  • Mobile apps to support BYOD (bring your own device)
  • Granular controls to provide various levels of permission

After trying FileCloud for 15 days, they decided it was the best solution for the university. Read the full case study here.

Why FileCloud?

  • Reliable Remote Access

In this digital age of consumer technologies, social applications, and smartphones, expectations from all constituents – staff, faculty, and students – have never been higher. The need for quick, informed, 24/7 service has become apparent. Students bring multiple devices to campus, and in such an environment, quick access to content and to collaboration tools that augment research, teaching, and learning is of great importance.

FileCloud enables all constituents to remotely access data from anywhere, using any mobile device. Real-time sync across network folders allows for remote access. With comprehensive support for WebDAV, mobile apps, a virtual drive, desktop sync, and web access, end users get a seamless experience regardless of the device they are using. FileCloud clients are available for iOS, Android, Blackberry, and Windows. Additionally, remote files can be opened and edited via third-party applications on your device, and stored files can be shared via a link or email attachment.
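Because WebDAV is plain HTTP, even a generic HTTP client can talk to a WebDAV endpoint. The sketch below lists a folder this way; the server URL and credentials are hypothetical placeholders, not a real FileCloud deployment.

```python
import requests

# Hypothetical WebDAV endpoint; WebDAV is plain HTTP, so the
# standard requests library can speak the protocol directly.
url = "https://files.example-university.edu/webdav/courses/"
resp = requests.request(
    "PROPFIND",                  # WebDAV method for listing a collection
    url,
    auth=("student", "password"),
    headers={"Depth": "1"},      # list immediate children only
)
print(resp.status_code)          # 207 Multi-Status on success
print(resp.text[:500])           # XML listing of the folder contents
```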

  • Expanded Capabilities

FileCloud enables even smaller institutions with limited resources and leanly staffed IT departments to leverage the resources of the cloud, providing more security for critical systems and more robust disaster recovery and business continuity. With features like automated server backup, high availability, and multi-tenant setups, FileCloud scales easily to large deployments. Transitioning to FileCloud gives IT staff a framework for security programs, disaster recovery, and core business continuity.

  • Cost Containment

FileCloud offers simple pricing options. Adopting a cloud strategy does not automatically mean dramatically reduced IT costs; however, with proper management, costs can be contained over time. With unlimited client accounts for external contractors and vendors, you only pay for employee accounts.

Click here to find out why FileCloud is the leading private file access, sync and sharing solution for institutions of higher learning.

Conclusion

The question of whether or not institutions of higher learning should adopt the cloud has already been answered – by students, staff, and faculty, who have already started using cloud services in one form or another. IT teams in higher education should stop asking if, and start asking when and how, they can make the transition. The successful adoption of cloud services hinges on finding a partner who understands higher education and the sea of benefits the cloud can deliver.

Learn more about FileCloud, a leading enterprise file sharing and file sync software.

FileCloud High Availability Architecture

Enterprise Cloud Infrastructure is a Critical Service

The availability of enterprise-hosted cloud services has opened huge potential for companies to manage files effectively. Files can be stored, shared, and exchanged within the enterprise and with partners efficiently, while keeping existing security and audit controls in place. The service provides the power and flexibility of the public cloud while maintaining control of the data.

The main challenge for enterprise-hosted cloud services is to guarantee very high uptime (measured in multiple nines) while maintaining a high quality of service. The dependency on such services means that any disruption can have significant productivity impacts. Enterprise cloud services typically consist of multiple different services that together provide the functionality, so any high availability architecture must ensure that all critical services have redundancies built into them. Moreover, detection and handling of failures must be reasonably quick and must not require any user interaction.

FileCloud Enterprise Cloud

FileCloud enables enterprises to seamlessly access their data using a variety of external agents. These agents can be browsers, mobile devices, or client applications, while the data that FileCloud makes accessible can be stored locally, on internal NAS devices, or in public cloud locations such as AWS S3 or OpenStack Swift.

Depending on the specific enterprise requirements, a FileCloud solution may implement multiple different software services, such as the FileCloud Helper service, the Solr service, a virus scanner service, and the Open Office service. Moreover, FileCloud may use enterprise identity services such as Active Directory, LDAP, or ADFS. A failure in any of these services can impact the end user experience.
[Figure: FileCloud high availability overview]

High Availability Architecture

A FileCloud solution can be implemented using the classic three-tier high availability architecture. Tier 1 is a web tier made up of load balancers and access control services. Tier 2 consists of stateless application servers; in a FileCloud implementation, this layer comprises the Apache nodes and helper services. Tier 3 is the database layer. Other dependencies, such as Active Directory or data servers, are not addressed here. The advantage of this architecture is the separation of stateless components from stateful ones, allowing great flexibility in deploying the solution.
[Figure: Three-tier architecture]

Tier 1 – Web Tier

Tier 1 is the front end of the deployment and acts as the entry point for all external clients. The components in Tier 1 are stateless and primarily forward requests to the web servers in Tier 2. Because they are stateless, the web tier can be scaled by adding or removing load balancer instances, and each web server node is capable of handling any request. This layer can also be configured to perform SSL offloading, allowing lighter-weight communication between Tier 1 and Tier 2, and to provide simple affinity based on source and destination addresses. Traffic is forwarded only to healthy application server nodes: this layer monitors the available application servers and automatically distributes traffic depending on the load.
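The sketch below illustrates, in miniature, what this tier does: probe the application nodes’ health and distribute requests round-robin among the healthy ones. The node addresses and health endpoint are assumptions, and a real deployment would use a dedicated load balancer rather than hand-rolled code.

```python
import itertools
import requests

# Hypothetical Tier 2 application server nodes.
APP_SERVERS = ["http://10.0.1.11", "http://10.0.1.12", "http://10.0.1.13"]

def healthy_servers() -> list[str]:
    """Probe each node; keep only those answering their health endpoint."""
    alive = []
    for server in APP_SERVERS:
        try:
            if requests.get(f"{server}/health", timeout=2).ok:
                alive.append(server)
        except requests.RequestException:
            pass  # node is down; traffic is routed around it
    return alive

# Round-robin over whichever nodes are currently healthy
# (a real balancer re-probes continuously instead of once).
pool = itertools.cycle(healthy_servers())
for _ in range(5):
    print("forwarding request to", next(pool))
```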

Tier 2 – Application Servers

Tier 2 in FileCloud deployment consists of the following services

  • Apache servers
  • FileCloud helper
  • Antivirus service
  • Memcache service
  • Open Office service

The Apache servers in FileCloud primarily execute application code to service requests. All state-specific data is stored in database tables, so the Apache servers themselves are stateless; they do, however, cache data for faster performance (for example, converting and caching documents for display). If an application server node fails, the request can be handled by a different node (provided the client retries the failing request). Capacity can be increased or reduced, automatically or manually, by adding or removing Apache server nodes.
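Because any stateless node can service any request, client-side retries are what turn a node failure into a non-event. A minimal retry wrapper might look like the following sketch (the URL and retry policy are illustrative):

```python
import time
import requests

def request_with_retry(url: str, attempts: int = 3, backoff: float = 1.0):
    """Retry a request so a failed app-server node is transparently skipped.

    This works because any stateless Tier 2 node can service any request;
    the load balancer routes the retry to a healthy node.
    """
    for attempt in range(1, attempts + 1):
        try:
            resp = requests.get(url, timeout=5)
            resp.raise_for_status()
            return resp
        except requests.RequestException:
            if attempt == attempts:
                raise
            time.sleep(backoff * attempt)  # simple linear backoff

# Example (placeholder URL):
# resp = request_with_retry("https://filecloud.example.com/api/files")
```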

The FileCloud Helper service provides additional capabilities such as indexed search and NTFS permission retrieval. FileCloud Helper is a stateless service and can therefore be added or removed as needed.

Similar to the FileCloud Helper service, the antivirus service is also stateless, providing antivirus capability to FileCloud. Any file uploaded to FileCloud is scanned using this service.

The Memcache service is an optional, stateless service that is required only if local storage encryption is used. It is started on the same node as the Apache service.

The Open Office service is an optional service required for creating document previews in the browser. This service is stateless and is started on the same node as the Apache server.

Tier 3 – Database Nodes

Tier 3 consists of the stateful services:

  • MongoDB servers
  • Solr Servers

The high availability approach for each of these servers varies with the complexity of the deployment, and the failure of these services can have limited or system-wide impact. For example, a MongoDB server failure results in a FileCloud solution-wide failure and is critical, while a FileCloud Helper server failure only impacts a portion of functionality, such as network folder access.

MongoDB Server High Availability

MongoDB servers store all application data in FileCloud and provide high availability using replica sets. A MongoDB replica set configuration provides redundancy and increases data availability by keeping multiple copies of the data on different database servers. Replication also provides fault tolerance against the loss of a single database server, and MongoDB can additionally be configured to increase read capacity. The minimum needed for MongoDB HA is a three-node member set (it is also possible to use two nodes plus one arbiter). If the primary MongoDB node fails, one of the secondary nodes fails over and becomes the new primary.

The heartbeat time frame can be tuned depending on system latency. It is also possible to configure the replica set to allow reads from secondaries to improve read capacity.
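Assuming a three-member replica set named rs0, a client connection that survives primary failover and offloads reads to secondaries might look like this sketch using the pymongo driver; the host names are placeholders.

```python
from pymongo import MongoClient, ReadPreference

# Placeholder hosts for a three-member replica set named "rs0".
client = MongoClient(
    "mongodb://db1.example.com:27017,db2.example.com:27017,db3.example.com:27017",
    replicaSet="rs0",
    serverSelectionTimeoutMS=5000,  # how long to wait for a usable node
)

# Writes always go to the primary; if it fails, the driver
# re-discovers the newly elected primary automatically.
db = client.get_database("filecloud")
db.files.insert_one({"name": "report.pdf", "size": 1024})

# Reads can be offloaded to secondaries to increase read capacity.
ro_db = client.get_database(
    "filecloud", read_preference=ReadPreference.SECONDARY_PREFERRED
)
print(ro_db.files.count_documents({}))
```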
[Figure: Replica set with primary and secondary nodes]

Putting It All Together

The three-tier structure for the FileCloud components is shown below; the actual configuration information is available from FileCloud support. This provides a robust FileCloud implementation with high availability and extensibility. As new services are added to extend functionality, the appropriate layer can be chosen based on whether they are stateless or store state. The stateless (Tier 2) nodes can be added or removed without disrupting service, while Tier 3 nodes store state and require a specific implementation depending on the type of service.
[Figure: Complete FileCloud high availability architecture]

Alternative to WatchDox – Why FileCloud is better for Business File Sharing?


FileCloud competes with WatchDox for business in the Enterprise File Sync and Share (EFSS) space. Before we get into the details, I believe an ideal EFSS system should work across all the popular desktop OSes (Windows, Mac, and Linux) and offer native mobile applications for iOS, Android, Blackberry, and Windows Phone. In addition, the system should offer all the basics expected of EFSS: unlimited file versioning, remote wipe, audit logs, a desktop sync client, a desktop map drive, and user management.

The feature comparisons are as follows:

The comparison covers: on-premise deployment, file sharing, access and monitoring controls, secure access, document preview, document edit, Outlook integration, role-based administration, data loss prevention, WebDAV, endpoint backup, Amazon S3/OpenStack support, public file sharing, customization and branding, SAML integration, anti-virus, NTFS support, Active Directory/LDAP support, multi-tenancy, API support, application integration via API, large file support, network share support, mobile device management, desktop sync, native mobile apps, encryption at rest, two-factor authentication, file locking, and pricing.

The rows where the two products differ most visibly:

  • Network Share Support – FileCloud: included; WatchDox: requires purchasing an additional product
  • Desktop Sync – FileCloud: Windows, Mac, Linux; WatchDox: Windows, Mac
  • Native Mobile Apps – FileCloud: iOS, Android, Windows Phone; WatchDox: iOS, Android
  • Pricing for 20 users/year – FileCloud: $999; WatchDox: $3,600

From the outside looking in, the offerings all look similar. However, the approaches differ completely in how they satisfy the enterprise’s primary need: easy access to files without compromising privacy, security, and control. The fundamental areas of difference are as follows:

Feature benefits of FileCloud over WatchDox

Unified Device Management Console – FileCloud’s unified device management console provides simplified management of the mobile devices enabled to access enterprise data, irrespective of whether the device is enterprise-owned or employee-owned, and regardless of mobile platform or device type. Manage and control thousands of iOS and Android devices from FileCloud’s secure, browser-based dashboard. FileCloud’s administrator console is intuitive and requires no training or dedicated staff. FileCloud’s MDM works on any vendor’s network, even if the managed devices are on the road, at a café, or used at home.

Amazon S3/OpenStack Support – Enterprises wanting to use Amazon S3 or OpenStack storage can easily set it up with FileCloud. This not only gives the enterprise the flexibility to switch storage, but also makes the switch very easy.

Embedded File Upload Website Form – FileCloud’s embedded file upload website form enables users to embed a small FileCloud interface onto any website, blog, social networking service, intranet, or any public URL that supports HTML embed code. Using it, you can easily allow file uploads to a specific folder within your account. The feature works like a file drop box: it allows your customers or associates to send you any type of file without requiring them to log in or create an account.

Multi-Tenancy Support – The multi-tenancy feature allows Managed Service Providers (MSPs) to serve multiple customers using a single instance of FileCloud. The key value proposition of FileCloud’s multi-tenant architecture is that data separation among tenants is maintained while providing multi-tenancy, and every tenant has the flexibility of customized branding.

NTFS Shares Support – Many organizations use NTFS permissions to manage and control access to internal file shares. It is very hard to duplicate these access permissions in other systems and keep them in sync. FileCloud enables access to internal file shares via web and mobile while honoring the existing NTFS file permissions. This functionality is a great time saver for system administrators and provides a single point of management.

Conclusion

Based on our experience, enterprises looking for an EFSS solution want two main things: first, easy integration with their existing storage system, without any disruption to access permissions or network home folders; and second, the ability to expand easily into highly available storage systems such as OpenStack or Amazon S3.

WatchDox provides neither OpenStack/Amazon S3 storage integration nor NTFS share support. FileCloud, on the other hand, integrates easily with Amazon S3/OpenStack and honors NTFS permissions on local storage.

With FileCloud, enterprises get one simple solution with all features bundled. For the same 20-user package, the cost is $999/year, almost a quarter of WatchDox’s price.

Here’s a comprehensive comparison that shows why FileCloud stands out as the best EFSS solution.

Try FileCloud For Free & Receive 5% Discount

Take a tour of FileCloud

A Comprehensive Guide to Cloud Containers

One of the hottest topics in the world of cloud computing is cloud containers. This evolving technology is changing the way IT operations are conducted, just as virtualization technology did a few years ago. The use of containers, however, is not an entirely new concept. Like VM technology, containerization originated on big iron systems. The ability to create running instances that abstract an application from the underlying platform by providing an isolated environment has been around since the distributed object and container movement of the 90s, with J2EE and Java. The first commercial implementation of containers was pioneered as a feature of the Sun (now Oracle) Solaris 10 UNIX operating system.

What are containers?

But the question remains: what are containers, and what role do they play in the cloud? Simply put, a container is an isolated, portable runtime environment in which you can run an application along with all of its dependencies, libraries, and other binaries; it contains all the configuration files needed to run the application. By containerizing the application platform and its dependencies, differences in the underlying infrastructure and OS distribution are abstracted away, making the application easily portable from platform to platform.

Despite their subtle similarities, containers differ from VMs in multiple ways. Both offer a discrete, isolated, separate space for applications that creates the illusion of an individual system. However, unlike a VM, a container does not include a full image or instance of an operating system, with its own drivers, kernel, and shared libraries. Instead, containers on the same host share the host’s OS kernel and keep their runtimes and other services separated from each other using kernel features referred to as cgroups and namespaces. Containers consume fewer resources and are more lightweight than virtual machines, so one server can host more containers than virtual machines. And while a virtual machine takes several minutes to boot its operating system and start running the hosted applications, containerized apps start almost instantly.
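One way to see these kernel mechanisms at work: from inside any Linux process, containerized or not, you can inspect the cgroups and namespaces it belongs to. The snippet below is a minimal, Linux-only sketch.

```python
import os

# Linux-only sketch: containers are ordinary processes whose view of
# the system is restricted by cgroups and namespaces.
with open("/proc/self/cgroup") as f:
    print("cgroup membership:")
    print(f.read())  # inside a container, this shows the container's cgroup

# Each namespace the process belongs to appears as a symlink here.
for ns in os.listdir("/proc/self/ns"):
    print(ns, "->", os.readlink(f"/proc/self/ns/{ns}"))
```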

Containerization in the Cloud

Containers mainly add value to the enterprise by bundling and running applications in a more portable way. They can be used to break applications down into isolated microservices, which facilitates tighter security configurations, simplified management, and more granular scaling. In essence, containers are positioned to solve a wealth of problems previously addressed with configuration management (CM) tools; however, they are not a direct replacement for CM or for virtualization. Virtualization has played a crucial role in enabling workload consolidation in the cloud, ensuring that money spent on hardware is not wasted. Containerization simply takes this a step further.

The portable nature of containers means they can effectively run on any infrastructure or platform that runs the relevant OS. For developers, containers mean saying goodbye to burdensome processes, limited lifecycle automation, the same old problems with patches, and absent tooling integration. A developer can run a container on a workstation, create an application within it, save it as a container image, and then deploy the application on any physical or virtual server running a compatible operating system. The basic idea is: build it once and run it anywhere.

Containerization provides mechanisms to hold portions of an application inside a container and then distribute them across public or private clouds, from the same vendor or from different vendors. Containers offer deterministic software packaging: the network topology, security policies, or storage underneath might differ, but the application will still run.

Along Came Docker

Docker is responsible for popularizing the idea of the container image. The momentum behind it has made Docker synonymous with container technology, and it continues to drive interest in the cloud. Cloud vendors have also shown interest in using Docker to provide infrastructure that supports the container standard. Docker offers a way for developers to package an application and its dependencies into a container image based on Linux system images. All instances basically run on the host system’s kernel but remain isolated within individual runtime environments, away from the host environment. Once a Docker container is created, it only remains active while processes are running within it.
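For example, with Docker’s official Python SDK (the docker package), pulling an image and running a throwaway container takes only a few lines; this sketch assumes a local Docker daemon is running.

```python
import docker  # pip install docker; requires a running Docker daemon

client = docker.from_env()

# Build once, run anywhere: the image bundles the app and its dependencies.
output = client.containers.run(
    "alpine:latest",              # image is pulled automatically if missing
    "echo hello from a container",
    remove=True,                  # the container exists only while its process runs
)
print(output.decode())
```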

The Docker Engine runs on all the major Linux distributions, including Arch, SUSE, Gentoo, Fedora, Ubuntu, Debian, and Red Hat, and soon on Windows: Microsoft has announced that it will bring Docker container technology to Windows and introduce Windows Server Containers, which will run on Windows Server. Docker has been tested and hardened for enterprise production deployments, and its containers are simple to deploy in a cloud. It can be incorporated into most DevOps tools, including Ansible, Vagrant, Chef, and Puppet, or used on its own to manage development environments.

Docker also offers additional tools for container deployments, such as Docker Swarm, Docker Compose, and Docker Machine. At the highest level, Compose facilitates quick and easy deployment of complex distributed applications, Swarm provides native clustering for Docker, and Machine makes it easy to spin up Docker hosts. Docker has undoubtedly established a container standard with a solid design that works well out of the gate. However, Docker isn’t necessarily the right pick for every application; it’s important to consider which applications suit its containers and platform.

The other players

Choosing a technology solely based on adoption rate can lead to long-term issues. Exploring all the available options is the best way to guarantee maximum performance and reliability during the lifecycle of your projects.

I. CoreOS Rocket

CoreOS provides an alternative to the Docker runtime called Rocket (rkt). Rocket was built for server environments with the most rigorous security, speed, composability, and production requirements. While Docker has expanded the scope of the features it offers, CoreOS aims to provide a minimalist implementation of a container builder and manager. The software is composed of two elements: actool, which administers the building of containers and handles container discovery and validation, and rkt, which takes care of fetching and running container images.

A major difference between Docker and Rocket is that the latter does not require an external daemon: whenever the rkt component is invoked to run a container, it does so immediately within its own process tree and cgroup. The Docker runtime, on the other hand, uses a daemon that needs root privileges, which opens up its APIs to exploitation for malicious activities, such as running unauthorized containers. From an enterprise perspective, Rocket may seem like the better alternative due to its increased portability and customization options, while Docker is more suitable for smaller teams because it offers more functionality out of the gate.

II. Kubernetes

Kubernetes was created by Google as a tool for managing containerized applications across private, public, and hybrid cloud environments. It handles the deployment, scheduling, maintenance, scaling, and operation of nodes within a compute cluster. The load balancing, orchestration, and service discovery tools within Kubernetes can be used with both Rocket and Docker containers. Simply put, while the container runtime provides lifecycle management for individual containers, Kubernetes takes it to the next level by orchestrating and managing clusters of containers.

Kubernetes can launch containers in existing virtual machines or even provision new VMs. It does everything from booting containers to managing and monitoring them. System administrators can use Kubernetes to create pods, logical collections of containers that belong to an application, which can then be provisioned on bare metal servers or VMs. Kubernetes can also be used as an alternative to Docker Swarm, which provides native clustering capabilities.
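As a small illustration, the official Kubernetes Python client can list the pods a cluster is running; this sketch assumes cluster credentials are already configured in a local kubeconfig, as they would be for kubectl.

```python
from kubernetes import client, config  # pip install kubernetes

# Load credentials from the local kubeconfig (as kubectl does).
config.load_kube_config()
v1 = client.CoreV1Api()

# List every pod the cluster is managing, across all namespaces.
for pod in v1.list_pod_for_all_namespaces().items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)
```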

Author: Gabriel Lando

Alternative to Varonis DatAnywhere – Why FileCloud is better for Business File Sharing?


FileCloud competes with Varonis DatAnywhere for business in the Enterprise File Sync and Share (EFSS) space. Before we get into the details, I believe an ideal EFSS system should work across all the popular desktop OSes (Windows, Mac, and Linux) and offer native mobile applications for iOS, Android, Blackberry, and Windows Phone. In addition, the system should offer all the basics expected of EFSS: unlimited file versioning, remote wipe, audit logs, a desktop sync client, a desktop map drive, and user management. Let’s look at how FileCloud is a better alternative to Varonis DatAnywhere for business file sharing.

The feature comparisons are as follows:

The comparison covers the same set of capabilities: on-premise deployment, file sharing, access and monitoring controls, secure access, document preview, document edit, Outlook integration, role-based administration, data loss prevention, WebDAV, endpoint backup, Amazon S3/OpenStack support, public file sharing, customization and branding, SAML integration, anti-virus, NTFS support, Active Directory/LDAP support, multi-tenancy, API support, application integration via API, large file support, network share support, mobile device management, desktop sync, native mobile apps, encryption at rest, two-factor authentication, file locking, and pricing.

The rows with directly comparable values:

  • Desktop Sync – both support Windows, Mac, and Linux
  • Native Mobile Apps – both offer iOS, Android, and Windows Phone apps
  • Pricing for 750 users/year – FileCloud: ~$19,249; Varonis DatAnywhere: ~$39,000

From the outside looking in, the offerings all look similar. However, the approaches differ completely in how they satisfy the enterprise’s primary need: easy access to files without compromising privacy, security, and control. The fundamental areas of difference are as follows:

Feature benefits of FileCloud over Varonis DatAnywhere

Document Quick Edit – FileCloud’s Quick Edit feature supports extensive editing of files such as Microsoft® Word, Excel®, Publisher®, Project®, and PowerPoint® documents, right from your desktop. It’s as simple as selecting a document to edit from the FileCloud web UI and editing it in Microsoft Office; FileCloud takes care of the uninteresting details in the background, such as uploading the new version, syncing, sending notifications, and sharing updates.

Embedded File Upload Website Form – FileCloud’s embedded file upload website form enables users to embed a small FileCloud interface onto any website, blog, social networking service, intranet, or any public URL that supports HTML embed code. Using it, you can easily allow file uploads to a specific folder within your account. The feature works like a file drop box: it allows your customers or associates to send you any type of file without requiring them to log in or create an account.

Unified Device Management Console – FileCloud’s unified device management console provides simplified management of the mobile devices enabled to access enterprise data, irrespective of whether the device is enterprise-owned or employee-owned, and regardless of mobile platform or device type. Manage and control thousands of iOS and Android devices from FileCloud’s secure, browser-based dashboard. FileCloud’s administrator console is intuitive and requires no training or dedicated staff. FileCloud’s MDM works on any vendor’s network, even if the managed devices are on the road, at a café, or used at home.

Device Commands and Messaging – The ability to send on-demand messages to any device connecting to FileCloud gives administrators a powerful tool for interacting with the enterprise workforce. Any information on security threats or access violations can easily be conveyed to mobile users. And, above all, the messages incur no SMS cost.

Amazon S3/OpenStack Support – Enterprises wanting to use Amazon S3 or OpenStack storage can easily set it up with FileCloud. This not only gives the enterprise the flexibility to switch storage, but also makes the switch very easy.

SAML Integration – FileCloud supports SAML (Security Assertion Markup Language) based web browser Single Sign-On (SSO), which provides full control over the authorization and authentication of hosted user accounts that access the FileCloud web interface.

Multi-Tenancy Support – The multi-tenancy feature allows Managed Service Providers (MSPs) to serve multiple customers using a single instance of FileCloud. The key value proposition of FileCloud’s multi-tenant architecture is that data separation among tenants is maintained while providing multi-tenancy, and every tenant has the flexibility of customized branding.

Endpoint Backup – FileCloud provides the ability to back up user data from any computer running Windows, Mac, or Linux to FileCloud. Users can schedule a backup, and FileCloud automatically backs up the selected folders at the scheduled time.

Conclusion

It’s a no-brainer: FileCloud hands-down beats Varonis DatAnywhere on feature set and value, at half the price.

Here’s a comprehensive comparison that shows why FileCloud stands out as the best EFSS solution.

Try FileCloud For Free & Receive 5% Discount

Take a tour of FileCloud

How to Use AWS for Disaster Recovery

A disaster is undoubtedly one of the most significant risks of running a business. The term refers to any event that interrupts normal business processes or finances. In most cases, disasters are triggered by human error, physical damage from natural events, power outages, network outages, or software and hardware failure.

To prevent such occurrences, and to minimize potential damage when one happens, many organizations invest a significant amount of time and resources in strategizing and preparing. In addition to training employees to handle disasters, companies also need to implement adequate restoration measures in case of complete system failure. If your company has a typical traditional physical environment, the most effective protection strategy is duplicating the infrastructure on a secondary platform to ensure spare capacity in case of a disaster; that’s where cloud disaster recovery comes in. According to a 2014 Forrester Research report, about 19% of organizations have already adopted cloud disaster recovery to cushion themselves against potential damage, and a significant majority of the respondents who hadn’t yet implemented it said they were already drawing up plans to do so.

As the most popular cloud service, AWS has invested heavily in disaster recovery as a strategy for improving user experience and staying ahead of its competitors. With consistently maintained infrastructure, AWS is always capable of kicking in to support your operations in case of a disaster. Additionally, it’s highly scalable with a pay-as-you-go plan, which opens it up to all types of businesses regardless of their disaster management budgets. To help you understand how to use AWS for disaster recovery, here are some of its main features and their relevance:

Deployment Orchestration

As an organization, you can significantly boost your recovery capability by automating post-startup software installation, configuration, and deployment. Some of the tools you could use include:

  • AWS OpsWorks: Built as an application management service, AWS OpsWorks facilitates the operation of different types of applications and considerably eases deployment in case of a disaster. The service gives users the tools to build an environment as a series of layers configured as application tiers.
  • AWS Elastic Beanstalk: A flexible service for deploying and scaling a wide range of services and applications built on Docker, Ruby, Python, Node.js, PHP, .NET, and Java.
  • AWS CloudFormation: Lets you easily build and provision a set of related AWS resources in a predictable, orderly fashion (see the provisioning sketch after this list).
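As a sketch of what orderly, repeatable provisioning looks like in practice, the boto3 call below launches a CloudFormation stack from a template; the stack name and template file are placeholders.

```python
import boto3  # pip install boto3; AWS credentials must be configured

cf = boto3.client("cloudformation", region_name="us-east-1")

# Recreate the recovery environment from a version-controlled template.
with open("dr-environment.yaml") as f:  # placeholder template file
    template_body = f.read()

cf.create_stack(
    StackName="disaster-recovery-stack",  # placeholder name
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_IAM"],      # needed if the template creates IAM roles
)

# Block until every resource in the stack is up.
cf.get_waiter("stack_create_complete").wait(StackName="disaster-recovery-stack")
```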

Database

Just as with deployment orchestration, there are three AWS database services that can be leveraged as you create a sustainable disaster recovery framework:

  • Amazon Redshift: This cost-effective, fully managed, fast, petabyte-scale data warehouse service is particularly suited to the preparation phase of your disaster recovery strategy. It is effective for data analysis and can be used to duplicate your entire data warehouse and store the copy in Amazon S3.
  • Amazon DynamoDB: This fully managed, fast NoSQL database service, with single-digit-millisecond latency, can likewise be leveraged in the preparation phase to duplicate data to Amazon S3 or to DynamoDB in another region.
  • Amazon Relational Database Service (RDS): As its name suggests, this is a user-friendly service for setting up, operating, and scaling relational databases in the cloud. It can be used in the recovery phase to run the production database, or in the preparation phase to keep vital data in a running database (see the snapshot-copy sketch after this list).
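For the preparation phase, a common pattern is copying database snapshots into the recovery region on a schedule. The sketch below does this for RDS with boto3; the snapshot identifiers, account number, and regions are placeholders.

```python
import boto3

# Run in the *destination* (recovery) region; the source region is us-east-1.
rds = boto3.client("rds", region_name="us-west-2")

rds.copy_db_snapshot(
    # Placeholder ARN of a snapshot taken in the primary region.
    SourceDBSnapshotIdentifier=(
        "arn:aws:rds:us-east-1:123456789012:snapshot:prod-db-snapshot"
    ),
    TargetDBSnapshotIdentifier="prod-db-snapshot-dr-copy",
    SourceRegion="us-east-1",  # boto3 presigns the cross-region copy request
)
```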

Networking

Managing and modifying network settings is imperative if you need to shift smoothly to a secondary system in case of a disaster. Some of the primary AWS networking features and services that help here include:

  • AWS Direct Connect: This service eases the process of building a dedicated network connection between your organization and Amazon Web Services. In a disaster, this strategy increases bandwidth throughput, reduces network costs, and provides a more consistent network experience than internet-based connections.
  • Amazon Virtual Private Cloud (VPC): This service allows you to provision an isolated, private section of the AWS cloud where you can manage and operate resources within a virtual network you define. In a disaster, you can use it to extend your existing network topology into the cloud.
  • Elastic Load Balancing (ELB): ELB distributes incoming application traffic across multiple EC2 instances. It can simplify the execution of your disaster recovery plan if you pre-allocate the load balancer, so that its DNS name is known in advance (see the health-check sketch after this list).
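During failover, you would typically verify that the recovery instances registered behind the load balancer are passing health checks before cutting traffic over. A boto3 sketch (the target group ARN is a placeholder):

```python
import boto3

elb = boto3.client("elbv2", region_name="us-west-2")

# Placeholder ARN of the target group fronting the recovery instances.
TARGET_GROUP_ARN = (
    "arn:aws:elasticloadbalancing:us-west-2:123456789012:"
    "targetgroup/dr-app/0123456789abcdef"
)

health = elb.describe_target_health(TargetGroupArn=TARGET_GROUP_ARN)
for desc in health["TargetHealthDescriptions"]:
    target = desc["Target"]["Id"]
    state = desc["TargetHealth"]["State"]  # e.g. "healthy", "unhealthy"
    print(target, state)
```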

Regional Bases

To safeguard their data, many organizations choose to store their primary backups at sites located far away from their main physical environments. If an earthquake or a large-scale malware outbreak hit the United States, for example, businesses with secondary servers positioned outside the country would have a better chance of recovering than ones without.

Amazon Web Services has servers spread across the globe to cater to such clientele, so you can choose to place your disaster recovery data in a separate region from where your primary system is positioned. Regions include the Americas, EMEA, and Asia Pacific. Due to the sensitivity of government data, there are also special regions available only to government organizations, and separate regions for China.

With these features, AWS has proven to be one of the most capable disaster recovery providers on the market. This list, however, is not exhaustive; there are many other features whose use depends on a user’s disaster recovery strategy. For a fully optimized disaster recovery framework, an organization should consult an expert to analyze its potential risks and then draft a comprehensive disaster recovery plan with all the requisite AWS features.

Author: Davis Porter