Archive for the ‘Security’ Category

Cloud Security Threats That Will Keep CISOs Busy in 2018

 

As cloud computing continues to strengthen its hold over the enterprise IT services market, concerns about organizational readiness to address the growing security challenges keep escalating. Invariably, the shared, on-demand nature of cloud services gives rise to security risks. Whether it's a general expansion of the exposed threat surface or a very specific cloud computing security issue, 2018 will require enterprises to lean heavily on their CISOs to manage the growing risks. In this guide, we've covered the most pressing of these concerns so you can understand them and plan your enterprise's cloud security strategy around them.

Lack of Understanding of Shared Security Responsibilities

One of the realizations that hits CISOs hardest is that their cloud service provider is not 100% responsible for the complete security of the workload. Enterprises tend to believe that because their workloads are managed in the cloud, they can simply forget about securing them. The truth, however, is that cloud service providers are under no obligation to secure a workload beyond what the contract specifies. Data retention, backup, security, and resilience all fall within the enterprise's share of cloud security responsibility, not the vendor's. CISOs would do well to understand their cloud vendor's shared security responsibility model. Almost always, companies need to implement extended and additional security measures to secure their cloud data.

Insiders with Malicious Intents

All it takes is one disgruntled employee to bring down an enterprise's IT systems; that's sad but true. Because the average enterprise works with more than a few cloud service vendors, its employees have just as many cloud-based applications to work with. Single sign-on is a practical option for companies. However, it also means that malicious insiders can use their position to access and tamper with applications.

To make sure that their cloud apps remain secure, enterprises need to invest in identity and access management processes and capabilities. They also need to work with cloud vendors to implement behavior-analysis-based alert mechanisms that can identify suspicious user behavior, trigger alerts, and block access upon detection.
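A behavior-based alert mechanism of the kind described above can start very simply. As a hedged illustration (the window size, threshold, and function names are all hypothetical, not from any particular product), here is a sliding-window detector that flags a user after repeated failed logins:

```python
from collections import defaultdict, deque
import time

# Illustrative thresholds only; real systems tune these per risk profile.
WINDOW_SECONDS = 300   # look at the last 5 minutes
MAX_FAILURES = 5       # failures inside the window before alerting

_failures = defaultdict(deque)  # user -> timestamps of recent failed logins

def record_failed_login(user: str, now: float = None) -> bool:
    """Record a failed login; return True if the user should be flagged."""
    now = time.time() if now is None else now
    window = _failures[user]
    window.append(now)
    # Drop failures that have fallen out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) >= MAX_FAILURES
```

In practice, a production system would also weigh signals like geolocation, device fingerprint, and time-of-day, but the core idea of thresholding anomalous activity over a window is the same.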

Failure in Due Diligence while Hiring and Onboarding New Cloud Service Vendors

More and more business applications are now being delivered via the cloud. This obviously means that IT managers will find themselves in boardrooms, being pitched dozens of cloud solutions.

Here, due diligence can make or break the enterprise-vendor relationship. CISOs have a very clear role to play here and must be closely involved in the IT vendor onboarding process. Now is the perfect time to start building a thorough checklist of prerequisites that vendors must meet to qualify for your company's business. CISOs must also work alongside their counterparts at active vendors to ensure the right fit between systems on both sides.

A missed step in the due diligence before signing off a contract with a cloud vendor could come back to haunt your company very soon.

Human Errors

Though enterprises strive to make their IT and business applications immune to user errors, the risks remain real. Users of cloud-based applications are also permanently on cybercriminals' radar. CISOs have to ask themselves: are end users sufficiently protected against phishing and social engineering attacks?

To make sure that a naive employee doesn’t end up being the cause of an application outage, CISOs need to lead IT efforts towards improving the cybersecurity knowledge of end users.

Insecure Application Programming Interfaces

Application programming interfaces (APIs) are key enablers of integration between cloud services and the on-premise and third-party applications a business uses. In the recent past, there has been a lot of focus on delivering advanced APIs that let enterprises self-service their requests. APIs are also the system components through which users manage monitoring, management, and provisioning.

In 2018, the range and capabilities of APIs are expected to expand, bringing more enterprise IT consultants and technicians within the purview of API-relevant user groups. This, however, must be done under the close oversight of the CISO or one of his/her close aides. The reason: APIs invariably contribute to the threat surface of enterprise cloud infrastructure. Companies need specific additional measures to prevent deliberate or accidental attempts to circumvent policies.

Account Hijacking

Though account hijacking is not something specifically associated with cloud computing, it’s certain that cloud computing does add a lot to the threat surface area. The reason is that cloud services are accessed via user accounts, and each new account becomes a risk variable in the cloud security equation. If hackers are able to hijack a user account, they can use the credentials to:

  • Record transaction information
  • Manipulate data
  • Eavesdrop on business communications
  • Redirect users to suspicious websites
  • Execute advanced phishing attacks on hundreds of owners of similar accounts
  • Access critical cloud computing settings and configurations
  • Block legitimate access requests
  • Return false information to data requests

Advanced Persistent Threats

Like parasites, some cyber attacks persist for long durations, attempting to infiltrate target systems and establish a stronghold within the victim's IT processes and workloads. The worst part of APT attacks is that they stealthily adapt to evolving security measures and can alter their behavior accordingly. Once APTs become part of a system, they can move laterally and start stealing information from cloud workloads.

Concluding Remarks

As more data and applications move to the cloud, the enterprise CISO's role in ensuring security becomes crucial. 2018 will throw all kinds of security challenges at enterprises, particularly around cloud infrastructure. The threats covered in this guide are the ones that most warrant the CISO's attention.

Personal Data Breach Response Under GDPR


Data security is at the heart of the upcoming General Data Protection Regulation (GDPR). It sets strict obligations on data controllers and processors in matters pertaining to data security while also providing guidance on best data security practices. And for the first time, the GDPR introduces specific breach notification guidelines. With only a few months to go until the new regulation comes into effect, businesses should begin focusing on data security, not just because of the costs and reputational damage a personal data breach can cause, but also because the GDPR establishes a breach notification regime that mandates the reporting of certain data breaches to affected individuals and data protection authorities.

What Constitutes a Personal Data Breach Under GDPR?

The GDPR defines a personal data breach as a security breach that leads to the unlawful or accidental loss, destruction, alteration, or unauthorized disclosure of personal data that is stored, processed, or transmitted. A personal data breach is by all means a security incident; however, not all security incidents are subject to the same strict reporting requirements as a personal data breach. Despite the broad definition, this distinction is not unusual in data security laws that require breach reporting; HIPAA, for example, makes the same distinction at the federal level for medical data. It aims to prevent data protection regulators from being overwhelmed with breach reports.

By limiting breach notifications to personal data (EU parlance for personally identifiable information, or PII), incidents that solely involve the loss of company data or intellectual property will not have to be reported. The threshold for whether an incident must be reported to a data protection authority depends on the risk it poses to the individuals involved. High-risk situations are those that can potentially lead to significant detriment, for example financial loss, discrimination, damage to reputation, or any other significant social or economic disadvantage.

…it should be quickly established whether a personal data breach has occurred and to promptly notify the supervisory authority and the data subject.

– Recital 87, GDPR

If an organization is uncertain about who has been affected, the data protection authority can advise and, in certain situations, instruct it to immediately contact the affected individuals if the security breach is deemed to be high risk.

What Does The GDPR Require You to Do?

Under the GDPR, the roles and responsibilities of data controllers and processors are clearly separated. Controllers are obliged to engage only processors capable of providing sufficient assurances that they will implement appropriate organizational and technical measures to protect the rights of data subjects. In the event of a data breach that affects the rights and freedoms of those data subjects, the organization must report it without undue delay and, where feasible, within 72 hours of becoming aware of it.

The data processor is required to notify the controller the moment a breach is discovered, but has no other reporting or notification obligation under the GDPR. Note, however, that the 72-hour deadline begins the moment the processor becomes aware of the data breach, not when the controller is notified of the breach. A breach notification to a data protection authority must at least:

  1. Have a description of the nature of the breach, which includes the categories and number of data subjects affected.
  2. Contain the data protection officer’s (DPO) contact information.
  3. Have a description of the possible ramifications of the breach.
  4. Have a description of steps the controller will take to mitigate the effect of the breach.

The information can be provided in phases if it is not available all at once.
If the controller determines that the personal data breach can potentially put the rights and freedoms of individuals at risk, it has to communicate information about the breach to the data subjects without undue delay. The communication should plainly and clearly describe the nature of the personal data breach and at least:

  1. Contain the DPO’s contact details or a relevant contact point.
  2. Have a description of the possible ramifications of the breach.
  3. Have a description of measures proposed or taken to mitigate or address the effects of the breach.
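The required contents of a notification can be captured in a simple internal record. The sketch below is illustrative only; the field names are hypothetical and not prescribed by the regulation, which only lists the categories of information that must be conveyed:

```python
from dataclasses import dataclass, field

@dataclass
class BreachNotification:
    """Illustrative record of the items a GDPR breach notification must cover."""
    nature_of_breach: str            # what happened, categories of data involved
    data_subjects_affected: int      # approximate number of individuals
    records_affected: int            # approximate number of records
    dpo_contact: str                 # DPO or other relevant contact point
    likely_consequences: str         # possible ramifications for data subjects
    mitigation_measures: list = field(default_factory=list)

    def is_complete(self) -> bool:
        """Information may be supplied in phases; check what is still missing."""
        return all([self.nature_of_breach, self.dpo_contact,
                    self.likely_consequences, self.mitigation_measures])
```

Because the GDPR allows the information to be provided in phases, tracking completeness this way helps a controller know what still needs to be sent to the supervisory authority.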

The only exception is if the personal data has been encrypted and the decryption key has not been compromised; in that case, there is no need for the controller to notify the data subjects.

The ideal way for companies to handle this GDPR obligation is not only to minimize breaches, but also to establish policies that facilitate risk assessment and demonstrate compliance.

The GDPR stipulates that records must be kept of every personal data breach, regardless of whether the breach needs to be reported. These records have to contain the details of the breach, its consequences and effects, and the follow-up actions taken to remedy the situation.

Should Ransomware Attacks Be Reported?

Ransomware typically involves the ‘hijacking’ of corporate data via encryption, with payment demanded in order to decrypt the ransomed data. Under the GDPR, a ransomware attack may be categorized as a security incident, but it does not necessarily cross the threshold of a personal data breach. A ransomware attack would only be considered a personal data breach if there is no backup at all, or if there is a backup but the outage directly impacts users' rights and freedoms. In principle, a ransomware attack where the ransomed data can be quickly recovered does not have to be reported.

What Are the Consequences of Non-Compliance?

A failure to comply with the GDPR's breach reporting requirements will not only result in negative PR, constant scrutiny, and possibly loss of business; it will also attract an administrative fine of up to €10 million or up to two percent of total global annual turnover for the preceding financial year. Additionally, failure to notify the supervisory authority may be indicative of systematic security failures, which would constitute an additional breach of the GDPR and attract further fines. The GDPR does list factors the supervisory authority should consider when imposing fines, chief among them the degree of cooperation the data controller shows the protection authority.

In Closing

Data breach notification laws are already firmly established in the U.S. These laws are designed to push organizations to improve their detection and deterrence of data breaches. The regulators' intention is not to punish, but to establish a trustworthy business environment by equipping organizations to deal with security issues.

Author: Gabriel Lando

image courtesy of freepik

FileCloud Announces Integration With Duo – Enhances the Security With 2FA

 

You’ve probably been investigating 2-Factor Authentication (2FA) more recently. With each new data breach in the news, you increasingly realize that security doesn’t end with strong passwords.

Two-factor authentication, also known as 2FA, is a two-step verification method that requires a username/password plus a second method of verification. Duo is a cloud-based SaaS service that can enforce 2FA across an organization and simplifies the management of end users and their 2FA devices. This support allows FileCloud 2FA management via Duo for clients who already use Duo to manage their other enterprise applications.

FileCloud is trusted by thousands of organizations to store critical files and data. Since FileCloud deals with mission-critical business data, we consider security the most important vector. FileCloud already offers 2FA through Google and mail authentication. With the new version of FileCloud, you can integrate with Duo to offer 2FA and enhance security when users access FileCloud. Duo adds an extra layer of protection to your FileCloud account: once enabled, FileCloud will require a passcode in addition to your user ID/password whenever you log in.

What is 2FA?

2FA adds an extra layer of protection to user logins by combining “something you know” (your login credentials and password) with “something you possess” (a one-time passcode). Many consumer email and online banking applications now incorporate this additional layer of account security. For most applications using 2FA, users retrieve a passcode from their smartphone or another smart device in order to access their account.
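The “something you possess” factor is usually a one-time passcode derived from a secret shared between the server and the user's device. Many authenticator apps implement the standard TOTP scheme (RFC 6238), which is just an HMAC over a counter derived from the current time. A minimal sketch, presented for illustration rather than production use:

```python
import hashlib, hmac, struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation: low nibble picks the window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, at: float, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password: HOTP over the current 30-second window."""
    return hotp(secret, int(at // step), digits)
```

Because both sides derive the code from the shared secret and the clock, a passcode intercepted after its 30-second window is useless to an attacker.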

While SSO is convenient for users, it presents new security challenges. If a user’s primary password is compromised, attackers may be able to gain access to multiple resources. In addition, as sensitive information makes its way to cloud-hosted services, it is even more important to secure access by implementing two-factor authentication.

Are text-based 2FAs enough?

The problem with 2FA is that often a distinction isn’t made between SMS-based 2FA, which sends a code to the user via text, and 2FA that requires a user to respond to a push verification sent to a specific physical device.

Text-based 2FA spreads out the potential attack surface. Instead of a code being sent to one place, like a purpose-built smartphone app or a separate authenticator device, it is distributed through a set of services that might have their own vulnerabilities. True two-factor authentication, the good kind, sends a verification prompt to exactly one place: the device you're holding in your hand.

Duo Security and how it works with FileCloud

We know that the most effective security solution is one your users actually use. Duo is an industry leader that gives users multiple options for completing 2FA. Duo's solution only requires users to carry one device, their phone, with the Duo Mobile app installed. Duo Mobile is available for iPhone, Android, and more. Duo makes it simple to protect many different apps through its Auth API, as long as the apps support Duo. With a Duo Enterprise plan, you can protect the sign-in process of both on-premises and cloud apps.

FileCloud can be set up to use the Duo security service to perform 2FA. After integrating Duo with FileCloud, users install the Duo app on their smart device, which provides them with a passcode. Users then need their ID, password, and the generated passcode to log in to their FileCloud account. With this added security, your employees can collaborate and store files in an encrypted cloud drive, share data securely within the network or with outsiders, and much more. With Duo 2FA enabled, your business data is protected at the highest possible security level with FileCloud.

Here’s how you can integrate Duo with FileCloud

  1. Add the Duo Auth API

Get the integration key, secret key, and API hostname from Duo: https://duo.com/docs/authapi

Enter the information in Admin Portal → Settings → Misc → Duo Security tab under Auth API Security Settings and save.

  2. Add the Duo Admin API

Follow the instructions at https://duo.com/docs/adminapi to get the integration key, secret key, and API hostname.

Ensure it has the “Grant read resource” permission.

Enter the information in Admin Portal → Settings → Misc → Duo Security tab under Admin API Security Settings and save.

  3. Open the Policies tab and select the policy (select the Global policy if 2FA should be the default)
  4. Open the 2FA tab of the policy
  5. Select “YES” to Enable Two Factor Authentication
  6. Select “Duo Security” as the Two Factor Authentication Mechanism and save the policy

 

When a user tries to log in to the VPN, the VPN receives the request and communicates with Duo, which sends a request to the user's mobile device, the second factor. When the user confirms on that second device, Duo communicates back to the VPN, and only then is the user allowed to access the network.

To learn more about this feature and how to integrate it with your FileCloud account, click here.

Personal Data, PII and GDPR Compliance


 

The countdown to the European Union's General Data Protection Regulation (GDPR), which goes into full effect in May 2018, is coming to a close. The GDPR aims to solidify the data privacy rights of EU residents and the requirements on organizations that handle customer data. It introduces stern fines for data breaches and non-compliance while giving people a voice in matters concerning their data, and it will homogenize data protection rules throughout the EU. The current legislation, the EU Data Protection Directive, was enacted in 1995, before cloud technology developed innovative ways of exploiting data; the GDPR aims to address that. By enacting strict regulations and stiffer penalties, the EU hopes to boost trust in a growing digital economy.

Despite the fact that the GDPR came into force on 24 May 2016, organizations and enterprises have until 25 May 2018 to fully comply with the new regulation. A snap survey of 170 cybersecurity professionals by Imperva revealed that while the vast majority of IT security professionals are fully aware of the GDPR, less than 50 percent of them are getting everything ready for its arrival. It went on to conclude that only 43 percent are assessing the impact the GDPR will have on their company and adjusting their practices to comply with data protection legislation. Even though most of the respondents were based in the United States, they are still likely to be affected by the GDPR if they solicit and/or retain (even through a third party) EU residents' personal data.

Remaining compliant with the GDPR demands, among several other things, a good understanding of what constitutes ‘personal data’ and how it differs from ‘personally identifiable information’, or PII.

What is Personal Data In the GDPR Context?

The EU's definition of personal data in the GDPR is markedly broad, more so than in current or past data protection laws. Personal data is defined as data about an identified or identifiable individual, whether identified directly or indirectly. It now includes any information that relates to a specific person, whether the data is professional, public, or private in nature. To mirror the various types of data organizations currently collect about users, online identifiers like IP addresses are categorized as personal data. Other data such as transaction histories, lifestyle preferences, photographs, and even social media posts are potentially classified as personal data under the GDPR. Recital 26 states:

To determine whether a natural person is identifiable, account should be taken of all the means reasonably likely to be used, such as singling out, either by the controller or by another person to identify the natural person directly or indirectly. To ascertain whether means are reasonably likely to be used to identify the natural person, account should be taken of all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments.

This definition of personal data applies directly across the European Economic Area (EEA).

Is Personally Identifiable Information (PII) the Same as Personal Data?

The term ‘personally identifiable information’ doesn't appear anywhere in the GDPR; however, it does have a definite meaning in US privacy law, so the term in itself is likely to cause confusion for anyone seeking to comply with the GDPR. For a concept that has become ubiquitous in both technological and legal discourse, PII is surprisingly hard to define. In a nutshell, PII refers to any information that can be used to distinguish one individual from another, including any information that can be used to re-identify anonymous data. It can refer solely to data that is regularly used to authenticate or identify an individual, as opposed to information that reveals sensitive details about an individual. The US interpretation of the term is undeniably incongruous with what is relevant for a proper GDPR assessment, since it pre-selects a set of identifying traits.

To put it bluntly, all PII can be considered personal data, but not all personal data is personally identifiable information. Developing a solid GDPR compliance program demands that IT architects and marketers move beyond the restricted scope of PII to examine the full spectrum of personal data as defined by the EU.

Handling Personal Data in Accordance With GDPR

The first step to GDPR compliance in matters pertaining to personal data is undoubtedly a risk assessment of how existing data is stored and accessed, the level of risk attached to it, and whether it contains any PII. The data might be stored on server file systems, in databases, or even on an end user's physical storage or cache. Becoming GDPR compliant means not only protecting more data types in the future, but also expending more effort on identifying existing data that wasn't initially considered personal data. It is important to note that you cannot limit your scope to the data you hold as if it were a closed system. Nowadays, people typically interact with interconnected systems, and the GDPR mirrors that. In such scenarios, organizations should focus outward and consider who in their ecosystem could connect one attribute to another, across the many paths to re-identification within that ecosystem.
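A first pass over existing data stores can be automated. As a minimal sketch (the patterns below are illustrative and far from exhaustive; a real assessment must also cover names, addresses, national ID formats, and free-text context), a scanner might flag a few common PII patterns:

```python
import re

# Illustrative patterns only; real PII discovery needs far broader coverage.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ipv4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "phone": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}

def scan_for_pii(text: str) -> dict:
    """Return every match for each pattern found in the given text."""
    hits = {name: pat.findall(text) for name, pat in PII_PATTERNS.items()}
    return {name: found for name, found in hits.items() if found}
```

Running such a scan over file shares and database exports gives a rough inventory of where personal data lives, which then feeds the risk assessment described above.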

Additionally, the GDPR requires documented ‘opt-in’ consent from each individual. The consent has to explicitly state what data is collected, how it will be used, and how long it will be retained. Organizations also have to give participants the option to withdraw their consent at any time and to request that their personal data be permanently deleted. Participants should be able to have factual errors amended, and even to request their personal data for review and use.

FileCloud Can Help You Comply With GDPR

The General Data Protection Regulation sets a new standard in the protection of personal data. Its efforts aim to grant data subjects more control over their data while ensuring the transparency of operations. FileCloud provides a set of simple features that can help organizations meet GDPR requirements.

Click here for more information.

Author: Gabriel Lando

Image courtesy of freepik.com

Blockchain as a Service (BaaS) for Enterprise – Jump on the Bandwagon?


The democratization of high-speed Internet, coupled with the development of distributed information exchanges, gave rise to blockchain technology. Blockchain is the underlying technology that powers the cryptocurrency Bitcoin; however, its uses transcend that. Simply put, a blockchain is a public, shared, distributed ledger that stores the complete transaction history of different types of records. The validity, uniqueness, and integrity of the stored data are preserved without the need for a trusted third party to verify them. As such, blockchain has piqued the interest of several enterprises, especially those in the finance and banking industries. Large tech players such as Microsoft and IBM have begun exploring the opportunities blockchain presents in the form of Blockchain as a Service (BaaS) solutions that incorporate blockchain technologies into their cloud offerings.

It's no secret that the industry of blockchain-based companies is still relatively young. Its future is currently being shaped by experimentation and R&D partnerships between large corporations and start-ups. The main driver behind the rise of blockchain apps, especially in the enterprise, is directly linked to time and cost efficiencies that are still far from optimal in most industries.

“ Blockchain holds the promise to fundamentally transform how business is done, making business-to-business interactions more secure, transparent, and efficient”

– Amit Zavery, senior VP of Oracle Cloud Platform

What is Blockchain as a Service?

A Deloitte survey conducted towards the end of 2016 concluded that blockchain technology would become a crucial business focus for most industries in 2017. The survey, which involved 308 senior executives knowledgeable about blockchain, found that most of them placed blockchain among their organizations' highest priorities. 36 percent were convinced blockchain has the potential to significantly enhance system operations, by either increasing speed or reducing costs; 37 percent recognized blockchain's formidable security features as the main benefit; and the remaining 24 percent were of the opinion that it has the potential to facilitate new revenue streams and business models. While there is a consensus among enterprise tech decision makers that blockchain has immense potential to reshape entire industries, the adoption path is not as clear or direct.

Building enterprise solutions powered by blockchain is not a simple undertaking. The setup and subsequent operation of a blockchain environment involves major development and infrastructure challenges. Blockchain as a Service (BaaS) is an intriguing trend in the blockchain ecosystem that aims to ease adoption for enterprises. The idea behind it is that customers can leverage blockchain cloud solutions to create a network of their own applications and smart contracts while the cloud provider handles all the heavy lifting needed to keep the infrastructure operational.

BaaS provides blockchain capabilities as a first-class Platform as a Service (PaaS) offering. From a functional perspective, a BaaS model enables developers to create solutions that effortlessly combine the capabilities of blockchain with typical infrastructure and platform services like storage, messaging, middleware, and the other functional building blocks of complex software solutions. Additionally, BaaS offers a seamless model for managing and scaling a blockchain topology without deploying any proprietary infrastructure.

BaaS Market Outlook

Blockchain has gained a lot of momentum over the past few years, with good reason. As of February 2017, it was the second most-searched term on Gartner's site, after a 400 percent increase in the prior 12 months. This shows exponentially increasing interest in a rapidly developing market. The entire blockchain market is predicted to grow at an annual rate of 61.5 percent through 2021, with immutability and transparency as the driving factors behind the growth. Another thing aiding the expansion of blockchain's reach has been the proliferation of Blockchain as a Service (BaaS) solutions from major providers.

The major BaaS players include:

Microsoft

Microsoft first launched Azure BaaS in November 2015. In 2016 it furthered its efforts with the Project Bletchley blockchain middleware/templates, aimed at helping partners and customers build private consortium Ethereum networks. Microsoft is trying to help businesses figure out the best way to build on top of BaaS with Enterprise Smart Contracts. The blockchain framework and middleware were created to help enterprises integrate and build distributed applications. Since Azure is a scalable, flexible, and open platform, Microsoft claims to support a growing number of distributed ledger technologies that meet specific technical and business needs for performance, security, and operational processes. It also claims that the Cortana Intelligence service can provide unique data analysis and management capabilities.

IBM

IBM's BaaS offering is based on the Linux Foundation's Hyperledger Fabric. Hyperledger is an open-source, cross-industry effort to bring blockchain to the enterprise; by utilizing it, IBM hopes that developers will be able to rapidly build and host secure blockchain networks through the IBM cloud. To solidify security, IBM Blockchain is underpinned by IBM LinuxONE, a security-focused Linux server. The IBM Blockchain Platform claims to be the only fully integrated enterprise blockchain platform built to accelerate the development, governance, and operation of multi-institution business networks. IBM plans to offer a framework for corporate blockchain networks that automatically scales as members are added. The company states that its blockchain platform will be capable of supporting large user ecosystems and high transaction rates.

Oracle

Shortly after joining the Linux Foundation's Hyperledger project, Oracle added Blockchain as a Service to its cloud offering. The plan to launch the service was initially announced when it joined Hyperledger in August 2017. Its goal at the time was to provide an advanced, differentiated, enterprise-grade distributed cloud ledger platform for customers looking to create new blockchain-based applications and/or extend their current IaaS, PaaS, SaaS, and on-premise applications.

In Closing

Conspicuously missing from the list of major BaaS providers is AWS. In 2016, AWS announced a collaboration with the New York City-based Digital Currency Group (DCG) to provide a blockchain-as-a-service experimentation environment for enterprises, so that the blockchain providers in the DCG portfolio could work with their clients, including insurance companies and financial institutions, in a secure environment. However, this strategy has yet to result in a new BaaS platform within AWS. Scott Mullins, AWS head of worldwide financial services business development, says the company is working closely with blockchain providers and financial institutions to spur innovation while facilitating frictionless experimentation. Google has also been relatively quiet in matters concerning blockchain. However, considering the direction other PaaS incumbents are taking, we are likely to see BaaS capabilities incorporated into Google Cloud in the future.

 

Author: Gabriel Lando

Blockchain Beyond Crypto-currencies

Blockchain can disrupt cloud computing

Blockchain goes beyond crypto-currencies

On the 31st of October 2008, the still mysterious Satoshi Nakamoto (probably a pseudonym for an individual or group) published a white paper introducing the concept of a peer-to-peer digital cash system referred to as Bitcoin. Bitcoin marked a radical shift in the finance industry, offering enhanced security and transparency by authenticating the peers that share the virtual cash, generating hash values, and using encryption. The global financial industry predicts that the market for security-based blockchain will grow to roughly $20 billion by 2020.

More Than Just Crypto-currencies

Blockchain is widely known for powering crypto-currencies; it is the data structure that enables Bitcoin (BTC) and other upcoming digital currencies like Ether (ETH) to burgeon via a combination of decentralized encryption, immutability, anonymity, and global scale. However, its uses go way beyond that.

In a nutshell, blockchain refers to a continuously updated record of who holds what.

A blockchain is a distributed data repository or ledger that is decentralized and available for everyone to see and verify. To understand it in the context of a trust economy, you can equate it to the public ledgers that towns once used to record important things like transfers of property deeds or election results. Blockchain simply utilizes advanced cryptography and distributed programming to achieve similar results. What you have in the end is a system with trust inherently built into it: a transparent, secure, immutable repository of truth, built to be highly resistant to manipulation, outages, and unnecessary complexity. This consistent record of truth is made possible by the shared and cryptographic nature of the ledger.
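The tamper-evident ledger described above can be illustrated with a toy hash chain (a minimal Python sketch, not a production blockchain; there is no networking or consensus here):

```python
import hashlib
import json

def block_hash(block):
    """Deterministically hash a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, record):
    """Append a record, linking it to the hash of the previous block."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"record": record, "prev_hash": prev}
    block["hash"] = block_hash({"record": record, "prev_hash": prev})
    chain.append(block)
    return chain

def verify(chain):
    """Re-derive every hash; any tampering breaks a link somewhere."""
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        if block["hash"] != block_hash({"record": block["record"],
                                        "prev_hash": block["prev_hash"]}):
            return False
        prev = block["hash"]
    return True
```

Changing any historical record changes its hash and breaks every link after it, which is what makes the ledger self-verifying.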

Public perception of blockchain mainly revolves around crypto-currencies. Most people are put off by its perceived technological complexity, dismissing it as something for the intellectual tech-savvy, but its basic concept is universal and simple, and its potential is nothing short of revolutionary: from financial ledgers and contracts to monitoring and securing all manner of data in the next generation of distributed applications.

Blockchain in the Enterprise

Blockchain is creating waves in the enterprise software market, with companies like Microsoft and IBM leveraging it in developer environments, cloud platforms, Internet of Things (IoT) technology, and more. Ethereum’s blockchain tech has largely been the gateway; nonetheless, tech giants are now firmly in the blockchain business. The collective finance and banking industry is also adopting blockchain transactions in the form of smart contracts.

Blockchain was listed as one of the top trends in the Gartner hype cycle for 2017. The hype cycle takes a close look at technologies that have the potential to significantly increase a company’s competitive edge. According to Gartner, this technology will lead to the reformation of entire industries in the long term. Companies at the forefront of disruption view blockchain as the driving force behind it. This was the key takeaway from a study of 3,000 executives, done by IBM’s Institute for Business Value, which examined the enterprise potential of blockchain. The survey concluded that 33% of the executives were considering or had already adopted it, and most were counting on it to provide a competitive advantage while creating a platform approach to innovation.

Potential In the Cloud

Cloud computing has been widely adopted in virtually every facet of IT, so one can’t help but wonder how the decentralization and security features of blockchain technology can be used to further enhance the cloud’s appeal. Whenever CIOs begin discussing moving critical applications to the cloud, terms like security, compliance, accountability, reliability, auditability, and acceptance of liability, among others, are thrown around. The main point of contention lies in the demand for a secure supply chain in which each step is verifiable in real time, so that when things go south it is possible to find out what went wrong and someone can be held accountable. Introducing blockchain into cloud computing creates a convenient service that offers enhanced security.

A Decentralized Cloud

A key characteristic of blockchain is that it was designed to be synchronized and distributed across networks. A blockchain-based decentralized cloud facilitates on-demand, low-cost, and secure access to some of the most competitive computing infrastructure, while protecting your files, both on the nodes and in transmission, through encryption and cryptography. A major reservation organizations have when migrating to the cloud is trusting third parties to secure sensitive, private data.

For most cloud experts, the biggest draw of blockchain is the elimination of intermediaries, mainly because a well-designed and publicly accessible blockchain can replace most of the functions intermediaries perform to ensure a secure, fraud-free environment. On a decentralized ‘blockcloud’, where data is stored on multiple individual nodes intelligently distributed across the globe, it is virtually impossible to cause meaningful disruptions.

Blockchains like Ethereum provide a different approach to running distributed applications. Using Ethereum, developers can write smart contracts: code that is executed on the blockchain’s virtual machine whenever a transaction fires. The Ethereum blockchain thereby provides a distributed run-time environment with distributed consensus over the execution.
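To make the idea concrete, here is a toy model (plain Python, purely illustrative; real Ethereum contracts run as EVM bytecode) in which every node deterministically replays the same transaction log through the same contract code, and agreement is checked by comparing state hashes:

```python
import hashlib
import json

def transfer_contract(state, tx):
    """A toy 'smart contract': a deterministic state transition per transaction."""
    balances = dict(state)
    sender, recipient, amount = tx["from"], tx["to"], tx["amount"]
    if balances.get(sender, 0) >= amount:  # rule enforced identically on every node
        balances[sender] = balances.get(sender, 0) - amount
        balances[recipient] = balances.get(recipient, 0) + amount
    return balances

def replay(contract, genesis, txs):
    """Every node replays the same log and hashes the resulting state."""
    state = genesis
    for tx in txs:
        state = contract(state, tx)
    digest = hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()
    return state, digest
```

Because the transition function is deterministic, any two honest nodes that replay the same log arrive at the same state hash; a node whose hash differs has diverged.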

In Closing

Blockchain’s power doesn’t lie solely in its heavy encryption; its distributed nature is what makes it hard to manipulate. It is essentially a sequential storage scheme that can verify itself, making it the ideal solution for immutably recording transactions and much more. While everyone remains fixated on the AI buzz, blockchain is a dark horse running under the radar.

Author: Gabriel Lando

Image Courtesy of Freepik

Data Security Questions Every Enterprise Should Ask

Over the past decade, cloud computing has transitioned from being a buzzword to a staple technology for most enterprises, driven mainly by the cloud’s accessibility and its superior flexibility and capacity compared to mainstream computing and storage techniques. However, just like mainstream data sharing and storage methods, cloud computing has its fair share of data security issues. Mitigating data security risks is essential to creating the level of comfort CIOs need to migrate data and applications to the cloud. The decision to transition to the cloud has to depend on how sensitive the data is and the security guarantees the cloud vendor provides.

Is your data safe in the hands of a cloud service provider?

In today’s exceedingly mobile world, enterprises are heavily relying on cloud vendors, and allowing remote access to more devices than ever before. The end result is a complex network that requires higher levels of security. The only way organizations can maintain the availability, integrity, and confidentiality of these different applications and datasets is by ensuring their security controls and detection-based tools have been updated to work with the cloud computing model. Whenever data is stored in the cloud, the main point of focus is typically the security of the cloud provider and hosting facility. However, this focus is usually at the expense of how the data itself is handled. This begs the question, do you trust the cloud vendor’s technology? Do you trust their employees? Do you trust their safeguards? Are you completely sure that if their back was against the wall they would not sell or compromise any of your data?

The fact of the matter remains that once you move your data to a public cloud platform, you can no longer exercise your own security controls. Outsourcing also introduces a costly threat to intellectual property in the form of digital information like engineering drawings, source code, etc. An organization has to give its cloud service provider access to important IP assets that are vital to its core business. Exposing invaluable information to third parties presents a significant security risk. In most cases, migrating to the cloud means you have no option but to trust the vigilance, knowledge, and judgment of your chosen vendor.

As cloud-based solutions like Dropbox and Google Drive become more popular in the business setting, enterprises have to come to grips with the fact that issues like loss of control over confidential data are a looming security threat. Even though cloud vendors implement several security measures to isolate tenant environments, the organization still loses some level of IT control, which equates to risk, as sensitive data and applications no longer reside within a private, physically isolated data center. Is the business value worth the risk?

Why is Metadata Security Important?

In a nutshell, metadata is data about data. The bigger question is whether or not metadata is personally identifiable: if enough of it is linked together, a detailed profile of an individual or organization can be created, enough to personally identify them. Most IT security experts agree that metadata typically contains sensitive information, hidden from obvious view but easily extractable. Metadata poses a great data-leak risk because employees are often not even aware it exists. Whenever a request is made to store or retrieve data from a cloud storage server, the request and subsequent response contain metadata about both the request and the data itself. Since the organization has little to no control over this metadata, there is no way to guarantee its security.
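As a toy illustration (the field names are hypothetical, not any vendor’s real log format), consider how little linking is needed before request metadata becomes identifying:

```python
# Hypothetical metadata records captured alongside routine requests
# to a cloud storage server. No file contents are included.
events = [
    {"ip": "203.0.113.7", "device": "Windows 10 / FileSync 2.1",
     "path": "/finance/q3-forecast.xlsx", "time": "2017-11-02T08:14"},
    {"ip": "203.0.113.7", "device": "Windows 10 / FileSync 2.1",
     "path": "/hr/salary-review.docx", "time": "2017-11-02T08:17"},
]

def link_profiles(records):
    """Group records by shared identifiers; enough linked metadata can
    identify a person without ever touching the files themselves."""
    profiles = {}
    for r in records:
        profiles.setdefault((r["ip"], r["device"]), []).append(r["path"])
    return profiles
```

Two "anonymous" requests sharing an IP address and device string collapse into a single profile that reveals who is working on what, and when.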

What Happens in the Event of a Data Breach?

As cloud adoption rates increase, cloud providers are increasingly becoming attractive targets for cybercriminals because of the huge amounts of data stored on their servers. Access to unencrypted metadata alone is enough to count as a full-fledged breach. The severity of a data breach depends on the sensitivity of the data being exposed; breaches that involve trade secrets, health information, and intellectual property are usually the most damaging. It is worth noting that cloud vendors are not subject to the same data breach disclosure laws as federal agencies, banks, and other entities, so if a breach does occur, it may never be publicized or associated with the vendor.

Despite numerous efforts by public cloud providers to implement stringent security measures to curb the risk of data breaches, the burden of responsibility for data security ultimately falls on the organization, and a breach will have critical financial and legal consequences.

Who Controls Your Data?

Ensuring that the data and applications residing in the cloud are kept safe is becoming more crucial as high-value data, mission-critical applications, and intellectual property are transferred to the cloud. Although cloud computing in general can be perceived as less secure, the fear of cloud security is situational. The real conundrum shouldn’t be whether or not to migrate to the cloud, but which cloud to migrate to. From a security standpoint, most cloud service providers are not ready. Using unsecured cloud vendors can expose sensitive corporate data without your organization even realizing it. Commercially and legally, enterprises have to maintain control over their data, while customers and employees need to be able to freely collaborate on, share, and sync the files they require. The solution is simple: a private cloud.

Private Cloud Offers a Better Alternative

A private cloud computing model facilitates control and collaboration while protecting confidential data from unauthorized access. IT stakeholders need to have a detailed understanding of where and how data is being stored and transferred. With a self-hosted cloud deployment for critical data, you have maximum control, integration, and configuration of all the layers of security.

  • Flexible Infrastructure

A cloud deployment is considered private when it is hosted on the organization’s servers. However, that does not necessarily mean the servers are hosted on-premises. By going the self-hosted route, companies are able to choose whether they want to house their files on-premises or in a remote data center. While on-premises infrastructure has the added advantage of more control and ownership, you will also be responsible for capacity planning. Given the costs associated with operating a data center and the redundancy required to maintain 100 percent network and power uptime, organizations can opt to leverage a hosted private cloud in the form of Infrastructure as a Service (IaaS) or Platform as a Service (PaaS).

This model allows the organization to have a scalable, isolated computing environment that has been custom-designed to meet its specific workload requirements, in the jurisdiction of its choice. A good example is AWS’ VPC, which provides cloud hosting capabilities with enterprise-grade IT infrastructure through a virtualized network of interconnected virtual servers. GovCloud likewise allows US government agencies to host private clouds in secure regions operated by U.S. citizens and accessible only to vetted U.S. entities.

In a nutshell, a private cloud allows organizations to develop a flexible infrastructure to deliver applications while retaining control and managing the risk of the services delivered to business partners, users, and customers.

  • Maximum Control

A private cloud deployment gives you control over security, privacy, and compliance. You can manage all your applications, IT services, and the infrastructure in one place using powerful tools like application and performance monitoring, VM templates, and automated self-service deployment. Since you have the control from the ground up, you will not be forced to adjust your security processes to meet those of the cloud; instead, you will bend the cloud to your will. A self-hosted cloud lets you leverage your current security infrastructure and procedures and easily integrates with existing tools. It simply works within your set framework; and when your data requirements scale, you will have the ability to scale with them.

The physical location of the data center plays a crucial role in cloud adoption. A private cloud creates the opportunity to choose the region where data will be stored. By controlling your selection of hosting provider and data center, you know precisely where your servers are located and under which nation’s data laws they are governed. Organizations may be obliged, or may simply prefer, to store data in a jurisdiction or country that is not offered by a public cloud provider.

In Closing

A private cloud expands visibility into workloads and cloud operations, enabling IT administrators to design data storage, hardware, and networks in a way that guarantees the security of data and associated metadata. When IT is fully aware of where data is located and who has access to it at any given moment, the risks of compliance violations, data security vulnerabilities, and data leakage are thwarted.

Author: Gabriel Lando

FileCloud Unveils ‘Breach Intercept’ to Safeguard Organizations Against Ransomware

FileCloud, the cloud-agnostic EFSS platform, today announced FileCloud Breach Intercept. The newest version of FileCloud offers advanced ransomware protection to help customers handle every phase of a cyberattack: prevention, detection and recovery.

FileCloud is deployed across 90 countries and has more than 100 VARs and Managed Service Providers across the world. Deployed by Fortune 500 and Global 2000 firms, including the world’s leading law firms, government organizations, science and research organizations and world-class universities, FileCloud offers a set of unique features that help organizations build effective anti-ransomware strategies.

Global ransomware damage costs are expected to total more than $5 billion in 2017, compared to $325 million in 2015. Ransomware is growing at an estimated yearly rate of 350 percent, with business enterprises becoming the priority target for hackers. Enterprise File Sharing and Sync (EFSS) solutions have seen an increase in ransomware attacks, with 40 percent of spam emails containing links to ransomware. Whereas public cloud EFSS solutions such as Box and Dropbox offer centralized targets for ransomware attacks, FileCloud’s decentralized private cloud reduces your company’s exposure to potential attacks.

“Anyone with access to a computer is a potential threat, and the cloud their personal armory,” said Venkat Ramasamy, COO at FileCloud. “Why rummage through hundreds of houses when you can rob a bank? Hackers target centralized storage such as Dropbox or Box rather than self-hosted FileCloud solutions. The freedom to choose the cloud platform that best meets the unique dynamics of each business is our line in the sand of competitive differentiation.”

Breach Intercept

Cyberdefense via customization

The best defense against a phishing attack is to make sure your employees can differentiate genuine communication from malicious spoofing. Hackers can easily spoof email from public SaaS products, which have a standardized, easily falsifiable format. FileCloud offers unparalleled branding and customization tools, allowing you to set your own policies, and design your own emails and broadcast alerts. Customized emails and UX significantly reduce spoofing risk as hackers can’t run a mass spoofing unless they have an exact copy of an email from one of your employees.

Granular controlled folder access

With FileCloud Breach Intercept, you can set different levels of access between top-level folders and sub-folders. Administrators can set read/write/delete/share permissions for any user at any folder level, and permissions are not necessarily inherited according to folder structure, limiting propagation.

Real-time content / behavior heuristic engine

State-of-the-industry heuristic analysis works to detect threats in real time; suspicious content and user activity activate security protocols and prevent ransomware from taking hold. For example, if FileCloud detects a file posing as a Word document, the system halts the upload and sends an alert to the administrator, preventing propagation of an attack.

Unlimited versioning and backup to rollback

Unlimited versioning and server backup help companies recover from any data-loss incident, including ransomware. FileCloud can roll back not only employee files but also entire server files to any specific date and time before the attack.

FileCloud is available for immediate download from our customer portal. For more information or to download FileCloud Breach Intercept, please visit https://www.getfilecloud.com/ransomware-protection-for-enterprise-file-share-sync.

 

IT Admin Guide to NTFS File and Folder Permissions

New Technology File System (NTFS)

One of the most important and often misunderstood pieces of functionality in Microsoft Windows is the file and folder security permissions framework. These permissions not only control access to all files and folders in the NTFS file system, they also ensure the integrity of the operating system and prevent inadvertent and unauthorized changes by non-admin users as well as by malicious programs and applications.

So let’s begin at the very beginning. The NTFS file system can be considered a hierarchical tree structure, with the disk volume at the top level and each folder forming a branch off the tree. Each folder can contain any number of files, and these files can be considered leaf nodes, i.e. there can be no further branches off a leaf node. Folders are therefore referred to as containers, i.e. objects that can contain other objects.

So, how exactly is access to this hierarchy of objects controlled? That is what we will talk about next. When the NTFS file system was originally introduced in Windows NT, the security permissions framework had major shortcomings. It was revamped from Windows 2000 onwards and forms the basis of almost all the file permission security functionality present in modern Windows versions.

To begin, each object in the file hierarchy has a Security Descriptor associated with it. You can consider Security Descriptors an extended attribute of the file or folder. Note that Security Descriptors are not limited to files; they also apply to other OS-level objects like processes, threads, registry keys, etc.

At the basic level, a security descriptor contains a set of flags in the header, along with the Owner information and the Primary Group information, followed by two variable-length lists: a Discretionary Access Control List (DACL) and a System Access Control List (SACL).

Any file or folder always has an Owner associated with it, and no matter what, that Owner can always perform operations on it. The Primary Group exists only for compatibility with POSIX standards and can be ignored. The SACL specifies which users and groups get audited for which actions performed on the object. For the purposes of this discussion, let’s ignore that list as well.

The DACL is the most interesting section of any Security Descriptor. You can consider a DACL to define a list of users and groups that are allowed or denied access to the file or folder. To represent each user or group with the specific allowed or denied action, each DACL consists of one or more Access Control Entries (ACEs). An Access Control Entry specifies a user or group, the permissions being allowed or denied, and some additional attributes. Here’s an example of a simple DACL.
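A simplified model of a security descriptor and its DACL can be sketched as plain data structures (illustrative Python, not the actual Win32 structures; real NTFS rights are bitmasks, not string sets):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ACE:
    """One Access Control Entry: who, allow/deny, and which rights."""
    trustee: str             # user or group, e.g. "JOHN" or "GUESTS"
    access: str              # "allow" or "deny"
    rights: frozenset        # e.g. frozenset({"read", "write"})
    inherited: bool = False  # True if flowed down from a parent folder

@dataclass
class SecurityDescriptor:
    """Simplified descriptor: an owner plus a DACL (ordered list of ACEs)."""
    owner: str
    dacl: List[ACE] = field(default_factory=list)

# A simple DACL: deny write to the GUESTS group, allow JOHN read/write.
folder3 = SecurityDescriptor(
    owner="ADMIN",
    dacl=[
        ACE("GUESTS", "deny", frozenset({"write"})),
        ACE("JOHN", "allow", frozenset({"read", "write"})),
    ],
)
```

Note that the DACL is an ordered list: as discussed below, the position of deny entries relative to allow entries matters when access is evaluated.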

So far, if you have been following along, this seems pretty straightforward, and it mostly is; but the practical way this is applied to folders introduces complexity, especially if you are unclear how the permissions interact with each other.

Inheritance of Security Descriptors

If every object had its own unique copy of the Security Descriptor associated with it, things would be pretty simple, but impossible to manage in practice. Imagine a file system with thousands of folders used by hundreds of users. Trying to set the permissions on each and every folder individually would quickly break down. If you needed to add or modify the permissions on a set of folders, you would have to apply the change individually to each and every file and folder within them.

Thus was born the notion of inheritance. Not only is it possible to apply a permission (ACE) to a folder, it is also possible to indicate whether the permissions should “flow” to all child objects, so that all subfolders and files inside that folder “inherit” the same permissions. See below for an example:

Here, when Folder 3 has permissions set up, those permissions are by default inherited by its child objects, which include SubFolder1, SubFolder2 and so on. Note that this inheritance is automatic, i.e. if new objects are added to this section of the tree, they automatically pick up the inherited permissions.

The DACL of any subfolder item now looks like the following, assuming this was set as the permissions for Folder 3.

You can see inherited permissions in any security dialog as the grayed-out options. To edit these options, you have to traverse up the tree until you reach the object where the permissions are actually set (in this case Folder 3, where you can actually edit them). Note that if you ever edit the permissions on Folder 3, the new permissions automatically re-flow to the child objects without you having to set them one by one.

So if inheritance is such a cool thing, why would you ever want to disable it? That’s a good question, and it brings us to setting up folder permissions for a large organization. In many organizations with groups and departments, it is pretty common to organize folders by group and then grant permissions on those folders based on the groups the users belong to.

In most cases, this kind of simple organization works fine. However, there will sometimes be folders belonging to a group or department that absolutely need complete security and should only be accessed by a select handful of people. For example, consider SubFolder1 as a highly sensitive folder that should be fully locked down.

In this case, that subset of folders should be set up without inheritance.

Disabling inheritance at SubFolder1 changes a few things. Permission changes at parent folders like Folder 3 will never affect SubFolder1 under any conditions, and it is impossible to grant someone access to SubFolder1 by adding a user or group at Folder 3. This effectively isolates SubFolder1 into its own permission hierarchy, disconnected from the rest of the system, so IT admins can set up a small handful of specific permissions for SubFolder1 that apply to all of its contents.
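The effect of disabling inheritance can be sketched as an upward walk that stops at the disconnection point (a simplified Python model; the folder names and entry strings are illustrative):

```python
def effective_acl(folder, parent_of, acls, inheritance_disabled):
    """Collect a folder's own ACEs plus ACEs inherited from its ancestors,
    stopping the upward walk where inheritance has been disabled."""
    entries = list(acls.get(folder, []))
    node = folder
    while node not in inheritance_disabled and node in parent_of:
        node = parent_of[node]           # walk up to the parent folder
        entries += acls.get(node, [])    # nearer parents contribute first
    return entries

# Folder tree (child -> parent) and the permissions set at each level.
parent_of = {"SubFolder1": "Folder3", "Folder3": "Root"}
acls = {
    "Root": ["EVERYONE:read"],
    "Folder3": ["SALES:write"],
    "SubFolder1": ["EXEC:full"],
}
```

With inheritance enabled, SubFolder1 accumulates entries from Folder 3 and Root; once inheritance is disabled at SubFolder1, changes made at Folder 3 no longer reach it.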

Order of Permission Evaluation

Having understood Security Descriptors and inheritance (as well as when inheritance should be disabled), it is now time to look at how it all comes together. What happens when mutually exclusive permissions apply to a file or folder? How do the security permissions remain consistent in that case?

For example, consider an object (File 1) where, at a parent-level folder (Folder 3), JOHN is allowed to READ and WRITE, and these permissions are inherited by a child object in the hierarchy.

Now, if JOHN is not supposed to WRITE to this child item and you, as an IT admin, add DENY WRITE for JOHN on the File 1 item, how do these conflicting permissions make sense and get applied?

The rules are pretty simple in this case; the order of ACE evaluation is:
• Deny permission entries applied directly on the object
• Allow permission entries applied directly on the object
• Deny permission entries inherited from parent objects
• Allow permission entries inherited from parent objects

The Windows OS always evaluates permissions in this order, so any overrides placed directly and explicitly on the object are considered before any inherited permissions. The first entry that denies a requested permission for the user stops evaluation and access is denied; otherwise evaluation continues until all required permissions have been allowed, at which point it stops. Note that even among inherited entries, those from the nearest parent are evaluated before those from farther parents, i.e. the distance from the child to the parent matters in the evaluation of the permissions.
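That evaluation order can be sketched as follows (a simplified Python model: rights are shown as sets rather than the bitmasks Windows actually uses, and many special cases are omitted):

```python
def ace_order(ace):
    """Canonical sort key: direct entries before inherited ones, deny before
    allow, and inherited entries from nearer parents before farther ones."""
    deny_first = 0 if ace["access"] == "deny" else 1
    if not ace.get("inherited", False):
        return (0, deny_first)
    return (1, ace.get("depth", 1), deny_first)

def is_allowed(aces, user_groups, wanted):
    """The first matching deny stops evaluation; allows accumulate until all
    requested rights are granted, otherwise access is denied by default."""
    needed = set(wanted)
    for ace in sorted(aces, key=ace_order):
        if ace["trustee"] not in user_groups:
            continue
        if ace["access"] == "deny" and needed & set(ace["rights"]):
            return False
        if ace["access"] == "allow":
            needed -= set(ace["rights"])
            if not needed:
                return True
    return False
```

In the JOHN example above, the direct DENY WRITE on File 1 is evaluated before the READ/WRITE allow inherited from Folder 3, so JOHN can still read but can no longer write.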

Applying Folder Security inside the network

If you thought that setting up permissions on folders is all you need for a network share, you are mistaken: you also need to create a folder share and specify permissions for the share. The final permissions for a user are a combination of the permissions applied on the share and the security permissions applied to the folders; the most restrictive set of permissions always wins.

So it is always best practice to create the share with Everyone Full Control at the share level and let all the effective permissions be managed by the NTFS security permissions.
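The combination rule is simply an intersection, which is why granting Everyone Full Control on the share leaves the NTFS permissions as the sole gatekeeper (illustrative Python; the right names are examples, not the actual Windows access masks):

```python
def effective_rights(share_rights, ntfs_rights):
    """Effective rights over a network share are the most restrictive
    combination (set intersection) of share and NTFS permissions."""
    return set(share_rights) & set(ntfs_rights)

# With "Everyone Full Control" on the share, NTFS alone decides access.
full_control = {"read", "write", "delete", "change_permissions"}
```

A user granted read/write by NTFS but only read at the share level ends up read-only; the same user under a Full Control share keeps exactly the NTFS rights.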

Applying NTFS folder security outside the network

It is simple to provide network folder access over the LAN and apply these folder permissions efficiently. However, if you want to allow users to access these files outside the LAN, via the web browser, mobile apps, etc., and still enforce NTFS file and folder permissions, then consider using FileCloud (our enterprise file sharing and sync product), which can effortlessly enforce these permissions while still providing seamless access.

Try FileCloud for Free!

Types of Controls to Manage Your Business Data in an EFSS

EFSS Data Controls

In 2015, there were 38% more security incidents than in 2014, and the average cost per stolen record containing sensitive and confidential data was $154 (the healthcare industry paid the most, at $363 per record). Worse still, even though 52% of IT professionals felt that a successful cyber-attack against their network would take place within the year, only 29% of SMBs (fewer than in 2014) used standard tools like patching and configuration management to prevent these attacks.

The consequences of poor data security and data breaches in the cloud cannot be overstated. These statistics show that data insecurity and breaches in the cloud lead down a road no business wants to take, and they point to a lack of control over data in the cloud. So we will first look at who controls data in the cloud, and then at how to manage business data in an EFSS.

Who controls data in the Cloud?

It is clear that IT departments often do not know who controls data in the cloud, as revealed by participants in a Perspecsys survey on data control in the cloud. According to the results, 48% of IT professionals don’t trust that cloud providers will protect their data, and 57% are not certain where sensitive data is stored in the cloud.

This issue is also closely tied to data ownership. Once data ownership changes, we expect a change in the level of control users have over their data. To quote Dan Gray on the concept of data ownership: “Ownership is dependent on the nature of data, and where it was created”. Data created by a user before being uploaded to the cloud may be subject to copyright laws, while data created in the cloud changes the whole concept of data ownership. It is no wonder that there is confusion on this matter.

Despite challenges such as partial or no control over data stored in the cloud, there are techniques we can use to control business data in an EFSS, and consequently prevent unauthorized access and security breaches.

Types of data control for business data in an EFSS

 

Input validation controls

Validation control is important because it ensures that all data fed into a system or application is accurate, complete, and reasonable. One essential area of validation control is supplier assessment: is a supplier well equipped to meet a client’s expectations with regard to controls that ascertain data integrity, security, and compliance with industry regulations as well as client policies? This activity is best carried out as an offsite audit in the form of questionnaires. By determining the supplier’s system life-cycle processes, your team can decide whether the EFSS vendor is worthy of further consideration. Additionally, the questionnaire serves as a basis for deciding, based on the risk assessment, whether an on-site assessment will be carried out. If it is, the scope of the on-site audit will depend on the type of service the EFSS vendor provides.

Service level agreements should also be assessed and analyzed to define the expectations of both the EFSS vendor and the user. Usually, this is also done to ensure that the service rendered is in line with industry regulations. Additionally, ensure that the EFSS provider’s service level agreement covers the following:

  • Security
  • Backup and recovery
  • Incident management
  • Incident reporting
  • Testing
  • Quality of service rendered
  • Qualified personnel
  • Alert and escalation procedures
  • Clear documentation on data ownership as well as vendor and client responsibilities
  • Expectations with regards to performance monitoring

Processing controls

Processing control ensures that data is completely and accurately processed in an application, through regular monitoring of models and inspection of system results during processing. If this is not done, small changes in equipment caused by age or damage will produce a bad model, which will be reflected as wrong control moves for the process.
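One common form of processing control is reconciliation: comparing record counts and a content digest before and after a processing step that should pass records through unchanged. The sketch below is illustrative (the function names are assumptions, not from the article):

```python
import hashlib

# Illustrative processing control: reconcile record counts and an
# order-independent digest to catch silent data loss or corruption.

def digest(records: list[str]) -> str:
    """Order-independent SHA-256 digest of a batch of records."""
    h = hashlib.sha256()
    for r in sorted(records):
        h.update(r.encode("utf-8"))
    return h.hexdigest()

def reconcile(inputs: list[str], outputs: list[str]) -> bool:
    """True if the processing step preserved both record count and content."""
    return len(inputs) == len(outputs) and digest(inputs) == digest(outputs)
```

Running such a check after each batch turns "did the step mangle our data?" into an automated pass/fail signal.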

Backup and recovery controls

Backup and recovery controls ensure that copies of business data are kept and can be restored after accidental deletion, corruption, or an outage. Confirm the EFSS vendor’s backup frequency, retention periods, and recovery time objectives, and test restores regularly rather than assuming backups will work when they are needed.
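A backup-and-verify control can be sketched with a small in-memory model. This is an illustrative assumption, not any EFSS vendor's API; a real control would back up to separate storage, but the checksum-on-backup and verify-before-trust pattern is the same:

```python
import hashlib

# Hedged in-memory sketch of a backup-and-verify control.

class BackupStore:
    def __init__(self):
        self._backups: dict[str, bytes] = {}

    def backup(self, name: str, data: bytes) -> str:
        """Store a copy and return its checksum for the audit trail."""
        self._backups[name] = bytes(data)
        return hashlib.sha256(data).hexdigest()

    def verify(self, name: str, current: bytes) -> bool:
        """True if the current data still matches the last backup."""
        return self._backups.get(name) == current

    def restore(self, name: str) -> bytes:
        """Recover the backed-up copy."""
        return self._backups[name]
```

Recording the checksum at backup time gives you evidence later that a restore returned exactly what was stored.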

Identity and access management

Usually, Identity and Access Management (IAM) allows cloud administrators to authorize which personnel can take action on specific resources, giving cloud users the control and visibility required to manage cloud resources. Although this sounds simple, advances in technology have complicated the processes of authentication, authorization, and access control in the cloud.

In previous years, IAM was easier to handle because employees had to log into a single desktop computer in the office to access any information on the internet. Today, Microsoft’s Active Directory and the Lightweight Directory Access Protocol (LDAP) are insufficient IAM tools on their own. User access and control has to extend from desktop computers to personal mobile devices, posing a challenge to IT. For example, a Forrester Research report states that personal tablets and mobile devices are used in 65% of organizations, and that 30% of employees provision their own software on these devices for use at work without IT’s approval. It is no wonder that Gartner predicted in 2013 that cloud-based Identity and Access Management would become one of the most sought-after services in cloud-based models within just two years.

With this understanding, it is important to implement effective IAM without losing control of internally provisioned resources and applications. With threat-aware identity and access management capabilities, it should be clear who is doing what, what their role is, and what they are trying to access. Additionally, user identities, including external identities, must be tied to back-end directories, and single sign-on should be used, because juggling multiple passwords tends to lead to insecure password management practices.
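The "who is doing what, what is their role, what are they trying to access" question maps naturally onto role-based access control. The sketch below is a minimal illustration under assumed role and permission names; in practice the user-to-role mapping would come from a back-end directory such as Active Directory or LDAP, as discussed above:

```python
# Minimal RBAC sketch: identity -> role -> (resource, action) permissions.
# Role names, users, and permissions are illustrative assumptions.

ROLE_PERMISSIONS = {
    "admin":  {("files", "read"), ("files", "write"), ("users", "manage")},
    "editor": {("files", "read"), ("files", "write")},
    "viewer": {("files", "read")},
}

USER_ROLES = {"alice": "admin", "bob": "viewer"}  # backed by a directory in practice

def is_authorized(user: str, resource: str, action: str) -> bool:
    """Answer 'who is doing what, and may they?' for a single request."""
    role = USER_ROLES.get(user)
    if role is None:            # unknown identity: deny by default
        return False
    return (resource, action) in ROLE_PERMISSIONS.get(role, set())
```

Denying unknown identities by default is the important design choice here: access is granted only by explicit role assignment, never implicitly.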

Conclusion

A simple assurance from an EFSS vendor that you control your business data in the cloud is not enough. Certain techniques should be employed to make sure you retain a significant level of data control. As discussed, ensure that you have an effective identity and access management system, processing and validation controls, and business data backup and recovery options in place. Other important controls that we have not discussed include file controls, data destruction controls, and change management controls.

Author: Davis Porter

Image courtesy: jannoon028, freedigitalphotos.net