Archive for the ‘data governance’ Category

Content Services Platform vs ECM – Concepts, Overview and Capabilities

Gartner, the leading research and advisory firm, replaced the term “Enterprise Content Management (ECM)” with “Content Services Platform (CSP)” in its widely read research report “Reinventing ECM: Introducing Content Services Platforms and Applications,” published in December 2016. Gartner believes the term “Enterprise Content Management” no longer reflects how organizations actually use content in their businesses and encourages them to rethink their content strategy.

What are the reasons for this change – ECM to Content Services Platform?

Traditional ECM systems have not lived up to their original promise of bringing all enterprise content into one repository. The utopian idea of a single repository for all enterprise data has not materialized, and it is unlikely to in the future. In fact, the number of data silos in enterprises has increased with the advent of new SaaS productivity apps. Traditional ECM systems from Alfresco, OpenText and Documentum have, to some extent, fulfilled the goals of compliance and control. But they have failed to provide the user experience end users want (access from any device, anywhere), and they fall short on newer needs such as enterprise file sharing and sync, group communication and team collaboration.

What is Content Services Platform?

Gartner analysts define a Content Services Platform as “a set of services and microservices, embodied either as an integrated product suite or as separate applications that share common APIs and repositories, to exploit diverse content types and to serve multiple constituencies and numerous use cases across an organization.”

If you want a simpler definition, Content Services Platforms are essentially “API-centric, cloud- and device-agnostic, next-generation enterprise content management systems that support multiple repositories, endpoints, content types and business use cases to serve multiple stakeholders across an organization.”

Products like FileCloud, M-Files, Box and Hyland OnBase fulfill the Gartner definition and can be called Content Services Platforms.

Core Features of Content Services Platform

1. While traditional ECM systems support a single repository, Content Services Platforms support external content repositories in addition to their primary repository. For instance, see the FileCloud architecture below.

In addition to its primary repository (Managed Storage), FileCloud supports external repositories (network shares, AWS S3 and Azure Blob Storage). By contrast, a traditional ECM architecture, shown below, supports only a single, primary repository.

2. Content Services Platforms are API-centric. All clients use common APIs to access content from the repositories. For instance, all FileCloud clients (Sync, Drive, Web, Outlook Add-in, mobile apps) use the same REST APIs to access content (see the sketch after this list).

3. Compared to traditional ECM architectures, Content Services Platforms offer intuitive user interfaces and an excellent user experience that appeal to business users. In addition, they provide a flexible architecture rather than a monolithic one.

4. Content Services Platforms offer multiple endpoints for accessing the content they manage. For instance, FileCloud offers multiple clients (Drive, Sync, Web, mobile apps, browser add-ons, Salesforce integration, Outlook add-in and so on) to access content.

5. Content Services Platforms integrate with popular line-of-business applications such as Salesforce and SAP.

6. An ideal Content Services Platform is cloud agnostic and supports public, private and hybrid cloud storage. For example, FileCloud can be deployed on-premises or on public cloud infrastructure, and is also available as SaaS.

7. Content Services Platforms support content governance to help organizations comply with regulatory and internal mandates.

8. Content Services Platforms offer powerful data leak prevention capabilities to secure and manage enterprise content, along with folder- and subfolder-level permissions for granular access control.

9. Content Services Platforms offer flexible metadata management and enable automatic classification of content to keep it organized and secure.

10. Content Services Platforms provide an array of content management capabilities, including versioning, document preview, annotation and editing.
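Since a CSP is API-centric (point 2) and metadata-aware (point 9), every client operation ultimately maps to a handful of REST calls. The Python sketch below illustrates that idea; the endpoint paths, parameter names, response fields and token handling are hypothetical placeholders, not FileCloud's documented API.

```python
# A minimal sketch of API-centric access to a content services platform.
# The base URL, endpoint paths, and response fields below are hypothetical
# placeholders, not FileCloud's actual REST API.
import requests

BASE_URL = "https://csp.example.com/api"   # hypothetical CSP endpoint
TOKEN = "replace-with-a-real-api-token"

def upload_and_tag(local_path: str, remote_folder: str, tags: dict) -> None:
    headers = {"Authorization": f"Bearer {TOKEN}"}

    # 1. Upload the file through the same REST API every client would use.
    with open(local_path, "rb") as fh:
        resp = requests.post(
            f"{BASE_URL}/files",
            headers=headers,
            files={"file": fh},
            data={"folder": remote_folder},
        )
    resp.raise_for_status()
    file_id = resp.json()["id"]   # hypothetical response field

    # 2. Attach metadata so classification and search can act on it later.
    resp = requests.patch(
        f"{BASE_URL}/files/{file_id}/metadata",
        headers=headers,
        json=tags,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    upload_and_tag("contract.pdf", "/legal", {"classification": "Confidential"})
```

A sync client, a mobile app or an Outlook add-in would all hit the same endpoints, which is what makes a CSP straightforward to extend with new endpoints and integrations.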

New FileCloud 19.2 Release – Smart DLP, Content Classification, Azure Blob Storage Support and More

The data privacy regulatory landscape is changing fast. More and more countries are enacting their own data privacy regulations similar to the GDPR. Organizations face a new reality in which they must comply with an array of data security and privacy regulations (GDPR, CCPA, the UK’s Data Protection Act 2018 and the NZ Privacy Act) in a short period of time. With this evolving landscape in mind, we are bringing major new FileCloud capabilities such as Smart DLP, Smart Classification and SIEM integration to make content compliance and governance easy.

Smart DLP

Our simple, flexible, rule-driven Smart DLP system prevents data leaks by end users and can save enterprises from huge compliance fines.

Smart DLP Rules
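To illustrate what "rule-driven" means in practice, here is a minimal, generic sketch of a DLP policy check in Python. The rule shape and group names are illustrative assumptions, not FileCloud's actual Smart DLP rule syntax or evaluation engine.

```python
# A generic illustration of rule-driven DLP evaluation; the rule shape and
# group names are illustrative only, not FileCloud's Smart DLP syntax.
from dataclasses import dataclass

@dataclass
class ShareEvent:
    user_group: str
    file_classification: str
    target_is_external: bool

def allow_share(event: ShareEvent) -> bool:
    """Deny external sharing of files classified as PII unless the user
    belongs to a group that is explicitly permitted to do so."""
    if event.target_is_external and event.file_classification == "PII":
        return event.user_group in {"compliance-officers"}
    return True

print(allow_share(ShareEvent("engineering", "PII", target_is_external=True)))     # False
print(allow_share(ShareEvent("engineering", "Public", target_is_external=True)))  # True
```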

Smart Classification

Our Smart Classification engine automates PII/PHI/PCI discovery, quickly finding personally identifiable information (PII), protected health information (PHI), payment card information (PCI) and other sensitive content.

Content Classification Rules
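Conceptually, a classification engine scans content against detection patterns and tags what it finds. The toy Python sketch below shows the idea with simple regular expressions; production classifiers (including FileCloud's) rely on far more robust detection, such as checksum validation and contextual scoring.

```python
# A simplified sketch of pattern-based PII/PCI discovery. Real classifiers
# use stronger detection; this only shows the basic idea.
import re

PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text: str) -> set:
    """Return the set of sensitive-data categories detected in the text."""
    return {name for name, pattern in PATTERNS.items() if pattern.search(text)}

sample = "Contact jane.doe@example.com, SSN 123-45-6789, card 4111 1111 1111 1111"
print(classify(sample))   # {'ssn', 'credit_card', 'email'}
```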

SIEM Integration

FileCloud now integrates with enterprise Security Information and Event Management (SIEM) tools. This new capability allows system administrators to monitor FileCloud alerts and audit events (What, When, Who and How) in one central place for ease of security management and complete protection.
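As a rough illustration of what shipping audit events to a SIEM looks like, the Python sketch below forwards a structured Who/What/When/How record over syslog. The collector address and field names are placeholders; FileCloud's built-in SIEM integration is configured by the administrator rather than through custom code like this.

```python
# A hedged sketch of forwarding an audit event to a SIEM collector over syslog.
# The collector host and record fields are placeholder assumptions.
import json
import logging
import logging.handlers
from datetime import datetime, timezone

siem = logging.getLogger("audit")
siem.setLevel(logging.INFO)
# Replace "localhost" with the SIEM collector host in a real deployment.
siem.addHandler(logging.handlers.SysLogHandler(address=("localhost", 514)))

def emit_audit_event(who: str, what: str, how: str) -> None:
    """Send a structured Who/What/When/How record to the SIEM collector."""
    record = {
        "who": who,
        "what": what,
        "when": datetime.now(timezone.utc).isoformat(),
        "how": how,
    }
    siem.info(json.dumps(record))

emit_audit_event(who="jdoe", what="downloaded /finance/q3.xlsx", how="web client")
```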

Azure Blob Storage Support

Our vision is to make FileCloud the most powerful cloud-agnostic enterprise file services platform, one that lets organizations access, share, sync, search and govern their data. This release brings Azure Blob Storage support to FileCloud, making Azure a first-class citizen on our platform and marking a significant milestone toward that vision. With this integration, FileCloud lets organizations access, share, sync, search and govern content stored in Azure. It will greatly benefit organizations in the Azure ecosystem and companies that rely on Microsoft products and services.

In addition to the major features, the release includes hundreds of product improvements and security fixes. You can find complete release notes here.

Customer-centricity is one of our core values, and our product roadmap is driven by your requests, so please keep your feedback coming. We’re very proud of this update and hope it serves you well.

How to Successfully Deploy a Data Compliance Solution

According to Gemalto’s 2018 Data Security Confidence Index, 65 percent of companies hold more data than they can handle. Even more concerning is the fact that over 54 percent don’t know where all the sensitive data is stored, while 68 percent have no idea what must be done to maintain GDPR compliance.

That’s the thing about data. In the words of a certain web-slinging superhero, “With great power comes great responsibility.” So, the more data a company has, the more responsibilities it will have in terms of storage, sharing, protection, and usage. And in the wake of the Facebook-Cambridge Analytica data scandal, it is clear that companies will suffer severe reputational damage if they fail to protect confidential information.

Apart from that, a company’s unethical or careless actions will draw severe financial penalties.

A comprehensive data loss prevention strategy looks at containing leaks caused by insider threats, extrusion by attackers and unintentional or negligent data exposure.

In the past few years, the number and complexity of regulations businesses need to comply with have increased considerably as authorities try to regain control over vast amounts of data stored in the cloud and on servers worldwide.

All these factors make data compliance a necessity for companies everywhere. But deploying a compliance solution is more complex than you might think. GDPR, HIPAA, PCI DSS and other regulations have compliance professionals scrambling to satisfy different laws at once. The trick is to streamline your compliance efforts so you can avoid the fines.

GDPR

The General Data Protection Regulation (GDPR) came into effect in 2018 across the European Union. It lays out rules concerning an individual’s right to know what data companies hold about them, how that data should be processed, and stricter requirements for reporting breaches.

The thing is, the GDPR didn’t just affect businesses based in Europe. Cisco’s 2019 Data Privacy Benchmark Study surveyed more than 3200 security professionals in 18 countries across different industries, and 97 percent of respondents claimed GDPR applied to their firms. If a company has dealings with any individual under the EU’s jurisdiction, they must abide by the provisions laid down by this new regulation.

Even though there are plenty of rules within the GDPR, most of them revolve around three major principles – reducing the amount of data held, acquiring consent, and ensuring a data subject’s rights.

What Does Deployment of GDPR Compliance Involve?

While it might seem like a huge leap, the first step for any business to ensure GDPR compliance is to assign someone who will oversee the company’s activities. This person is the data protection officer. Certain organizations dealing with huge volumes of data have already made this role mandatory in their structure.

Role of Data Protection Officer:

  • Data protection officers oversee data protection strategies and implementation for compliance with GDPR guidelines.
  • They must document why people’s information is collected and processed, the timeline and descriptions of the data held, and information on technical security measures.
  • They report to senior staff members and are a point of contact for customers and employees.


Process of Deployment

Start by centralizing your GDPR compliance effort. Quickly implement the new standards required for GDPR compliance, such as data protection impact assessments (Article 35), breach response (Article 33) and requests for erasure. Monitor the company’s hierarchies of personal information, and maintain the necessary controls and records of processing activities (Article 30). Develop a holistic asset inventory and provide a central landing page where customers can submit individual rights requests.

You must now integrate GDPR with existing processes by tailoring GDPR requirements so they align with the needs of the organization. Implement a robust graph database and design approval workflows. You also need to set automated due-date triggers so authorities are alerted within the stipulated 72 hours of discovering a data breach, as in the sketch below.
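As a small illustration of such a trigger, the sketch below computes the Article 33 notification deadline from the moment a breach is detected. It is a minimal example only; a real workflow system would attach reminders, escalations and evidence collection to this deadline.

```python
# Minimal sketch of a breach-notification deadline calculation. GDPR Article 33
# requires notifying the supervisory authority "without undue delay and, where
# feasible, not later than 72 hours" after becoming aware of a breach.
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(breach_detected_at: datetime) -> datetime:
    return breach_detected_at + NOTIFICATION_WINDOW

detected = datetime(2019, 11, 4, 9, 30, tzinfo=timezone.utc)
deadline = notification_deadline(detected)
print(f"Authorities must be notified by {deadline.isoformat()}")
```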

HIPAA

Out of the 3,003 healthcare institutions surveyed in a study posted on medRxiv, more than half failed to comply with the Health Insurance Portability and Accountability Act (HIPAA) right of access. The study shows that most patient requests take numerous referrals or attempts before supervisors release the records.

This 1996 law governs how American organizations handle individuals’ medical and healthcare records in a confidential and safe manner. Because of the sensitive nature of these records, organizations face hefty penalties if they fail to safeguard the data. Insurance provider Anthem, for example, paid $16 million in fines in 2018 after the health information of nearly 79 million individuals was exposed in a breach.

What Does Deployment of HIPAA Compliance Involve?

Under HIPAA guidelines, access to electronic health data must be limited to those with valid reasons for viewing it, so strong access controls and encryption are necessary. The standards apply both to records sitting in a database and to records being shared, which makes it important to fully monitor, protect and control file transfers and emails.

HIPAA also requires complete audit trails that detail each interaction with the data. Healthcare institutions should therefore equip IT staff with event log management software to help meet these requirements. Not only does such software maintain full records each time a file is changed or accessed, it also alerts organizations to potential security breaches the moment they happen (see the sketch below).
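The sketch below shows the bare minimum such an audit trail records for each interaction, assuming a local JSON-lines log file; dedicated event log management products add tamper protection, retention controls and real-time alerting on top of this.

```python
# A minimal sketch of an append-only audit trail for record access.
# The file name and field layout are illustrative assumptions.
import json
from datetime import datetime, timezone

AUDIT_LOG = "phi_access_audit.jsonl"

def log_access(user: str, patient_id: str, action: str) -> None:
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "patient_id": patient_id,
        "action": action,
    }
    # Append one JSON record per access so every interaction is traceable.
    with open(AUDIT_LOG, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(entry) + "\n")

log_access(user="nurse_kim", patient_id="P-10293", action="viewed chart")
```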

PCI DSS

Every business dealing with the financial data of customers is aware of the Payment Card Industry Data Security Standard (PCI DSS). This is integral to a financial institution’s compliance method since it establishes the guidelines on how firms must protect and handle cardholder data, like credit card numbers.

Now, no government body mandates PCI DSS, but it is widely accepted across industries, and non-compliant companies may have to shell out heavy fines. In fact, their relationships with payment processors or banks may be terminated. Even companies that use third-party services to process card payments remain responsible for the security of the debit or credit card data they gather, store or transmit.

What Does Deployment of PCI DSS Compliance Involve?

The precise steps a firm must follow depend on the number of transactions it processes; companies with larger customer bases must meet stricter requirements. In every case, PCI DSS requires businesses to maintain stringent security standards.

Thankfully, the Payment Card Industry Security Standards Council details what companies should do to achieve compliance. Its twelve requirements range from installing and maintaining a firewall to protect cardholder data (requirement 1) to regularly testing security systems and processes (requirement 11). Companies must devise a plan to meet these standards.

Concluding Remarks

With the increasing complexity of business processes and regulations, any mishandling of customer data or files increases the likelihood of regulatory penalties. Not only does this threaten the reputation of a business but it also incurs severe financial penalties. Compliance is necessary for companies to establish policies that meet industry expectations.

Sources:

https://www.goanywhere.com/resource-center/compliance

https://www.syncplicity.com/en/products/data-protection

Demystifying the Complexities of Data Ownership

Enterprises are undergoing a digital transformation as they continue to explore new opportunities offered by connected technologies. They are also becoming increasingly reliant on data and driven by data gathering and analytics. The risks to sensitive data are expanding, raising multiple questions about data rights and privacy that have to be unraveled. That data is driving innovation is undeniable: innovators require very large quantities of data from a broad array of sources to push the envelope on emerging technologies like machine learning and AI.

In this digital age, data is collected ubiquitously. Personal data is collected each time you interact online or use a mobile device, through IoT devices in vehicles and homes, and by the various public and private sector services we use day to day. Because of this, data ownership can no longer be considered a niche issue, and enterprises are realizing that it is gaining strategic importance.

What is Data Ownership?

Data ownership boils down to how the data was created and who created it. However, arriving at a precise definition of data ownership is not straightforward, and the term itself can be misleading. The difficulty is rooted in the basic concept of ‘ownership’, which can be construed as having legal title and full property rights to something. By that definition, data ownership would mean having legal title to one or more specific articles of data. In reality, while the nominal ‘owner’ of the data is responsible for the entire domain, it is typically other people who ensure that all the details are accurate and up to date.

Who Actually Owns the Data?

Is it the physical individual associated with the personal data, or is it the organization that has invested money and time in collecting, storing, processing and analyzing that data? In an enterprise setting, the term ‘ownership’ generally assigns a level of accountability and responsibility for specific datasets. In this context, ‘ownership’ bears no legal connotation but refers to other notions such as assurance of data security and data quality. From a more legal standpoint, ownership-like rights are currently limited to trade secrets and intellectual property rights, and none of them provide adequate protection of (ownership in) data.

Most legal professionals are of the opinion that data subjects should keep ownership of the raw data they provide, while the data processor retains ownership of ‘constructed data’ – data obtained by manipulating the original data that cannot be reverse-engineered to recover the raw data. The properties of data itself make ownership an arduous proposition. Knowing this, regulators have instead chosen to enact restrictions on the use of data rather than labeling data as an individual’s property.

How Do Technologies Like Machine Learning Affect Data Ownership?

From voice assistants like Alexa and Siri to self-driving cars, it’s no secret that artificial intelligence has come into its own in the last few years – largely due to big data and the advances in computing required to process information and train machine learning systems. But even as we marvel at these data-driven technological advances, we cannot fail to consider how data ownership affects both privacy and machine learning initiatives.

Data ownership is slowly being solidified by the expansion of data democratization. Paradoxically, the democratization of data and the continuous iteration of machine learning applications also muddle the concept of data ownership. Enterprises derive invaluable insights from machine-learning-driven models that use consumer data, and from a data-ownership perspective the trouble stems from exactly the same place as the opportunity.

Creators of machine learning technologies should therefore resolve to integrate organizational and technical measures that build data protection principles into the design of AI-based tools. Additionally, when it comes to data ownership and artificial intelligence, legal practitioners should remain alert to the reality that proprietary rights in certain aspects of data may exist. In the context of AI, proprietary rights may not protect the data itself, but they may protect its compilation, which can include database rights and copyrights in the ‘products’ of AI.

Data Governance Within the Enterprise

Data governance refers to the overall management of the integrity, usability, security and availability of the data used in the enterprise. The unparalleled rise in the sources and volume of data has made stronger data management practices a necessity. Quality, governed data is crucial to effective and timely decision making, and it is essential for guaranteeing legal, regulatory and financial compliance. The first step towards establishing a successful data governance process is clearly defining data for enterprise-level integration. This lays the groundwork for a complete audit trail of who did what to which data, making it simpler for the organization to trace if and where something went wrong. Data stewards should be appointed as part of the governance process to oversee the entire framework.

Data privacy regulations like the CCPA and GDPR have increased the need for enterprise-wide regulatory compliance. A well-developed data governance framework facilitates several aspects of regulatory compliance, empowering businesses to readily classify data and perform process mapping and risk analysis.

Author: Gabriel Lando

The Evolution of Data Protection

Data has penetrated every facet of our lives. It has evolved from a routine procedural byproduct into an intrinsic component of modern society. This transformation has placed an expectation of responsibility on data processors, data subjects and data controllers, who must respect the underlying values of data protection law. As privacy rights continually evolve, regulators face the challenge of identifying how best to protect data in the future. While data protection and privacy are closely interconnected, there are distinct differences between the two. In short, data protection is about securing data from unauthorized access, while data privacy is about authorized access – who defines it and who has it. Essentially, data protection is a technical issue, whereas data privacy is a legal one. For industries required to meet compliance standards, privacy laws carry indispensable legal implications, and guaranteeing data protection alone may not satisfy every stipulated compliance standard.

Data protection law has undergone its own evolution. Instituted in the 1960s and 70s in response to the rising use of computing, and re-enlivened in the 90s to handle the trade in personal information, data protection is becoming ever more complex. Today, the influence and importance of information privacy to society cannot be overstated. New challenges constantly emerge in the form of new business models, technologies, services and systems that increasingly rely on ‘big data’, analytics, AI and profiling, and the environments and spaces we occupy and pass through generate and collect data.

Technology enthusiasts have been adopting new data management techniques such as ETL (Extract, Transform, and Load). ETL is a data warehousing process that uses batch processing and helps business users analyze data relevant to their business objectives. There are many ETL tools that manage large volumes of data from multiple sources, handle migration between databases, and load data to and from data marts and data warehouses. ETL tools can also be used to convert (transform) large databases from one format or type to another.
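A toy end-to-end ETL run might look like the Python sketch below, assuming a CSV source file and a SQLite database standing in for the warehouse; real ETL tools add scheduling, incremental loads and error handling.

```python
# A toy extract-transform-load run. The source file "orders.csv" and the
# column names are illustrative assumptions.
import csv
import sqlite3

def extract(path: str) -> list:
    with open(path, newline="", encoding="utf-8") as fh:
        return list(csv.DictReader(fh))

def transform(rows: list) -> list:
    # Normalize field names and types before loading.
    return [(r["order_id"], r["customer"].strip().title(), float(r["amount"]))
            for r in rows]

def load(rows: list, db_path: str = "warehouse.db") -> None:
    with sqlite3.connect(db_path) as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS orders "
                     "(order_id TEXT, customer TEXT, amount REAL)")
        conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

load(transform(extract("orders.csv")))
```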

The Limitations of Traditional DLP

Legacy DLP solutions offer limited value. Most traditional DLP implementations consist mainly of network appliances designed to inspect gateway egress and ingress points. The corporate network has evolved; the perimeter has largely dissolved, leaving network-only solutions full of gaps. Couple that with the rise of the cloud and the reality that most threats originate at the endpoint, and you understand why traditional, network-appliance-only DLP is limited in its effectiveness.

DLP solutions are useful for identifying properly defined content but usually fall short when an administrator tries to identify other sensitive data, such as intellectual property containing schematics, formulas or graphic components. As traditional DLP vendors stay focused on compliance and controlling the insider, progressive DLP solutions are evolving their technologies, both on the endpoint and within the network, to enable a complete understanding of the threats that target data.

Data protection criteria have to evolve to include a focus on understanding threats irrespective of their source. Demand for data protection within the enterprise is rising, as is the variety of threats taxing today’s IT security admins. This shift demands advanced analytics and enhanced visibility to conclusively identify the threat and deliver versatile controls to respond appropriately, based on business processes and risk tolerance.

Factors Driving the Evolution of Data Protection

Current data protection frameworks have their limitations, and new regulatory policies may have to be developed to address emerging data-intensive systems. Protecting privacy in this modern era is crucial to good and effective democratic governance. Some of the factors driving this shift in attitude include:

Regulatory compliance: Organizations are subject to compliance standards imposed by governments. These standards typically specify how businesses should secure personally identifiable information (PII) and other sensitive information.

Intellectual property: Modern enterprises typically hold intangible assets, trade secrets and other proprietary information such as business strategies and customer lists. Losing this type of data can be acutely damaging, so DLP solutions should be capable of identifying and safeguarding these critical information assets.

Data visibility: To secure sensitive data, organizations must first know that it exists, where it resides, who is using it and for what purposes.

Data Protection in The Modern Enterprise

As technology continues to evolve and IoT devices become more prevalent, new privacy regulations are being ratified to protect us. In the modern enterprise you need to keep your data protected, you have to be compliant, and you have to constantly worry about a myriad of risks: malicious attacks, accidental data leakage, BYOD and much more. Data protection has become essential to the success of the enterprise, and Privacy by Design – incorporating data privacy and protection into every IT initiative and project – has become the norm.

The potential risks to sensitive corporate data can be as narrow as the failure of a few sectors on a disk drive or as broad as the failure of an entire data center. When building data protection into an IT project, an organization has multiple considerations beyond selecting a backup and recovery solution. It’s not enough to ‘just’ protect your data; you also have to choose the best way to secure it. The best way to accomplish this in a modern enterprise is to find a solution that delivers intelligent, person-centric, fine-grained, data-centric protection in an economical and rapidly recoverable way.

Author: Gabriel Lando

Benefits of Centralized Master Data Management

Centralized data management is the preferred choice of most organizations today. With this model, all the important files and even apps are stored on a central computer. Workers can access the data and resources on the central computer through a network connection, virtual desktop infrastructure (VDI) or desktop as a service (DaaS). Many organizations including banks, financial firms, hospitals, and even schools prefer central data management.

If your organization is not using a centralized data management model, you may want to consider it as it gives the company more control and makes things more ordered for workers. In this blog post, we’ll be looking at some of the key benefits of centralized data management.

Security

Security is a top priority for every organization, and it is one of the top benefits of central data storage. In the age of data breaches, we cannot overlook the importance of proper security. You do not want sensitive files to fall into the wrong hands.

When data is spread across your employees’ different devices, it is difficult to implement proper security measures on all of them. Even if you lay down security guidelines, there are no guarantees that they’ll be followed. Central data management, by contrast, lets you take matters into your own hands and provide thorough security for your files. You can determine who gets access to them and the level of access each person has. Centralized data management is one of the best ways to provide strong security in any organization.

Easier Data Recovery

Sometimes, despite our best efforts, we are faced with data loss. It could be because a device is hit with a virus, software becomes corrupt or hardware malfunctions. Whatever the case, data recovery is an option you’ll most likely turn to; you need the right files to serve clients and keep your organization running smoothly.

Data recovery is much easier when you have central data storage. Instead of your IT staff having to go through several devices to recover files and attempt to assemble them, they can focus on one system. This not only makes recovering data less complicated but also more orderly. When you think about it, the likelihood of having to resort to data recovery is also lower when you opt for central data management, because you can implement the best security protocols and maintain the hardware of your central system to minimize malware attacks and hardware or software failures.

Data Integrity

Another benefit of central data management is data integrity. Data integrity refers to the consistency and accuracy of your data. When your files are spread across different devices used by your employees, there is a higher probability of conflicting versions of the same file. For example, if two people work on different aspects of the same document at the same time, they will ultimately produce two different versions, and someone will have to spend precious time reconciling them. You also run the risk of having redundant files on different devices.

Central data management eliminates all of this. When all your files are stored on a master computer, it is easier to spot redundancies as well as conflicting versions of the same document. Central data management ensures that your files are accurate and updated. This makes it easier to access specific information and makes your employees more efficient.

Smooth Collaboration

Collaboration is essential in every organization. It invariably happens that several people have to pitch in to get a job done. With central data management, collaboration is smoother. Instead of having to go around the office or send emails asking other people for particular files, your employees can just log in to the central database and access them. No need to wait because the person who has a specific file is out of the office. This does not only save time but speeds up collaboration.

Central data management also eases the decision-making process and makes it faster. This is because the decision makers can quickly access all the data they need to come to a conclusion. Having a centralized data management system also allows the people at the top tier of the organization to keep track of the activities of everyone and ensure that things are progressing as they should.

Cuts Costs

Although it may not seem obvious, central data management allows organizations to cut costs in different ways. First, your IT staff will have less to deal with if they only have to maintain the central system in your data center, which means fewer working hours. Also, your organization doesn’t have to cover the cost of purchasing devices for every employee; you can support the bring-your-own-device (BYOD) trend in your workplace. Additionally, you will spend less on power and general maintenance.

How FileCloud Supports Central Data Management

If you are implementing central data storage in your workplace, FileCloud can help make your work smoother and more efficient. We provide a range of tools to support collaboration, security, data loss prevention and more. Let’s look at some of the things your organization will enjoy by signing up to FileCloud.

A. You can restrict access to certain files and determine who has access to them. You can also view an activity stream that shows who accessed particular files and what they did. FileCloud allows conversations around files, so you can let your workers know what to do and they can communicate with one another to decide where to pick up the work. What’s more, you can opt to receive smart notifications when a file is changed.

B. FileCloud allows you to manage the files in your central storage. This includes adding metadata tags to files. You can choose to search for files using the metadata or any text in the document.

C. FileCloud also helps you prevent data loss by restoring deleted files and backing up data. You can also remotely wipe data and block access from compromised devices. FileCloud provides automatic file versioning, so if different workers in your organization save new versions of a file at the same time, the app creates separate versions automatically to prevent data loss.

This is just the tip of the iceberg of what you get by signing up for FileCloud. We provide everything your organization needs to make the most of a centralized data storage system, and our prices are very affordable. What more can you ask for?

The Changing Face of Data Governance

In our age of data-driven decision making, the new GDPR laws have once again brought the criticality of data governance to the forefront. Believed to be one of the most extensive revisions to the European data protection and privacy legislation, GDPR and its associated changes have presented businesses with the unique opportunity to organize their data houses.

So, executives should consult with experts familiar with GDPR on its impact on their operations. Businesses need to get used to the idea of handing over control of the data they share with people; only then can they achieve GDPR compliance and establish a better rapport with customers. But how does data governance figure into all this? Find out below:


Shortcomings in Traditional Data Governance

There’s nothing wrong with traditional data governance; in fact, it offers a rigorous, strategic framework for outlining roles, data standards and responsibilities, along with procedures and policies for data management throughout the organization. What’s more, without traditional data governance, businesses wouldn’t have been able to increase their efficiency and productivity in the use of core business data resources in transactional and data warehousing environments.

The focus of these methods was on data quality, trust, and protection, and they were great for recognized data sources that had known value. However, the modern industry is full of unstructured or unknown data sources like IoT and big data, and traditional data governance just can’t keep up. With the added features of machine learning and artificial intelligence, the shortcomings of the conventional approach are becoming obvious.

Owing to their rigid structure, conventional data governance procedures and policies hinder the possibilities formed by advanced analytics and data technologies by forcing them to fit the age-old mould for legacy infrastructure and data platforms.

Impact of Emerging Technologies

IoT gives thousands of unrelated data sources a chance to connect on the same platform. IoT gadgets are more than just data sources; they are data generators and gatherers. Sensors, wearable devices and other modern computing technology can accumulate data by the millisecond and stream it to a cloud of possible consumers.

Artificial intelligence and machine learning systems analyze the data in real-time to identify relationships and patterns, gain knowledge, and plan a suitable course of action. While these are data-based autonomous actions rather than explicit instruction or programming, they possess the power to find gaps or extra data requirements and send requests back to the IoT gadgets for collecting or generating fresh data.

Traditional data governance makes the onboarding of IoT devices very difficult because of conventional authorization and validation requirements. To foster machine learning and artificial intelligence in these early stages, the data lifecycle must tolerate non-conformity with predefined standards and rules. Governance must therefore allow new data to be incorporated quickly and efficiently, and offer mechanisms to mitigate risks, maximize value and encourage exploration.

AI and IoT under the New Data Governance Methods

Concepts like IoT and AI aren’t new, but they remain highly competitive markets for businesses. As the two expand, they hypercharge the growing volume of data, especially unstructured data, to unexpected levels. As a result, the volume, velocity and variety of data increase in unison: as the volume rises, so does the speed at which data needs to be processed, and the types of unstructured data multiply as well. To manage all this, businesses have to implement the necessary data governance.

Storage and Retention

Big data has increased the variety and volume of data considerably, which means more data storage is a necessity. The terms ‘data storage’ and ‘data integration and provisioning’ are often used interchangeably, but they are distinct, and governance must address them separately and appropriately. Storage normally means the way data is physically retained by the organization; in conventional data management, the storage technology dictates requirements such as size and structural limitations. Along with retention practices and budget limitations, often driven by compliance, these requirements restrict the amount of data a business stores at any given time.

Security and Privacy

Security and privacy are the major areas of focus for conventional data governance. But new technologies expand the scope of what needs to be secured and protected, emphasizing the need for additional protection. Even though “privacy” and “security” are thought to be one and the same, they are not.

Security strategies safeguard the integrity, confidentiality and availability of data created, acquired and maintained by the company. Security is exclusively about protecting data, while privacy is about protecting entities, such as individuals and businesses. Privacy programs ensure that an individual’s rights and interests in controlling, using and accessing their personal details are protected and upheld. However, a privacy program cannot exist without a successful security strategy. Privacy needs often inform policies in large-scale security operations, but the privacy program itself influences the processes and technology needed to implement the necessary controls and protections.

As far as IoT is concerned, security is one of the most crucial aspects. The regular addition of systems and devices constantly introduces new vulnerabilities. Even though business comes first, protection is possible only if organizations secure the network along with every touchpoint data travels through. Thanks to IoT, data security is no longer just about permissions and access on a given system; data protection now incorporates network segmentation, data encryption, data masking, device-to-device authentication and cybersecurity monitoring. That’s a whole lot more than what traditional governance programs envision.

Escalated Digital Transformation

The changes in digital transformation will be far-reaching. In fact, the new data governance measures will accelerate the process, rewarding organizations that commit to more than mere compliance. Moreover, a stronger foundation in data governance provides organizations with various benefits, such as increased operational efficiency, improved decision-making, better data understanding, greater revenue and higher data quality.

Data-driven businesses have long enjoyed these advantages, using them to dominate and disrupt their respective industries. But they aren’t meant only for large businesses. The moment is right for your company to de-silo data governance and treat it as a strategic operation.

Data governance is changing, and you need to work hard to keep up or get left behind. Following the guidance above will help keep your governance program healthy and ensure your company is prepared for GDPR.

Author: Rahul Sharma

Data Protection Officers’ (DPO) Role in Overseeing GDPR Compliance

The transition period since the passage of the General Data Protection Regulation (GDPR) is coming to an end, and organizations across the globe are now clamoring to prepare for the several new requirements surrounding data collection and processing. One requirement in particular has been arguably the most debated and amended provision throughout the GDPR’s legislative process; this obligation calls for staffing, something not previously seen in European law outside Germany: certain organizations will have to employ, appoint or contract a designated data protection officer (DPO) by the time the regulation comes into force in May 2018.

The GDPR has made it mandatory for any organization that controls or processes large volumes of personal data, including public bodies (with the exception of courts), to appoint a DPO. This requirement is not limited to large organizations; the GDPR states that as long as the activities of the processor or controller involve the ‘regular and systematic monitoring of data subjects on a large scale’, or the entity carries out large-scale processing of special categories of personal data such as data detailing race, religious beliefs or ethnicity, it must comply with this requirement. This essentially means that even sole traders who handle certain types of data may have to hire a DPO.

A DPO may be appointed to act on behalf of a group of public authorities or companies, depending on their size and structure. The regulation also allows EU member states to specify additional circumstances that require the appointment of a DPO. In Germany, for example, every business that has more than nine employees and permanently processes personal data has to appoint a DPO.

What is a DPO?

A data protection officer is a position within an organization that independently advocates for the responsible use and protection of personal data. The role of Data Protection Officer is a fundamental part of the GDPR’s accountability-based approach. The GDPR necessitates that a DPO is responsible for liaising with end-users and customers on any privacy related requests, liaising with the various data protection authorities, and ensuring that employees remain informed on any updates regarding data protection requirements. The DPO has to have expert knowledge of data protection practices and laws, on top of having a solid understanding of the company’s organizational and technical structure.

The controller and the processor shall ensure that the data protection officer is involved, properly and in a timely manner, in all issues which relate to the protection of personal data.

Article 38, GDPR

What is the Role of the DPO?

The DPO is mainly responsible for ensuring a company is compliant with the aims of the GDPR and other data protection policies and laws. This responsibility may include setting rational retention periods for personal data, developing workflows that facilitate access to data, clearly outlining how collected data is anonymized, and monitoring all these systems to make sure they work towards the protection of personal data. Additionally, the DPO should also be available for inquiries on issues related to data protection practices, the right to be forgotten, withdrawal of consent and other related rights the GDPR grants them.

The GDPR affords the data protection officer several rights on top of their responsibilities. The company is obligated to provide any resources the DPO requires to fulfill their role, as well as ongoing training. They should have full access to the company’s data processing operations and personnel, a considerable amount of independence in the execution of their tasks, and a direct reporting line to the highest level of management within the company. The GDPR expressly prevents the dismissal of the data protection officer for executing their tasks and puts no limit on the length of their tenure.

How Does DPO Differ from Other Security Roles?

Currently, most organizations already have Chief Information Officer (CIO), Chief Data Officer or CISO roles; however, these roles differ from the DPO role. The holders of these positions are typically responsible for keeping the company’s data safe and ensuring that the data a company collects is used to enhance business processes across the organization. The DPO, on the other hand, works mainly to protect the customer’s privacy. This means that instead of retaining ‘valuable’ data indefinitely, or exploiting insights collected in one business line to inform another, the DPO is there to make sure only the minimum data required to complete a process is collected and retained. The GDPR creates huge demand for DPOs, but the job itself is far from easy.

Who Qualifies to be DPO?

The GDPR doesn’t specify the exact credentials a DPO should have. The role itself is multi-faceted: advising on obligations under the GDPR is a legal role, monitoring compliance falls under audit, the data protection impact assessment is more of a privacy specialist role, and working closely with a supervisory authority demands an understanding of how the authority operates.

I. Level of expertise – A DPO must possess a level of expertise commensurate with the complexity, sensitivity and volume of data the company processes. A high-level understanding of how to develop, implement and manage data protection programs is crucial, and these skills should be founded on wide-ranging experience in IT. The DPO should also be able to demonstrate an awareness of evolving threats and fully comprehend how modern technologies can be used to avert them.
II. Professional qualities – A DPO doesn’t have to be a lawyer, but they have to be an expert in European and national data protection law, including in-depth knowledge of the GDPR. They should be able to act independently. This points towards the need for a mature professional who can build client relationships while ensuring compliance without taking an adversarial position.
III. Ability to execute tasks – The DPO has to be able to demonstrate high professional ethics, integrity, leadership and project management experience, so they can request, mobilize and lead the resources needed to fulfill their role.

Best Practices for ITAR Compliance in the Cloud

The cloud has become part and parcel of today’s enterprise. However, remaining compliant with the International Traffic in Arms Regulations (ITAR) demands extensive data management aptitude. Most of the regulatory requirements covered by ITAR aim to guarantee that an organization’s materials and information regarding military and defense technologies on the United States Munitions List (USML) are shared only within the US or with US-authorized entities. While this may seem like a simple precept, in practice attaining it can be extremely difficult for most companies. Defense contractors and other organizations that primarily handle ITAR-controlled technical data have been unable to collaborate on projects while using cloud computing practices that have a proven track record of fostering high performance and productivity. Nevertheless, the hurdles impeding the productivity opportunities of the cloud can be overcome, and the practices that govern the processing and storage of export-controlled technical data are evolving.

Full ITAR compliance in the cloud is not an end state but a continual journey of protecting information assets. In the long run, being ITAR compliant boils down to having a solid data security strategy and the defensive technology to execute it.

Utilize End-to-End Encryption

In September 2016, the DDTC published a rule that established a ‘carve-out’ for the transmission of export-controlled software and technology within a cloud service infrastructure, provided the data is protected with ‘end-to-end’ encryption. The proviso is that the data has to be encrypted before it crosses any border and has to remain encrypted at all times during transmission. Likewise, any technical data potentially accessed by a non-US person outside or within the United States has to be encrypted ‘end-to-end’, which the rule delineates as the provision of continuous cryptographic protection of data between the originator and the intended recipient. In a nutshell, the means of decrypting the data can’t be given to a third party before it reaches the recipient.

The native encryption of data at rest offered by most cloud providers fails to meet this definition of end-to-end encryption, because the cloud provider likely has access to both the encryption key and the data, and therefore has the ability to access export-controlled information. Organizations have to ensure that the DDTC definition of ‘end-to-end’ encryption is met before storing their technical data in a public or private cloud environment; otherwise they will be in violation of ITAR.
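For illustration, the Python sketch below encrypts a file client-side with the third-party cryptography package before anything is uploaded, so the cloud provider only ever sees ciphertext and never holds the key. The file names are placeholders, and key distribution to the authorized recipient must happen over a separate, controlled channel.

```python
# A sketch of client-side ("end-to-end") encryption before data ever reaches
# a cloud provider, using the third-party `cryptography` package.
from cryptography.fernet import Fernet

# Generate and retain the key locally -- never hand it to the cloud provider.
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt the technical data file on the originator's machine.
with open("technical_drawing.pdf", "rb") as fh:
    ciphertext = cipher.encrypt(fh.read())

# Only the encrypted blob is uploaded; decryption happens at the recipient,
# who must receive the key through a separate, controlled channel.
with open("technical_drawing.pdf.enc", "wb") as fh:
    fh.write(ciphertext)
```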

Classify Data Accordingly

Most technologies are not limited to a single use. Whenever an organization that handles technical data related to defense articles shares information about a service or product, steps have to be taken to make sure any ITAR-controlled data is carefully purged in its entirety. Classification entails reviewing existing business activities and contracts to establish whether they fall under ITAR. The process requires a good understanding of licensing terms, court interpretations, agency directives and other guidance. In order to successfully navigate the nuances and complexities of ITAR, organizations have to collect enough metadata to catalog, separate and classify information. For easy identification, the data should be classified into categories such as ‘Public Use’, ‘Confidential’ and ‘Internal Use Only’ (a toy example follows below). Classifying data is a prerequisite to a foolproof data leakage prevention (DLP) implementation.
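As a toy illustration of that kind of classification, the sketch below tags a document with one of the categories mentioned above based on keyword matches; the keyword lists are purely illustrative, and a real implementation would draw on richer metadata and human review.

```python
# A simplified sketch of keyword-based document classification.
# The keyword lists are illustrative placeholders only.
SENSITIVE_KEYWORDS = {"usml", "export controlled", "technical data"}
INTERNAL_KEYWORDS = {"internal memo", "draft"}

def classify_document(text: str) -> str:
    lowered = text.lower()
    if any(keyword in lowered for keyword in SENSITIVE_KEYWORDS):
        return "Confidential"
    if any(keyword in lowered for keyword in INTERNAL_KEYWORDS):
        return "Internal Use Only"
    return "Public Use"

print(classify_document("Export controlled technical data for the actuator assembly"))
# Confidential
```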

Develop a Data Leak Prevention (DLP) Strategy

Accidental leaks owing to user error and other oversights occur more often than most would care to admit. Mistakes that can happen will happen. Establishing a set of stringent policies to prevent users from mishandling data, whether accidentally or intentionally, is crucial to ITAR compliance. Organizations should have a strategy in place to guarantee the continual flow of data across their supply chains while protecting that data from the following employee scenarios:
  • Well-meaning insiders – employees who make an innocent mistake.
  • Malicious insiders – employees with ill intent.
  • Malicious outsiders – hackers, enemy states, competitors and others looking to commit corporate espionage.

Control Access to Technical Data

Access control is a well-known technique used to regulate who can view or use the resources in a computing environment. It can be employed at a logical or physical level: physical access control restricts access to physical areas and IT assets, while logical access control lets IT administrators establish who is accessing information, what information they are accessing and where they are accessing it from. Roles, permissions and security restrictions should be established beforehand to ensure that only authorized U.S. persons have access to export-controlled technical information. Multi-factor authentication strengthens access control by making it extremely difficult for unauthorized individuals to access ITAR-controlled information by compromising an employee’s credentials. A minimal sketch of such a check follows below.
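The sketch below illustrates a logical access check in Python: access to export-controlled data requires an authorized role, verified U.S.-person status and a completed multi-factor authentication step. The group names and user record shape are illustrative assumptions.

```python
# A minimal sketch of logical access control for export-controlled data.
# Group names and the user record shape are illustrative assumptions.
AUTHORIZED_ROLES = {"itar-engineering", "export-compliance"}

def can_access_itar_data(user: dict) -> bool:
    """Grant access only to MFA-verified, authorized U.S. persons."""
    return (
        user.get("is_us_person", False)
        and user.get("mfa_verified", False)
        and bool(AUTHORIZED_ROLES & set(user.get("roles", [])))
    )

print(can_access_itar_data(
    {"name": "A. Smith", "is_us_person": True, "mfa_verified": True,
     "roles": ["itar-engineering"]}))   # True
```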

Establish Security Policies and Train the Staff Well

An ITAR-specific security strategy is the cornerstone of data security practice, and its policies should address both network and physical security considerations. ITAR is riddled with complications that make it easy for organizations to make mistakes if they are not careful. An organization is only as secure as its weakest link, and in most cases that is the staff. A solid security policy on paper simply does not cut it: without proper staff training, a compliance strategy will be largely ineffective because it does not tie in with actual organizational procedures. Investing in end-user training is the only way to ensure security policies are implemented.

In Closing

Organizations have turned to government clouds to manage the complex regulatory issues associated with the cloud. Platforms like AWS GovCloud have developed substantial capabilities that enable organizations subject to ITAR to implement robust document management and access control solutions. When paired with FileCloud, organizations can build and operate document and information management systems that satisfy the strictest security and compliance requirements.

 

Author: Gabriel Lando