Archive for the ‘Admin Tools and Tips’ Category

FileCloud Aurora – All About Single-Click Actions


In November 2020, FileCloud released update 20.2 – a complete overhaul of our Sync, Mobile, and browser UIs and functionality. We at FileCloud have been working on this for a very, very long time, and so we're incredibly proud to present to you: FileCloud Aurora.

Today, we’re going to be covering one of the most important functionality additions to our new UI: Single-Click Actions.

For a comprehensive overview of all of FileCloud Aurora’s new features, please visit our previous blog post Introducing FileCloud Aurora! Brand New UI and DRM Capabilities!.

What Are Single-Click Actions?

We’re all about helping you and your team save time. FileCloud Aurora brings workflow efficiency to the next level with tons of quality of life updates that help you work fast. One of these is the introduction of single-click, or quick, actions.

What does this mean? Well, you might have noticed that when you hover over any file or folder in your library, a toolbar appears on the item. It might look something like this:

These icons represent the most commonly used actions you might need to take with your files. Hover over each icon to see what it does, or read on for a more in-depth explanation of each one.

List of Single-Click Actions


The first icon, which looks like an eye, is the Preview action. It lets you preview your files, folders, and documents. Because FileCloud Aurora integrates fully with Microsoft Office and supports many other file formats, you'll be able to open and preview your files without actually needing to download them.


The second icon, which looks like a downwards-pointing arrow, is the Download action. It lets you download whichever file, folder, or document you're hovering over in one simple, quick click.


The third icon, which looks like a pencil, is the Edit action. Depending on your file type, you may be able to edit it directly in the browser or open the file in Office Online. Some files, of course, cannot be edited, such as video or audio files.

You may have noticed that at the end of each hover description, there are different keys in square brackets – for instance, [Enter] for the Preview icon. These are keyboard shortcuts, and are covered in our All About Keyboard Shortcuts blog post!


The fourth icon, which looks like three dots connected to each other, is the Share action. Clicking it will open the following window:

From there, you can copy the file's share link, change sharing options, or remove the share.

Direct Link

Next up is Direct Link, which looks like a hyperlink icon. This opens up the following window:

Here, you can copy a direct link to your file, maximizing efficiency when sharing files with your team.


This is the Copy quick-action, which looks like two overlapping squares. Clicking it will open a window where you can select where to make a copy of the file.


This is the Move quick-action, which looks like an arrow. Clicking it will open a window where you can select where to move the file.


Next up is the Delete quick action, which looks like a trash can. This is pretty self-explanatory — clicking this will delete the file in question.


Last but certainly not least is the Lock option, which looks like a, well, lock icon. This opens up the following window:

It lets you lock a file to indicate that you're working on it. That way, your team won't be able to modify the file and overwrite your changes, or vice versa.


Gone are the days of tedious right-clicking and dragging stuff around. Need to copy or download something? Just hit the corresponding button! Need to move something to another folder? Before you start dragging it all over your screen, just hit the single-click action for it and a window will open for you to choose your destination folder. 

It might seem like a subtle change, but when your team is shaving seconds off every file action, it all adds up to tons of time saved. Thanks for tuning in, and we hope you're enjoying our FileCloud 20.2 update: FileCloud Aurora!

System Admin Tools For 2020

System Admin


The world today is a connected one, heavily dependent on computer systems for everyday work. That dependency brings with it the need to keep these systems running smoothly and efficiently. Maintaining their health, performance, optimization, and resource utilization falls on the shoulders of system administrators. Since this is typically a 24/7 job in which multiple system parameters must be monitored, system administrators turn to tools that aid them, hence the emergence of system admin tools.

Since the system admin job involves many activities, there are many tools to help with it. Typically, these tools are used for monitoring, backups, resource allocation, user and account management, security, software installation and updates, performance tuning, and so on. In some organizations, the system admin role may also overlap with a DevOps role, so the available tools span all of the categories mentioned above. Here is a list of a few tools that belong in a system admin's toolkit.


A remote monitoring and management (RMM) tool that automates most repetitive tasks, Pulseway enables system admins to manage all their IT systems from anywhere. The interface is simple to use and integrates many business needs, such as cloud backup, business management, monitoring, and endpoint protection. It also comes with a mobile app that can be used anytime, anywhere, making the job easier for system admins.

Pulseway allows system admins to do more by automating tasks like deployment, backups, reporting, and patching. With more than 6,000 happy customers across 80 countries, Pulseway is best suited for businesses that want to do more for their customers using efficient tools.


JumpCloud securely manages and connects your users to their systems, applications, files, and networks. These systems can run Windows, Mac, or Linux, and with a diverse array of supported protocols, any app or resource can be integrated and centrally controlled, irrespective of its location. As a cloud-based service, it is easy to adopt and enables users to become more productive by better leveraging cloud resources. It helps with user, system, server, and access management through single sign-on with multi-factor authentication. Its tagline happens to be ‘Directory-as-a-Service® is Active Directory® and LDAP reimagined’.


On almost every system admin's list for quite some time, Wireshark is a multi-platform network protocol analyzer that lets you inspect network traffic at a microscopic level. It allows live capture as well as offline analysis, and captured data can be browsed via a GUI or via the TTY-mode TShark utility. It can read many capture file formats and provides decryption support for many protocols as well. What makes it even more interesting is that it is open source (released under the terms of the GNU General Public License), with a huge community and a lot of documentation and training support for its users.
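As a quick taste of the TTY-mode TShark utility mentioned above, here are two illustrative invocations; the interface name, packet count, and display filter are example values, not taken from the post:

```shell
# Capture 100 packets from an interface and save them to a file
tshark -i eth0 -c 100 -w capture.pcapng

# Later, analyze the saved capture offline with a display filter
tshark -r capture.pcapng -Y "http"
```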


Ansible is a simple yet powerful automation solution that helps organizations improve productivity by reducing complexity, cutting down on errors, and improving collaboration. It increases accountability and compliance at the organizational level and frees up more resources for innovation. By automating repetitive tasks, system admins are freed from them and can focus more on improving the health, efficiency, and performance of their systems.

It is also easy to learn and use, helping with tasks like app deployment, configuration management, and workflow orchestration. It is open source, reliable, and secure, with more than a quarter-million downloads per month. It comes with powerful features like a visual dashboard, audit trails, centralized automation execution, and role-based access control that can improve speed, scale, and stability across the enterprise IT environment.
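To make the configuration-management idea concrete, a minimal Ansible playbook might look like the following sketch; the host group and package name are illustrative assumptions, not drawn from the post:

```yaml
# playbook.yml -- a hypothetical example: ensure nginx is installed
# and running on every host in the "webservers" group
- hosts: webservers
  become: yes
  tasks:
    - name: Ensure nginx is installed
      ansible.builtin.apt:
        name: nginx
        state: present

    - name: Ensure nginx is running and enabled at boot
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: yes
```

Running `ansible-playbook playbook.yml` would then converge each host toward this desired state.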


Puppet's solutions help automate how you continuously deliver, remediate, and manage your multi-cloud environment while keeping it compliant. It is open source, and its IT management solution makes managing a global infrastructure safe and simple. It provides a single platform, with a single audit trail, for all your cloud automation use cases. Puppet powers automation at some of the biggest brands in the world, and it has a great community, knowledge base, and support documentation.
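Puppet takes a declarative approach: you describe the desired state and the agent converges the node to it. A minimal manifest might look like this sketch; the nginx resources are purely illustrative assumptions:

```puppet
# A hypothetical manifest: declare the desired state and let the
# Puppet agent converge the node to it
package { 'nginx':
  ensure => installed,
}

service { 'nginx':
  ensure  => running,
  enable  => true,
  require => Package['nginx'],  # start the service only after the package exists
}
```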


A modern monitoring tool that helps system admins see it all in one place: servers, clouds, metrics, apps, and the team. It seamlessly integrates with SaaS and cloud providers, automation tools, monitoring and instrumentation, source control and bug tracking, databases, server components, and more. It provides full visibility to monitor, troubleshoot, and optimize application performance. It also helps you proactively monitor user experience, analyze and explore log data, and build interactive, real-time dashboards for a better and easier understanding of the environment.


An open-source password manager meant primarily for Windows, though it supports Mac and Linux through Mono. It manages multiple passwords in one database, locked securely with a master key. It supports strong encryption algorithms such as AES and Twofish, is easily portable, and can export the password list to, or import it from, multiple file formats. It provides multi-language support, allows for search and sort operations, and can even generate random passwords.
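The random-password generation mentioned above is a simple idea to sketch. The following Python snippet illustrates the general technique using the standard library's `secrets` module; it is an illustration of the concept, not KeePass's own implementation:

```python
import secrets
import string

def generate_password(length=16):
    """Generate a random password from letters, digits, and punctuation.

    Uses the OS's cryptographically secure RNG via the `secrets` module,
    which is the right choice for security-sensitive values.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())      # a different 16-character password each run
print(generate_password(24))    # length is configurable
```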


A free and powerful source code editor meant specifically for Windows, Notepad++ can edit multiple files using tabs, with syntax highlighting for over 70 languages. It has a fast response time even when working with large files, making it extremely useful for system admins searching through large logs. It also supports macros and bookmarks, as well as advanced search and replace. Drag and drop, zoom in and out, auto-completion, brace and indent guideline highlighting, and line operations are some of the other features that make it a powerful source code editor.


A partition and disk imaging tool that also allows for backup, recovery, and deployment, Clonezilla supports multiple file systems. An image can be encrypted (AES-256) and restored to multiple devices. It is licensed under the GNU General Public License (GPL) Version 2 and has versions for a single machine as well as large server deployments. It is fast and allows OS installation as well as taking snapshots and then deploying them over a network.


FileCloud provides features for easy business file sharing with complete control over business data. Whether you are sharing sensitive documents, large files, or legal or medical data, FileCloud gives you a secure document sharing solution. FileCloud also offers efficient and reliable data backup and provides the tools to protect data across all devices: automatic desktop and mobile backup, unlimited file versioning, and deleted-file recovery, with secure backup and restore across all platforms (Windows, macOS, Linux) and devices (desktops, laptops, and smartphones). Granular permissions let you give each user the access rights you want them to have: write, read-only, restricted access, and so on. FileCloud also gives you the option to password-protect your files, set an expiry date, and receive file change notifications.

FileCloud 20.1 Updates & End-Point Backup

FileCloud 20.1 Updates, End-point Backup and More!


FileCloud 20.1 Update

With Microsoft Office tag support, expanded notification configuration and much more, our 20.1 update is brimming with quality of life features! Read on for a little more detail about what we’ve done to take your file-sharing experience to the next level.

  • More Ways to Control and Configure Notifications: Notifications — can't live without 'em, but they sure can get annoying when they're pinging you about stuff that doesn't matter. So we made everything about them configurable, including the type of notification, per user and per folder. Gone are the days of email clutter!
  • Built-In Microsoft Office Tag Support: FileCloud now allows users to import Office tags into FileCloud as well as seamlessly search by them. With most organizations using Microsoft Office for daily professional and personal needs, FileCloud has always heavily prioritized Office integrations — and this update is no exception. Users can now even set up Content Classification Engine (CCE) and Data Leak Protection (DLP) rules based on Office tags. Check out our support documentation here.
  • Automatic Team Folder Permission Updates: New team folder permission updates are now applied during folder-syncing operations, ensuring that no one is able to access secure data at any point of the syncing process. Read up on how to enable this here.
  • SMS Integration & Gateways: FileCloud now seamlessly integrates with custom SMS gateways to enhance your security and convenience. Receive verification and authentication messages from your custom SMS gateways on your personal mobile devices!
  • Integration with Multiple Identity Providers (IdP): FileCloud now integrates with multiple identity providers to support large enterprises that authenticate and authorize through multiple IdPs.
    Learn More →


The Importance Of FileCloud Endpoint Backup

Seeing a system error and wondering if your data got lost or compromised is terrifying. Realizing that your data is gone, irretrievable, and that you just lost hours and hours of tedious work is devastating. We know the feeling all too well, and that is why we made sure you will never go through that rollercoaster of emotions!

You can configure endpoint backup settings for all users or specific users, and view all users' backed-up files from your FileCloud account. You can enable this on the server side to allow end users to back up all their important files from different devices using FileCloud Sync (Windows/macOS) and the mobile applications. All files automatically go to "My Files/Backups," with a new folder created for each device.

Learn More →


FileCloud Tips: Microsoft Office365 Preview

FileCloud now supports Microsoft Office365 Preview and you can easily enable this from your FileCloud Admin Portal settings to allow users to see MS Office documents the way they are intended to be seen.

Learn More →


AirSend’s Email to Channel

Send messages via email directly to AirSend channels without having to switch windows! You can now choose to message your client directly via our desktop apps, reply from your email with your unique channel email address, or respond via email to a channel notification. Whether you're using AirSend to communicate with your team, clients, or acquaintances, your convenience is always our priority.


System Admin Guide to Continuous Integration


Continuous integration, continuous delivery, and continuous deployment (CI/CD) have existed in the developer community for decades. Some organizations have involved their operations counterparts, but many haven’t. For most organizations, it’s imperative for their operations teams to become just as familiar with CI/CD tools and practices.

Continuous integration is a coding practice that enables a development team to make and merge small code changes frequently, under version control.

CI allows developers to continuously push changes to a single repository, from which automated builds and tests are run.

Usually, traditional system admin roles do not involve developing continuous integration pipelines, but if you are looking to dive into DevOps, getting hands-on experience with continuous integration tools is a must. Because most modern applications require developing code across different platforms and tools, the team needs a mechanism to integrate and validate its changes. The technical goal of CI is to establish a consistent and automated way to build, package, and test applications. With a consistent integration process in place, teams are more likely to commit code changes more frequently, which leads to better collaboration and software quality.
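The build, package, and test cycle described above can be sketched as a toy pipeline script. The stage commands below are placeholder `echo`s standing in for real checkout, build, and test commands:

```python
import subprocess

def run_step(name, cmd):
    """Run one pipeline stage as a shell command; return True on success."""
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    ok = result.returncode == 0
    print(f"[{name}] {'ok' if ok else 'FAILED'}")
    return ok

def ci_pipeline():
    """A toy CI run: each stage executes only if the previous one passed."""
    stages = [
        ("checkout", "echo pulling latest commit"),  # stand-in for 'git pull'
        ("build",    "echo compiling sources"),      # stand-in for 'make'
        ("test",     "echo running unit tests"),     # stand-in for 'pytest'
    ]
    for name, cmd in stages:
        if not run_step(name, cmd):
            return False  # fail fast: a broken stage blocks everything after it
    return True
```

A real CI server runs a loop like this on every commit, adding notifications, artifact storage, and history on top.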

Why use Continuous Integration?

  • Reduced integration risk: Almost every project is developed by more than one person, which greatly increases the risk of errors during integration. Depending on the complexity of the code, many changes may have to be reconciled. CI comes to the rescue and alleviates these issues by allowing for regular integration.
  • Higher quality of code: As the risks drastically reduce, much of the time and manpower can be diverted to creating more functionality-oriented code.
  • Code in version control that works: Committing something that breaks the build immediately triggers a notification, preventing anyone from pulling broken code.
  • Easier testing: Retaining the different versions and builds of the code makes it easier for QA to understand, locate, and trace bugs.
  • Decreased deployment time: Automating the deployment process frees up a lot of time and manpower.
  • Increased confidence: The reduced chance of a failure or breakdown gives developers peace of mind and thereby helps in delivering greater productivity and higher-quality products.

As you learn more about these tools and start bringing these practices into your company or your operations division, you'll quickly understand the need for and importance of CI tools. You will increase your own productivity as well as that of others. With the growing number of CI/CD tools on the market, teams may find it difficult to select the right ones. Let's get into the tools a bit more. We'll briefly cover some highly-rated tools and share links to more information.



Jenkins is an automation tool written in Java with built-in plugins for continuous integration tasks. It is used to continuously build and test projects, making it easier to integrate changing code.

Jenkins allows for faster delivery of software by working with a large number of deployment and testing technologies. It also accelerates the development phase via the automation of tasks. It is primarily a server-based application; it ships with an embedded servlet container, but can also run in a web server such as Tomcat.

It allows a lot of flexibility, with additional plugins providing extra features that are often hard to find elsewhere unless you want to spend time building them yourself. One of its key features is pipelines, which let you easily chain different jobs, even across different repositories and projects.
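Those pipelines are usually written as a Jenkinsfile checked into the repository. A minimal declarative sketch might look like this; the stage names and shell commands are placeholders, not from any specific project:

```groovy
// Jenkinsfile -- a hypothetical declarative pipeline chaining three stages
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh 'make build' }     // placeholder build command
        }
        stage('Test') {
            steps { sh 'make test' }      // placeholder test command
        }
        stage('Deploy') {
            when { branch 'main' }        // only deploy from the main branch
            steps { sh './deploy.sh' }    // placeholder deploy script
        }
    }
}
```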

Circle CI

CircleCI is a strong SaaS-based CI product that enables testing in the cloud. The YAML-based configuration system allows individual developers to test and push CI/CD changes effectively and quickly. The variety of supported platforms allows teams to standardize on a single solution rather than spreading across several products. CircleCI is the world's largest shared continuous integration and continuous delivery (CI/CD) platform, and the central hub where code moves from idea to delivery. As one of the most-used DevOps tools, processing more than 1 million builds a day, CircleCI has unique access to data on how engineering teams work and how their code runs. Companies like Spotify, Coinbase, Stitch Fix, and BuzzFeed use it to improve engineering team productivity, release better products, and get to market faster.

Automated builds! This is really why you get CircleCI: to automate the build process. This makes building your application far more reliable and repeatable. It can also run tests and verify that your application works as expected.

Straightforward CI tooling: there's no need to spin up a CI server like Jenkins or TeamCity to get things moving.
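The YAML-based configuration mentioned above lives in `.circleci/config.yml`. A minimal sketch might look like this; the Docker image tag and test command are illustrative assumptions:

```yaml
# .circleci/config.yml -- a hypothetical minimal configuration
version: 2.1
jobs:
  build:
    docker:
      - image: cimg/python:3.11   # example image; pick one matching your stack
    steps:
      - checkout
      - run:
          name: Run tests
          command: pytest          # placeholder for your test command
workflows:
  main:
    jobs:
      - build
```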



Bamboo provides automatic build generation, saving a lot of time and manual effort. It is very easy to use, with a simple user interface, and supports both continuous integration and continuous deployment: whenever a change lands in Bitbucket, it is automatically integrated with the existing code and a build is generated. Bamboo can run automated test cases against the build, which saves a lot of time, and it offers various options for configuration management; you can easily configure multiple branches, write a script, and execute it. Bamboo provides various agents for running builds, and its support community is always ready to help.


  • Versatility: You can use Bamboo to manage Java, Node, or .NET build plans, spin up Windows or Linux build agents, or install it on a Mac and build there as well.
  • Atlassian integration: Bamboo integrates with other Atlassian products like Bitbucket, Stash, and JIRA. If a company commits to the entire Atlassian stack, work can be tracked through the whole development lifecycle, which is really useful.
  • Continuous integration: Bamboo kicks off builds with each check-in to your source control system, enabling faster consumption of changes and quicker turnaround when you encounter a problem.
  • Extensibility: Bamboo is capable of triggering multiple additional processes on completion of a build, including integration tests, deployment tests, and the like. This extensibility can take you from a schedule-based system to a trigger-based system with little wasted time.
  • Suite integration: Bamboo's easy integration with the rest of the Atlassian suite makes for huge efficiency gains. Being able to see which check-in triggered a build, as well as which JIRA issues went into that check-in, makes for complete traceability.


TeamCity is the go-to tool for building and deploying packages for a variety of platforms like .NET, Java, and JavaScript. It unifies the build and deployment needs of diverse projects on a single platform, solving the build and release issues teams previously faced and reducing the time it takes to get to production. TeamCity makes sure your software gets built, tested, and deployed, and notifies you appropriately in any way you choose. It's a continuous integration and deployment server from JetBrains that takes moments to set up, shows your build results on the fly, and works out of the box. And best of all, it's free by default.

Once set up and configured, it's incredibly easy to test and release a piece of code, and to diagnose problems across multiple teams using the online platform. It is highly customizable: which outputs to test, what to save, and on which machines the tests should be run. Non-regression tests can also be run locally while developing them, to ensure they meet your robustness requirements before being executed remotely.



GitLab is well suited for any project that requires revision tracking and collaboration with other contributors. It supports the standard features of Git and adds its own recipe to the features other Git SaaS providers offer, including issue tracking, pull request management, and, more recently, artifact and package management. GitLab has also been a leader in bringing CI into the repo ahead of its competitors. Of course, not all of these features need to be used; if all a dev needs is to track code, GitLab handles that just as well as any other cloud or self-hosted repo.


  • Pipelines: GitLab Pipelines is an excellent way to get started with pipelines easily and without much overhead. And because it is all encapsulated within GitLab itself, integrating your code into a pipeline is even easier. Just a little bit of code and voila: you have at least a minimum viable pipeline.
  • VCS: GitLab is, of course, a great version control system.
  • Usability: GitLab has put a significant amount of focus on usability. They've drilled down and ensured that companies and individuals can use the tool the way they need to.
  • Groups: GitLab makes setting permissions on projects extremely easy. Other version control systems make it difficult to set things granularly enough, but GitLab allows you to group things in a granular way for your projects.
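Those pipelines are defined in a `.gitlab-ci.yml` file at the root of the repository. A minimal sketch might look like this; the job names and commands are placeholders:

```yaml
# .gitlab-ci.yml -- a hypothetical two-stage pipeline
stages:
  - build
  - test

build-job:
  stage: build
  script:
    - echo "Compiling sources"   # placeholder build command

test-job:
  stage: test
  script:
    - echo "Running unit tests"  # placeholder test command
```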


Travis CI

Travis works great for CI/CD pipelines. It's easy to configure and has great integrations with tools you are probably already using, like GitHub, and it's compatible with many popular languages. It automates the build process and handles test cases; you can run test cases on macOS and Linux at the same time. Configuring Travis is easy using a lightweight YAML file, and you don't need to set up your own server to run it. It also provides free support for public repositories. It is ready and easy to use, with none of the extra configuration other CI tools like Jenkins require: simply integrate GitHub or your version control system, and whenever you push code it is tested and integrated. Multiple jobs allow you to run and test simultaneously on different environments and operating systems. It's free for public projects, so you don't have to pay for testing open-source projects, and you don't have to maintain a hosting server; Travis CI handles updates and hosting. Plugins and integrations with third-party tools are available, but limited.
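The lightweight YAML file mentioned above is a `.travis.yml` in the repository root. A minimal sketch might look like this; the language version and commands are illustrative, not from any specific project:

```yaml
# .travis.yml -- a hypothetical minimal configuration
language: python
python:
  - "3.8"                           # example interpreter version
install:
  - pip install -r requirements.txt
script:
  - pytest                          # placeholder test command
```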



The best thing about Buddy is its intuitive UI, where you can set up deployment pipelines easily. The UI really helps when introducing a CI/CD culture to a whole engineering team, since members can set things up themselves with a few button clicks rather than learning a YAML configuration (though one is still available for advanced users).

It also has a plethora of built-in actions that connect with so many services that they cover almost all of our use cases. Even when we can't find what we need, its integration with Docker Hub is really helpful, so we can set up our own custom Docker image.

Buddy's GUI is really awesome, and creating pipelines with Buddy is really simple. For a beginner who doesn't understand CI/CD much, start using Buddy and you will get to know everything related to CI/CD in a day. Buddy is also constantly improving and adding new features.


Best Practices for Secure File Transfer with FileCloud


It's a new decade, and many businesses are handling customers all over the world. Being aware of the options and using secure file transfer methods is vital for any business to stay on top of its game. One way to be sure you are safeguarding customer data is to look for security features that protect against man-in-the-middle attacks. This type of attack is similar to eavesdropping on a conversation: your data could be intercepted by someone “listening in” during the transfer between your computer and a server or other device. Secure file transfer methods ensure that eavesdropping, as well as other breaches or privacy violations, does not happen.

Today, secure file transfer methods are designed to keep your company from experiencing a breach of data during transfer. There are a few additional things that can reduce security issues.
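One simple, widely used complement to encrypted transfer is verifying a checksum on arrival, which detects tampering or corruption in transit. Here is a Python sketch of the general technique (an illustration of the concept, not a FileCloud feature):

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Compute a file's SHA-256 digest, reading in chunks to bound memory use."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_transfer(path, expected_digest):
    """Return True if the received file matches the digest the sender published."""
    return sha256_of(path) == expected_digest
```

The sender publishes the digest over a trusted channel; if the received file's digest differs, the file was altered or corrupted on the way.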

Sharing Permissions 

Utilize permissions for your file-sharing account. When setting up new users, take advantage of granular permissions. Don't get caught wishing you could find out who accessed your files or how a file got deleted. Granular permissions let you give each user exactly the access rights you want them to have: write, read-only, restricted access, and so on. FileCloud also gives you the option to password-protect your files, set an expiry date, and receive file change notifications.



Activity logs & Notifications

Email can get messy and is not a recommended method to securely send information. Use activity logs and notifications to track your secure file transfers. The activity log on your file-sharing account will show all user activity and can be filtered by day or username. Notifications alert you when specific files have been uploaded or downloaded. Notifications help you keep track of the who, what, and when for your file shares.

With FileCloud you can receive notifications about files or folders stored in all types of folders. Your administrator sets your default notification settings which determine whether notifications are sent to you when:

  • a file or folder is shared with you
  • one of the following actions is performed (by you or another user) on a file or folder you have access to:
    • a file or folder is uploaded
    • a file or folder is downloaded
    • a file or folder is shared
    • a file or folder is deleted
    • a file or folder is renamed
    • a file is updated
    • a file is previewed in the browser or one of the mobile apps
    • a file or folder is locked


Data Backup

FileCloud offers efficient and reliable data backup and provides you with the tools to protect data across all devices: automatic desktop and mobile backup, unlimited file versioning, and deleted-file recovery, with secure backup and restore across all platforms (Windows, macOS, Linux) and devices (desktops, laptops, and smartphones). Media files from iOS and Android devices are backed up to FileCloud automatically. FileCloud automatically stores versions of files as they change and makes it easy to get back to previous versions; administrators have full control over the number of versions to keep. Recycle bin support allows deleted files and folders to be recovered by users or administrators quickly and effectively, and even deleted files from network folders can be restored.


Security and Compliance

The truth is, your employees may find USB sticks, external hard drives, webmail, and smart devices more convenient than traditional organizational tools when it comes to transferring files. Unfortunately, this creates a gap in control and visibility for IT departments, exposing companies to compliance and security risks. As a result, organizations are slowly shying away from consumer-grade secure document sharing to more secure options.

FileCloud offers multiple data-protection-compliant solutions, including HIPAA compliance, FINRA compliance, and EU data residency. FileCloud enables enterprises to run their own HIPAA- or FINRA-compliant enterprise file share, sync, and endpoint backup solution. FileCloud's security architecture offers security, privacy, and data ownership to enterprises, and it supports the federal security standard FIPS 140-2. With FileCloud, you can rest assured that your corporate data is well protected on your servers and employee devices.


Having a secure file sharing solution not only gives you peace of mind about data security, but it also protects your reputation in the process. In addition to adopting these basic security measures, you need to consider the program you use to facilitate your secure file transfers. Adopting the right solution could mean the difference between a great reputation and the loss of business.

FileCloud provides features for easy business file sharing with complete control over business data. Whether you are sharing sensitive documents, large files, legal or medical data, FileCloud gives you a secure document sharing solution.

Working from Home and the Threat to Business Continuity

work from home

In January 2020, the World Health Organization declared the outbreak of a new coronavirus disease a Public Health Emergency of International Concern. As the epidemic's full implications became apparent, governments across the world began issuing stay-at-home orders, lockdowns, and travel restrictions in a bid to halt the spread of the COVID-19 coronavirus and prevent public health systems from being overloaded with patients affected by the disease. This forced many organizations and institutions to rethink their mode of operations. The result was remote work becoming a requirement for many, sometimes overnight.

With the ubiquity of mobile devices in this digital age and most business applications being available via cloud services, migrating workloads from the office to home should be a relatively simple process. However, very few organizations were prepared for large-scale remote work.

For most IT teams, the challenge lies in ensuring their IT infrastructure can handle most, if not all their employees working remotely. But for the sake of the health and safety of their workforce, it’s a challenge they have to rise to while ensuring the continuity of their business operations during an unprecedented pandemic situation that has affected everyone across the globe.

Remote Working is Not a New Phenomenon

Remote work is an already attractive option for employees who prefer greater flexibility. It completely eliminates commuting time for workers with familial obligations; and as the workforce segment that supports aging family members continues to grow, the demand for flexible working arrangements will only rise. However, while remote work is increasingly demanded by workers and facilitated by technology, according to Gartner, most enterprises (93%) defer to supervisors to decide who works from home and when they do. But due in part to an innate lack of trust, only 56 percent of supervisors actually allow their employees to work from home, even if there are supporting company policies in place to facilitate it.

Enterprises that have already invested in the cloud from an infrastructure perspective, or largely rely on Software-as-a-Service (SaaS) apps are naturally at lower risk of experiencing technical difficulties during this time. The inescapable use of remote work for business continuity should signal to all enterprises that it is time to revisit their remote work policies and redesign them for robust use.

While the shift to remote work has accelerated, we believe that employees and organizations will begin to see the advantages of remote work and become better adapted to it. Though the coronavirus disease has certainly affected work patterns in the short term, there will almost inarguably be far-reaching global technology implications, including increased demand for solutions such as Virtual Desktop Infrastructure.

Infrastructure is Key to WFH Productivity

A lot of the technology currently used to facilitate remote working has been around for decades. For most organizations, the underlying software, hardware, and support infrastructure have been designed to accommodate a small subset of the workforce. Performance, reliability, security, and the availability of applications and data are crucial. With the number of new at-home students and workers in the tens of millions, organizations are rushing to shore up their infrastructure to support new users while ensuring basic performance expectations are met.

Apart from the clear-cut tasks of procuring the correct tools and licenses and deploying them quickly across a broad workforce, IT teams must also make sure that their colleagues can reliably utilize those resources. This includes access to basic apps like email, file sync, and sharing, via remote desktop access and other virtual desktop infrastructure.

Naturally, unanticipated stress has been put on remote working technologies, leading to security and bandwidth concerns. Whether existing on-premise enterprise setups can cope with the sudden, yet prolonged, increase in users trying to access the organization's applications remotely rests largely on the quality of the network connections available.

Right off the bat, most businesses tried to assess how much capacity they would require by running one-day tests. Agencies like NASA ran remote networking stress tests to understand what impact adding thousands of new remote workers would have on their networking capacity. Teleworkers have to connect to their data repositories, applications, and other offerings to maintain business continuity. The effective transmission reliability and throughput of a VPN in combination with the internet can easily become a bottleneck.

A VPN is Not the Answer to All Your Problems

Software-as-a-service (SaaS) cloud is becoming mainstream among enterprises. Several applications now run in the cloud, making it easier than ever to leverage and acquire those apps to make enterprises more efficient and agile. The cloud has given rise to a world of mobility where workers can be productive from anywhere since access to applications is no longer tied to physically being in the office. With the coronavirus outbreak triggering a sudden influx in the need for work-from-home precautions, the need for easily accessible applications has never been more apparent.

As it stands, typical enterprise infrastructure involves applications that are hosted and installed within the office and are only accessible from the confines of the office network. Most file servers are hosted on either Windows or Linux, with the two main types of file systems being NFS (Network File System) on UNIX/Linux and CIFS (Common Internet File System) on Windows. A vast majority of organizations that have been in operation for a while typically use CIFS or NFS file systems. NFS allows resources to be hosted remotely over a network by mounting file systems so that the interaction feels as if it were local.

CIFS is also cross-platform, and folders can be shared over a local network or across the internet. Accessing files on a CIFS network over a VPN via a mobile network is possible, but the connection can be patchy, and access to client applications will likely be limited and extremely slow. Whenever a remote worker needs to access an enterprise application or document, they must first turn on the VPN application, which in turn grants them access to information sitting within the company network. For VPN applications to work optimally, they require adequate network throughput and VPN hardware capacity. While this is easy to achieve on systems built for about 20-30 percent of employees working remotely, it can easily be overloaded when this number skyrockets, regardless of how many licenses are available for the VPN application.

Overloaded VPN concentrators may require massive injections of additional hardware and rule management. However, VPN hardware is not the type of thing you can easily pick up at your local Best Buy; it calls for a rigorous procurement and installation process that could end up taking weeks. The VPNs that organizations typically rely on to manage and access critical files have multiple limitations. With full office closures happening overnight and workers being asked to work from home, several of those workers are going to turn on their VPN to access business files and applications, only to discover that their connections are down. IT support staff should brace themselves for a flood of calls from frustrated colleagues trying to navigate the nuances of remote connectivity; support issues will be hard to diagnose remotely because they involve third-party hardware and networks.

Luckily, several apps that once had to be hosted within an office network have since transitioned to the cloud. Productivity applications like Google G-Suite and Microsoft Office 365, and communication platforms like Slack, have seemingly eliminated the need for VPN applications. Sadly, VPNs are still central to how several enterprise workers access their files and applications. And an overnight migration to a public cloud architecture is not a practical solution for them.

Making the Most Out of What You Have

In the event that user or performance issues arise, there are steps that IT teams can take to reduce the load on networks and various on-premise IT resources. Managing employees' expectations is important, and making them aware of possible degradation in the performance of the services and applications they rely on to complete their tasks may lessen the amount of stress the IT department is under.

To ensure workers remain productive and business continuity is not stifled, IT teams must find a way of enabling mobile access and file sync for data that lives behind a corporate firewall without the need for a VPN and without re-configuring permissions, whilst utilizing existing LDAP or Active Directory authentication. Several enterprises have heavily invested in scalable Network Attached Storage solutions like EMC Isilon and NetApp Filer. Solutions like NetApp provide low-latency access to files as network shares via a WLAN or LAN. But they can still be used as part of a remote-work infrastructure with virtual desktop infrastructure applications.

There is no one-size-fits-all software, infrastructure, or bandwidth purchase that can solve this problem using legacy approaches, including VPNs. Fortunately, the solution is simple: one that allows enterprise data to sit securely on-premise at the office and still be accessible to users via the cloud rather than through a VPN. FileCloud can help remote workers collaborate, access, and share files securely with ease by mapping existing file servers on EMC Isilon and NetApp Filer as network folders and instantly making them available on networks outside the office. FileCloud easily integrates with existing Active Directory, NTFS file permissions, and network shares, giving employees low-latency access to large files without having to recreate complex permissions.


Author: Gabriel Lando

Are System Admins Obsolete as Everyone is Moving to Server-less Infrastructure?


With everything moving to the Cloud and server-less infrastructures, are sysadmins becoming obsolete? What can sysadmins do to stay relevant in IT?

System administration roles are diversifying into system engineers, application engineers, DevOps engineers, virtualization engineers, release engineers, cloud engineers, etc. Because of the scale of cloud computing and an additional layer of virtualization, infrastructure engineering is managed as code using automation tools such as Chef and Puppet. The rise of computing and analytics has brought tremendous elasticity to, and placed tremendous stress on, back-end infrastructure through the deployment of distributed computing frameworks such as Hadoop, Splunk, etc. Applications are scaling horizontally and vertically across data centers. The emergence of the cloud has shifted the traditional role of the system admin toward the cloud engineer, but infrastructure design and basic system services such as mail servers, DNS, and DHCP remain intact.

| Learn Linux

If you want to make your career as a Linux system administrator, you need to learn the basics of Linux along with hands-on practice. I would recommend the full Red Hat Certified System Administrator (RHCSA) course. The videos are available on YouTube and via torrent as well. RHCSA is an entry-level certification that focuses on actual competencies in system administration, including installing and configuring a Red Hat Enterprise Linux system and attaching it to a live network running network services.

| Get comfortable with scripting language & automation tools

Use Bash for everyday scripting: putting jobs in cron and parsing logs. Don't stop at Bash by itself; you'll want to learn a little sed and awk, and focus a lot on regular expressions. Regular expressions can be used in most languages.

After you have spent a few weeks or months on Bash, learn Python. After a few weeks with Python, you will easily see where it makes sense to use Bash versus Python.

Perl is a good general-purpose language to use if you deal with a lot of files or platform-independent sysadmin automation, including Solaris and AIX. It's a bit hard to learn but easy to use.
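As a quick illustration of the kind of everyday log parsing these scripting skills are used for, here is a minimal Python sketch; the log lines and field layout are hypothetical examples in a common (Apache-style) format:

```python
import re
from collections import Counter

# Hypothetical access-log lines.
LOG_LINES = [
    '10.0.0.1 - - [12/Mar/2020:10:15:32 +0000] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.2 - - [12/Mar/2020:10:15:40 +0000] "GET /missing HTTP/1.1" 404 128',
    '10.0.0.1 - - [12/Mar/2020:10:16:01 +0000] "POST /login HTTP/1.1" 200 256',
]

# One regular expression captures the client IP, the request, and the status code.
LOG_PATTERN = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "([^"]+)" (\d{3}) \d+$')

def count_statuses(lines):
    """Tally HTTP status codes, skipping lines that do not match the pattern."""
    counts = Counter()
    for line in lines:
        match = LOG_PATTERN.match(line)
        if match:
            counts[match.group(3)] += 1
    return counts

print(count_statuses(LOG_LINES))  # Counter({'200': 2, '404': 1})
```

The same one-liner logic could be done with awk or grep; the point is that the regular expression carries over unchanged between languages.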

Some of the important automation tools for system admins are:

  1. WPKG – The automated software deployment, upgrade, and removal program that allows you to build dependency trees of applications. The tool runs in the background and it doesn’t need any user interaction. The WPKG tool can be used to automate Windows 8 deployment tasks, so it’s good to have in any toolbox.
  2. AutoHotkey – The open-source scripting language for Microsoft Windows that allows you to create mouse macros. One of the most advantageous features this tool provides is the ability to create stand-alone, fully executable .exe files from any script that can run on other PCs.
  3. Puppet Open Source – I think every IT professional has heard about Puppet and how it has captured the market during the last couple of years. This tool allows you to automate your IT infrastructure from acquisition to provisioning and management stages. The advantages? Scalability and scope!

| Stay up to date with the current generation of infrastructure standards & practices

  1. Analytical skills: From designing to evaluating the performance of the network and the systems
  2. People skills: A network and computer systems administrator interacts with people from all levels of the organization.
  3. Technical know-how: Administrators have to work with different kinds of computers and network equipment, so they should be familiar with how to run these.
  4. Quick thinking: An administrator must be very responsive and must be able to quickly come up with solutions to every problem that pops up.
  5. The ability to multi-task: Administrators often deal with different kinds of problems on top of what they usually do.

It'll be systems administration under a different title, like “Cloud Engineer,” doing things differently, probably using automation tools and infrastructure-as-code management and deployment.

Coding, automation, and scripting are all very important skills to have now and for the future.

Ultimately, someone will need to administer the systems and deal with the operations of the tech stack. So, yes, it has a future. The type of company varies tremendously; any company could use a sysadmin. It may be an unexciting job of maintaining a local file share and email server, or something challenging like keeping a thousand servers running.

VPN vs VDI vs RDS: Which Remote Access Is Best For You?

As the world slowly and inevitably moves toward working from home, most organizations have begun actively exploring remote work options. As such, security has become one of the prime considerations for businesses. After all, ensuring the safety of your organizational data and processes is just as important as ensuring business continuity. Virtual digital workspaces that manage seamless workflows among employees spread across the globe must, of course, aim to consistently better their user experiences.

However, hackers also thrive during such crises, as they know that many people may willingly or unknowingly compromise on safety aspects to meet their business needs. Any breach of data can prove to be a costly affair, especially when taking into account the loss of reputation, which takes a long time to overcome, if at all. It is important, then, to understand and evaluate the remote work options and choose wisely. The most popular options considered are Virtual Private Network (VPN), Virtual Desktop Infrastructure (VDI), and Remote Desktop Services (RDS).

What is a VPN?

In an online world, a VPN is one of the best ways you can ensure the security of your data and applications while working remotely. This is not just about logging in and working securely every day. It also protects you from cyber attacks like identity thefts, when you are browsing the internet through it. This is simply an added layer of security through an application that secures your connection to the Internet in general if using a personal VPN, or to a designated server if using your organizational VPN.

When you try to connect to the Internet through a VPN, your traffic is taken through a virtual, private channel that others do not have access to. This virtual channel (usually a server hosting the application) then accesses the Internet on behalf of your computer, masking your identity and location, especially from hackers who are on the prowl. Many VPN solution providers ensure military-grade encryption and security via this tunnel. Usually, the security encryption differs based on the needs of the individual, and organizations choose what works best for them.

VPNs came into being from this very concept of enterprises wanting to protect their data over public as well as private networks. Access to the VPN may be through authentication methods like passwords, certificates, etc. Simply put, it is a virtual point-to-point communication channel for the user to access all the resources (for which they have the requisite permissions) of the server/network to which they are allowed to connect. One of the drawbacks could be a loss in speed due to the encrypted, routed connections.

What is VDI?

This is used to provide endpoint connections to users by creating virtual desktops hosted through a central server. Each user connecting to this server will have access to all resources hosted on the central server, based on the access permissions set for them. Each VDI is configured for a user, and it will feel as if they are working on a local machine. The endpoint through which the user accesses the VDI can be a desktop, laptop, or even a tablet or a smartphone. This means that people can access what they want, even while on the go.

Technically, this is a form of desktop virtualization aimed at providing each user their own Windows-based system. Each user’s virtual desktop exists within a virtual machine (VM) on the central server. Each VM will be allocated dedicated resources that improve the performance as well as the security of the connection. The VMs are host-based; hence, multiple instances of the VMs can exist on the same server or a virtual server which is a cluster of multiple servers.  Since everything is hosted on the server, there is no chance of the data or identity being stolen or misused. Also, VDI ensures a consistent user experience across various devices and results in a productivity boost.

What is RDS?

Microsoft launched Windows Terminal Services with MS Windows 2008, and this later came to be known as remote desktop services. What it means is that a user will be allowed to connect to a server using a client device, and can access the resources on the server. The client accessing the server through a network is a thin client which need not have anything other than client software installed. Everything resides on the server, and the user can use their assigned credentials to access, control and work on the server as if they are working on the local machine. The user is shown the interface of the server and will have to log off the ‘virtual machine’ once the work is over.  All users connected to the same server will be sharing all the resources of the server. This can usually be accessed through any device, even though working through a PC or laptop will provide the best experience. The connections are secure as the users are working on the server, and nothing is local, except the client software.

The Pros and Cons of each

When considering these three choices of VPN, VDI, and RDS, many factors come into play. A few of these that need to be taken into account are:

  1. User Experience/Server Interface – In VDI, each user can work on their familiar Windows system interface, which increases the comfort factor. Some administrators even allow users to customize their desktop interface to some extent, giving that individual desktop feel which most users are accustomed to. This is not the case in RDS, wherein each user of the server is given the same server interface, and resources are shared among them. There is a very limited choice of customization available, and mostly all users have the same experience. Users will have to make do with the server flavor of Windows rather than the desktop flavor that they are used to. The VPN differs from either of these in that it only provides an established point-to-point connection through a tunnel, and processing happens on the client system, as opposed to the other two options.
  2. Cost – If cost happens to be the only consideration, then VPN is a good choice to go with. This is because users can continue to use their existing devices with minimal add-ons or installations. An employee would be able to securely connect to their corporate network and work safely, without any eavesdropping on the data being shared back and forth. The next option is RDS, the cost of which will depend on a few other factors. However, RDS does save time and money, with increased mobility, scalability, and ease of access, with no compromise on security. VDI is the costliest of the three solutions, as it needs an additional layer of software for implementation. Examples of this software are VMware or Citrix, which help run the hosted Virtual Machines.
  3. Performance – When it comes to performance, VDI is a better solution, especially for those sectors that rely on speed and processing power like the graphics industry. Since the VDI provides dedicated, compartmentalized resources for each user, it is faster and makes for a better performance and user satisfaction. VPN connections, on the other hand, can slow down considerably, especially depending on the Client hardware, the amount of encryption being done, and the quantum of data transfer done. RDS performance falls in between these two options and can be considered satisfactory.
  4. Security – Since it came into being for the sake of ensuring the security of corporate data when employees work outside the office, VPN provides the best security of these three remote work options. With VDI and RDS, the onus of ensuring security lies with the administrators of the system, in how they configure and implement it. But it is possible to implement stringent measures to ensure reasonably good levels of security.
  5. End-User Hardware – Where VDI and RDS are concerned, end-user hardware is not of much consequence, except for establishing the connection. In these cases, it is the server hardware that matters, as all processing and storage happen on it. But for VPN connections, end-user hardware configurations are important, as all processing happens on the client after the secure connection is established. VDI offers access clients for Windows, Mac, and at times even iPhone and Android. RDS offers clients for Windows and Mac; however, a better experience is delivered with Windows.
  6. Maintenance – VPN systems usually require the least maintenance once the initial setup is done. VDI, however, can prove to be challenging, as it requires all patches and updates to be reflected across all VMs. RDS needs less maintenance than VDI, but more than VPN systems. At most, RDS will have to implement and maintain a few patches.

The Summary

Looking at the above inputs, it is obvious that there is no single best solution for every business. Each enterprise will have to look at its existing setup, the number of employees, the business goals, the need for remote work, and the challenges therein, and then decide which factors need to be given more weight. If the number of employees is small, perhaps VPN or RDS may be the better way to go. But if your need is better performance owing to graphics-heavy work, then we highly recommend taking a look at the VDI option. VDI may also be the way to go if you have a large number of employees.

System Admin Guide To Dev Ops


DevOps is another buzzword that has made waves in the IT industry in the last few years. Earlier, we used to have System Administrators, and traditionally, they only configured, monitored, and maintained systems. These included critical systems such as servers, and one of the key SLAs for System Administrator performance would be system downtime and overall performance levels. This role was mostly removed from the development process, and there was clarity on their responsibilities.

However, when the nature of software development evolved from the traditional waterfall to the current Agile models, all roles associated with it had to evolve. Thus came DevOps, which in a way is a complex role with overlapping responsibilities. It is a combination of development and operations, and it is a culture, not a technology. Due to the nature of Agile development, the DevOps role and responsibilities are intertwined closely with each phase of the Agile development model. So, there are no clear lines drawn as in the earlier case of System Admin roles.

A DevOps person has to be active from the design phase of the software project. The responsibilities include the creation of a development pipeline, Quality Assurance, as well as traditional System Admin activities. There are overlapping activities for the whole team. This improves coordination within the team to ensure aggressive Agile delivery timelines are met. There is seamless coordination among the team members, and a developer may well have to perform a production activity, and so on. In smaller organizations, there may not be a separate DevOps person or role, and it may well be the developer/tester/System Administrator who does the same.

The Demarcations

So, in a way, while the traditional System Administrator role would be limited to systems and their configurations, a DevOps role would also be involved in the deployment of the software and the operations therein. Unlike the clearly defined expectations of a System Administrator role, here the challenges are greater and more complex. The opportunities to learn and grow are also equally abundant.

Many developers move into this role when they get into deployment and operations. So do System Administrators who can code and also understand the various development phases and nuances. System Administrators may not necessarily have a holistic view of the technical environments they work in, as DevOps engineers do. For DevOps personnel, it is a must, and they always have to stay up to date with the challenges thrown at them by upgrading their competencies to meet them.

The DevOps Role

As mentioned above, a DevOps role would need to have a good hold of the complete picture of a software project. Hence, they need to know what happens during the design phase, and what is planned in the pipeline at every scheduled sprint and so on. They may have to don multiple hats by collaborating with developers and other IT operations staff, to ensure efficient implementation of application development and delivery processes. This would mean a thorough understanding of software development and all associated processes. The other expected competencies could be knowledge of version control tools, application build, test and deployment processes including automation, server configuration, and monitoring, etc.

DevOps personnel will be involved in Continuous:

  • Planning
  • Code Inspection
  • Integration
  • Testing
  • Builds
  • Deployment
  • Monitoring
  • Improvement
  • Innovation

Hence, it is a given that they would be highly technical and aware of the emerging industry trends as these activities rely heavily on tools. An ability to quickly understand, analyze and manage the operational challenges arising across the multiple phases of software development and deployment, is a great asset. The mindset and approach of a DevOps person would be at a much higher level than that of a traditional System Administrator. Their roles are more complex, challenging, constantly evolving, and highly critical in software teams.

Key Skills

Considering that the role is a holistic one, it is obvious that DevOps people would need to possess a broad set of skills, apart from being able to pick up new ones on the go. Typically, the skills required for a DevOps role are:

  • System/Application/Resource/Database Administration
  • Configuration Management
  • Scripting/Automation
  • A good understanding of software development processes
  • Continuous Integration/Continuous Deployment (CI/CD)
  • Cloud Computing

The first three in this list are skills, which competent System Administrators would have, depending on the nature of the assignments they have handled. The last three are the typical new age DevOps related skills which they have to pick up if they want to make the transition to a DevOps role. There is a slight chance that certain System Administrators may not have done any scripting at all. In such a scenario, scripting and automation may also have to be picked up.

Also, picking up configuration management tools like Puppet, Ansible, and Chef will help, as they configure and automate processes and applications at scale. At times, DevOps people may also have to learn to integrate build management tools like Gradle and Maven into CI platforms for better agility. Version control is a must, and this is where GitHub can help.

Why New Skills?

Since DevOps is not an isolated role and is deeply integrated into the software development and deployment process, a good understanding of the same is imperative. This helps them get involved, analyze and contribute productively at all stages including the deployment stage.

The CI/CD pipeline is a very important aspect of a DevOps role, and hence this is a new skill that will necessarily have to be picked up. In the earlier waterfall model, the software was deployed only after it was complete and final. Today, though, the Agile model works differently; developers continuously push software changes into a centralized repository. Automated builds and tests are enabled in this repository, making it possible to deploy the various versions of the software.

Many tools allow this continuous integration and deployment, and a few of the CI tools that are popular are Jenkins, Buddy, Bamboo, TeamCity, Travis CI, GitLab CI, Codeship, etc. These tools are designed to help minimize the application downtime during the continuous integration and deployment of the application.
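The core loop these CI tools automate, running each stage in order and stopping on the first failure, can be sketched in a few lines of Python; the stage names and commands below are placeholders, not any particular tool's configuration format:

```python
import subprocess
import sys

# Hypothetical pipeline stages: each entry is a stage name and the command a CI
# server would run for it. Real tools (Jenkins, GitLab CI, etc.) express the
# same idea in their own configuration files.
STAGES = [
    ("lint",  [sys.executable, "-c", "print('lint ok')"]),
    ("test",  [sys.executable, "-c", "print('tests ok')"]),
    ("build", [sys.executable, "-c", "print('build ok')"]),
]

def run_pipeline(stages):
    """Run each stage in order; stop at the first failure, like a CI job."""
    for name, command in stages:
        result = subprocess.run(command, capture_output=True, text=True)
        if result.returncode != 0:
            print(f"stage '{name}' failed")
            return False
        print(f"stage '{name}' passed: {result.stdout.strip()}")
    return True

run_pipeline(STAGES)
```

A real pipeline adds triggers (a push to the repository), artifact handling, and deployment steps on top of this basic fail-fast loop.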

There are continuous monitoring tools as well that can help with tracking the performance of the various aspects of the ecosystem, with logs, alerts, and notifications. Lansweeper, CloudWatch, Stackdriver, Snort, SolarWinds, AppDynamics, New Relic, BigPanda, PagerDuty, and Nagios are a few examples.

Cloud Computing

Most organizations have already moved to the Cloud or are considering the same. The Cloud ensures high availability and convenient access from anywhere in the world, at all times. So, with more and more apps being deployed to the Cloud, a good understanding of Cloud Computing is also a must for a DevOps person. They would also need to configure servers and services on these Cloud hosts. So understanding and being able to efficiently implement Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS) is a key competency. This may involve learning how applications are containerized, deployed, and supported on Cloud infrastructure.

Cloud Computing, in combination with the DevOps culture, provides enterprises with an unbeatable advantage in terms of agility, scaling, and consistency in code quality and application performance. The various Cloud models of Public, Private, and Hybrid make immense cost-effective resources available to organizations, making the cultural change to DevOps possible. Cloud resources also enable better collaboration and communication among the various team members, making it possible for them to work seamlessly across global teams. A whole lot of the aspects needed for CI/CD, like high availability, security, fault tolerance, on-demand infrastructure, measurement and monitoring, resource management and configuration, provisioning, and data governance, are built into most Cloud solutions.

So, the combination of Cloud Computing and DevOps becomes a rather potent one: one with the power to help organizations beat the competition on time to market while ensuring quality products. The improvement in productivity is just another bonus gained in the bargain.

Retention in Record Management Software

Records management software consists of computer programs designed to systematically control records within an organization. Such software can help manage records in any format, and many programs have advanced capabilities for managing electronic records. However, evaluating, selecting, and purchasing records management software depends on several factors: no one system is right for all users, and the system you choose should fit the size and complexity of your organization.

But records management systems also serve a more general function: they greatly simplify the many workflow processes required to create, distribute and maintain accurate records. They have this in common with (the more general-purpose) document management software and for this reason, there are many similarities between the two.

As mentioned above, records are a very specific type of document that can serve as legal proof or evidence. Think of it like squares and rectangles: a record is a type of document, but not all documents are records. As such, records are often necessary in order to prove compliance with regulations and laws.

In some industries, compliance must be shown at periodic intervals. For example, a food distributor uses records to demonstrate compliance with food safety regulations and may need to do so every year or every quarter, as mandated by local regulations.

There are many general functions you should look for in records management software, such as:

  • Help features: These can include user-friendly online tutorials, easy-to-understand error messages, and support and training by the vendor
  • Menus and commands: These should be easy to understand and should be organized in a logical way
  • Speed and accuracy: You should be able to enter and retrieve data quickly, with reduced retrieval times
  • Generation of standard reports: The software should generate reports easily and print them out as they are seen on the screen
  • Ease of use: The software itself should be user-friendly; you should be able to use it the day it is installed
  • Customization: You should be able to customize the software to meet the specific needs of your organization without sacrificing the benefits of standard practices
  • Ability to manage records throughout their life cycle: Many organizations purchase software that manages records throughout their life cycle—from creation and active use to inactive use and disposition. Be sure to purchase software that meets your records life cycle requirements.
  • Ability to manage records in all formats: The software should help you manage records in any format, if necessary, including paper, electronic, micrographic, and audio-visual records
  • Free-text searching across fields: Any text maintained by the software should allow free-text or keyword searching across fields


Retention in Record Management

Companies face legal and regulatory requirements for retaining records. Each company’s specific requirements should be detailed in a Record Retention Policy and accompanying Record Retention Schedule. These provide both a legal basis for records compliance and a consensus on which records should be saved and for how long they must be kept, as well as, equally important, when they should be deleted (destroyed).

Records retention policies and procedures should include:

  • Archiving policies for various document types. In other words, how long these different document types stay “active” before they can be sent to the library archive.
  • A destruction policy for setting how long information is retained before it is destroyed. This avoids destroying vital documents prematurely, or not at all. Retaining everything indefinitely is not good records management.
  • Procedures for capturing the proper information for regulatory reporting. It is critical for organizations to assess their current state of preparedness to determine how well they can safely and efficiently respond to an e-discovery request or governmental inquiry.
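The heart of a retention schedule is a simple calculation: given a record's type and creation date, determine whether its retention period has elapsed. An illustrative sketch of that check (the record types and retention periods below are hypothetical, not taken from any actual regulation or policy; it also ignores calendar edge cases such as leap days):

```python
from datetime import date

# Hypothetical retention periods in years, keyed by record type.
RETENTION_YEARS = {"invoice": 7, "contract": 10, "memo": 2}

def disposition(record_type, created, today):
    """Decide whether a record is past its retention period."""
    years = RETENTION_YEARS.get(record_type)
    if years is None:
        return "no policy - review manually"
    # Naive anniversary arithmetic; a real system would handle leap days.
    expiry = created.replace(year=created.year + years)
    return "destroy" if today >= expiry else "retain"

print(disposition("memo", date(2018, 3, 1), date(2021, 6, 1)))  # past its 2-year period
```

A real records system would also apply holds that override the schedule, log the destruction decision for audit purposes, and route borderline cases to a records manager rather than deleting automatically.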


Some of the major players in records management and document handling include:

  1. Alfresco Content Services – Alfresco Content Services provides open, flexible, scalable Enterprise Content Management (ECM) capabilities. Content is accessible wherever and however you work and easily integrates with other business applications. Alfresco puts sophisticated technology to work to automate the records management process, making end-to-end processes happen automatically and invisibly. Using the system is easy, and compliance “just happens,” with little or no user intervention.
  • Easy-to-set rules and metadata automatically drive what needs to be declared as a record and where it should be filed in the File Plan, while records remain easily accessible by sanctioned users
  • Powerful rules and business processes help file, find, review and audit records, saving time for users and administrators
  • Configurable File Plans provide effortless control over retention schedules for review, hold, transfer, archive and the destruction of records
  • Supports records declaration directly from your desktop
  2. OpenText Records Management – OpenText Records Management (formerly Livelink ECM – Records Management) delivers records management functions and capabilities to provide full lifecycle document and records management for the entire organization. This product allows your organization to file all corporate holdings according to the organizational policies, thereby ensuring regulatory compliance and reducing the risks associated with audit and litigation. OpenText Records Management provides options for classifying information quickly and easily. Classify information interactively, with a single click, automatically inherit retention schedules and classifications by moving records en masse into folders, classify records based on roles or business processes, or auto-classify content. Further, increase efficiency through the automatic import of retention policies and other data into OpenText Records Management. OpenText Records Management maps record classifications to retention schedules, which fully automates the process of ensuring records are kept as long as legally required, and assuredly destroyed when that time elapses. When a retention schedule expires, final decisions can be made to destroy the object, retain it for a period of time, or keep it indefinitely.
  3. FileCloud – FileCloud retention policies deliver control and compliance for files and their folder groupings in the Cloud. Retention policies allow administrators to automate processing related to protecting data, helping secure digital content for compliance and enhancing the management of digital content for other internal reasons. Without the right systems within your cloud solution to discover and preserve sensitive content, the time and costs spent on litigation and handling legal cases can quickly spiral out of control. FileCloud retention policies are created and attached to stored files and folders. These special policies allow administrators to define conditions that enforce a set of restrictions on how each file or folder can be manipulated. For example, administrators can create a retention policy that disables a user’s ability to delete or edit any of the files and folders named in the policy. To resolve the issue of conflicting policies, FileCloud ranks retention policies by what best protects and retains the digital content.
  • Admin Hold – Outranks all other policies and prevents any update or delete of digital content for an indefinite period of time.
  • Legal Hold – Freezes digital content to aid discovery or legal challenges. During a legal hold, file modifications are not allowed.
  • Retention – Identifies digital content to be kept around for an unlimited amount of time before being deleted or released.
  • Archival – Moves and stores old organizational content for the long term. No deletion is allowed until a specified time period is reached. After this time, content gets moved to a specific folder.
  • Trash Retention – Can be configured for automatic and permanent deletion of all files in the Trash bins, or to expire with no actions.
  4. Box – Box allows customers to configure automated policies, known as retention policies, to control the preservation and deletion schedules of their enterprise documents. Retention policies enable the business to maintain certain types of content in Box for a specific period of time and to remove content from Box that is no longer relevant or in use after a specific period.

Box Governance offers support for metadata-driven retention policies, where retention policies can be applied to individual files based on custom metadata. This also enables customers to configure retention policies at the file level, in addition to the global and folder levels.

Admins and Co-Admins who have permission to manage policies can create any type of retention policy, including a metadata-driven retention policy.

  5. Microsoft SharePoint Online – Records Management is one of the key components of an Enterprise Content Management (ECM) system such as Microsoft SharePoint. There are a couple of ways in which you can manage records in SharePoint and SharePoint Online. You can create a dedicated Records Center site that serves as an archive, with documents copied to the archive based on the retention policy. Another option is to manage records “in place”: you leave a document in its current location on a site, declare it as a record, and apply the appropriate security and retention properties. SharePoint Online can be used to create a complete records management system using its “off-the-shelf” capabilities. Planning the structure of the record center and the libraries it will contain is a key consideration, but fundamentally the process is, at its core, quite simple:
  • Content Types for the documents.
  • Policy Rules for moving documents to the Records Center and the Exceeds Retention Records Center.
  • Content Organizer Rules for distribution to the correct Library in each Record Center.
  • Library List Views for automated approval notifications.
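The conflict-resolution ranking described above for FileCloud (Admin Hold outranks Legal Hold, which outranks Retention, and so on) amounts to picking the highest-ranked policy among those attached to a file. An illustrative sketch of that precedence logic, with ranks assumed from the ordering in the list above rather than from any documented API:

```python
# Higher number = stronger protection; ordering mirrors the list above.
POLICY_RANK = {
    "admin_hold": 5,
    "legal_hold": 4,
    "retention": 3,
    "archival": 2,
    "trash_retention": 1,
}

def effective_policy(attached_policies):
    """Of all policies attached to a file, the highest-ranked one wins."""
    return max(attached_policies, key=POLICY_RANK.get)

# A file under both a retention policy and a legal hold is governed
# by the legal hold, since it ranks higher.
print(effective_policy(["retention", "legal_hold"]))
```

Ranking policies this way gives administrators a predictable rule: whatever best protects the content always takes effect, regardless of the order in which policies were attached.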