CiGen stands for ‘computational intelligence generation’. Depending on what source you read, computational intelligence either sits alongside artificial intelligence or is a subset of it. Everything we set out to do as a pure-play RPA and Intelligent Automation specialist therefore sits in that space.
CiGen is a pure-play specialist in the realms of Robotic Process Automation (RPA) and Intelligent Automation. That is, it is 100% specialised in automation technology, in order to harness its full potential.
For instance, as a pure-play RPA specialist, CiGen recently introduced its RPA Robots for Hire service to complement the existing on-site, full-service model. RPA Robots for Hire will allow companies of all sizes to lease CiGen’s own fleet of robots for the short, medium or long term. In particular, smaller companies can leverage the service as a means to remain cost competitive in their market sphere.
Robotic Process Automation
A very broad definition describes robotic process automation (RPA) as “a useful tool for maximally efficient use of a company’s available resources” (where ‘tool’ covers both software and services). Efficiency comes from removing the risk of error, a risk that is significantly higher when dull, repetitive tasks are performed by humans.
Software robots, by contrast, help reduce costs because they are virtually error-free and never fatigue. They can work 24/7, making rules-based decisions and performing repetitive tasks with precision, consistency and speed. They are also highly scalable.
They can thereby improve business competitiveness (which is why small and medium businesses need them too) and the accuracy of results. As Daniel Pullen, General Manager at CiGen, puts it, “software robots ensure that end users are receiving accurate and current data”. They also enable high-level communication, foster human-machine collaboration, support compliance modernisation, and more.
The most important trait of RPA is functionality, operationalised as the capacity to facilitate optimal resource allocation and thereby the best possible business results, all other things held constant. Processes that are good candidates for automation typically share the following characteristics:
- Manual and repetitive. Processes that involve a high degree of structured, repetitive manual input, or activities that are susceptible to human error.
- Rules based. Activities with clear processing instructions (template driven), with decision making based on standardised and predictive rules.
- Low exception rate. Activities with a low rate of variable outcomes that would otherwise require complex human exception handling.
- Readable inputs. Processes requiring readable input types, including text based data, user interface (UI) activities (keyboard strokes, mouse clicks, etc.), Optical Character Recognition (OCR) and green screen. Processes can operate within any application or environment, such as desktop, Citrix, web, server, etc.
- High volume/high frequency. High transaction volume processes (including batch processes), such as those that run end of day and end of month. High frequency processes, such as those that run intra-daily, daily, and weekly.
- Mature and stable. Processes that are stable, documented and predictable within the business, with operational costs that are consistent and well-defined.
- Measurable savings. Our recommendation is to commence automation with processes that can be evaluated against a known cost/time base. The cost savings or benefit gained can typically be expressed in terms of greater accuracy, faster response times, reduced staffing costs and higher productivity from reallocated staff.
- Automation to avoid. We strongly recommend you avoid automating processes that are either marked for rework or those that will continually change over the short-medium term.
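As a rough sketch, the checklist above can be operationalised as a simple weighted score for ranking candidate processes. The criterion names and weights below are invented for the example, not a CiGen methodology.

```python
# Illustrative weights: which suitability criteria matter most is a
# judgement call for each organisation.
CRITERIA = {
    "manual_and_repetitive": 3,
    "rules_based": 3,
    "low_exception_rate": 2,
    "readable_inputs": 2,
    "high_volume": 2,
    "mature_and_stable": 2,
    "measurable_savings": 1,
}

def automation_score(process: dict) -> int:
    """Sum the weights of every criterion the process satisfies."""
    return sum(w for name, w in CRITERIA.items() if process.get(name))

# A hypothetical invoice-entry process that ticks every box.
invoice_entry = {name: True for name in CRITERIA}
print(automation_score(invoice_entry))  # 15 -> a strong candidate
```

A process marked for rework or continual change would simply fail the “mature and stable” criterion and score lower, matching the recommendation above.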
In the context of robotic process automation (RPA), “hardening” refers to ensuring that an automated process is secure. Multiple methods are available for process hardening, such as strengthening passwords, installing firewalls, ensuring protection against malware and Trojans, using encryption, etc. Encryption, for instance, protects the company from external malicious attacks; high-level encryption protocols protect the management details of the credential vault.
It is worth remembering that robotic process automation renders business operations less hazardous overall. RPA actually lowers the security-related effort of training employees in security practices (e.g. password management, application of privacy settings) because it ensures a zero-touch environment. By eliminating manual work, automation minimises security risks at a macro level.
For the sake of security, RPA should be wisely implemented. ‘Wise implementation’ essentially means choosing a stable RPA product or provider, backed by proper, constant monitoring of security measures. Role-based access to confidential data and data encryption are the most salient means of dealing with security risks.
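To make the role-based access idea concrete, here is a minimal sketch of gating credential reads by role. The in-memory vault, role names and credential names are invented for the example; real RPA platforms provide a hardened, encrypted credential vault.

```python
# Hypothetical role -> allowed-credentials mapping.
ROLE_PERMISSIONS = {
    "bot_runner": {"erp_login"},
    "admin": {"erp_login", "vault_master"},
}

# Toy in-memory vault; a real vault would keep these encrypted at rest.
VAULT = {"erp_login": "s3cret", "vault_master": "m4ster"}

def fetch_credential(role: str, name: str) -> str:
    """Return a credential only if the role is permitted to read it."""
    if name not in ROLE_PERMISSIONS.get(role, set()):
        raise PermissionError(f"role {role!r} may not read {name!r}")
    return VAULT[name]

print(fetch_credential("bot_runner", "erp_login"))  # allowed
# fetch_credential("bot_runner", "vault_master") would raise PermissionError
```

The point of the sketch is the separation: the robot’s runtime role can fetch only the credentials its process needs, nothing more.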
A crucial gain of RPA is that software implementation doesn’t require programming skills. Once a program has been set up by a technically proficient person, the robot only needs instructions about which applications it should use and precisely how to use them. By means of a Graphical User Interface (GUI, which is provided by the RPA vendor), anyone should be able to program the bot in this sense.
According to Daniel Pullen, CiGen’s General Manager, “an RPA tool is very much designed to allow non-coders and non-programmers to feel like it’s accessible, it’s a tool they can use.”
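One way to picture what a non-programmer assembles in such a GUI is a declarative list of steps that a runtime then dispatches to handlers. The step names and workflow below are invented for illustration; they are not any vendor’s actual format.

```python
# A recorded workflow as plain data: no code written by the end user.
workflow = [
    {"action": "open_app", "target": "invoice_system"},
    {"action": "type_text", "target": "amount_field", "value": "120.50"},
    {"action": "click", "target": "submit_button"},
]

def run(workflow, handlers):
    """Dispatch each declarative step to its handler; return the trace."""
    trace = []
    for step in workflow:
        handlers[step["action"]](step)
        trace.append(step["action"])
    return trace

# Stub handlers that do nothing; a real robot would drive the UI here.
handlers = {a: (lambda step: None) for a in ("open_app", "type_text", "click")}
print(run(workflow, handlers))  # ['open_app', 'type_text', 'click']
```

The GUI’s job is to build the `workflow` data structure visually; the “programming” the user does is configuration, not code.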
“Computational Intelligence” (CI) is used as an umbrella term, the computer science community lacking consensus on a commonly applicable definition. The term was coined by Bezdek (1994), who used it as an attribute of systems that approximate human performance in processes like pattern recognition, signal processing, classification and regression, but that deal only with low-level data (e.g., numerical data). As opposed to AI, which was concerned with hard computing, CI is underpinned by soft computing methods. That is, its focus is on inexact solutions to computationally intractable or ill-defined problems.
These aspects still pertain to the core features of CI. Duch (2007) defines Computational Intelligence as “a branch of computer science studying problems for which there are no effective computational algorithms”. CI methods aim to mimic human outcomes of “low-level cognitive functions: perception, object recognition, signal analysis, discovery of structures in data, simple associations and control”. More complex tasks, like sequence learning, reinforcement learning, machine learning or distributed multi-agent systems, require use of methods approximating both low- and high-level cognitive functions.
Andrew Ng of Stanford University defines Machine Learning (ML) as “the science of getting computers to act without being explicitly programmed.” ML algorithms support data parsing and learning by generalisation, without having to rely on rules-based programming. Because of this, it is widely believed to provide an optimal path towards human-level Artificial Intelligence.
ML methods support both supervised learning (e.g., neural networks, kernel methods) and unsupervised learning (e.g., clustering); deep learning is used in both settings. Learning algorithms have numerous applications, like speech recognition, effective web search via text understanding, construction of smart robots (with control and perception capabilities), computer vision, etc.
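To make “learning by generalisation” concrete, here is a toy supervised example in plain Python: a nearest-centroid classifier induces its decision rule from labelled examples rather than from hand-written rules. The labels and data points are invented for the sketch.

```python
def centroid(points):
    """Mean of a list of feature vectors, coordinate by coordinate."""
    return [sum(coord) / len(points) for coord in zip(*points)]

def fit(examples):
    """examples: {label: [feature vectors]} -> {label: centroid}."""
    return {label: centroid(vecs) for label, vecs in examples.items()}

def predict(model, x):
    """Assign x to the label whose centroid is nearest (squared distance)."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    return min(model, key=lambda label: dist(model[label], x))

# The "training data" stands in for parsed examples; no rule was coded.
model = fit({
    "small": [[1.0, 1.2], [0.8, 1.0]],
    "large": [[9.0, 8.5], [10.0, 9.0]],
})
print(predict(model, [1.1, 0.9]))  # small
```

The decision boundary is never written down explicitly; it falls out of the examples, which is the contrast with rules-based programming drawn above.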
Natural Language Processing (NLP) is an interdisciplinary branch of AI, at the intersection of computational linguistics and computer science, concerned with enabling computers to interpret (i.e., understand) and produce natural language indistinguishable from that of humans.
Its function is, by and large, that of a mediator between human verbal communication and computers. Speech recognition, natural language understanding and natural language generation are the main areas of specialisation in NLP. Supervised and unsupervised learning (in particular deep learning) are the most common techniques used for natural language modeling, which is a necessary condition for further language processing.
Text-to-speech, speech recognition, discourse processing (analysis, summarisation), natural language understanding and generation, parsing, terminology extraction, are just some among the numerous NLP techniques available. These techniques have a very large application area in today’s business, e.g. text analytics, which is more and more widely used to extract trends from customer feedback, thereby improving customer service.
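As a minimal sketch of the text-analytics use case just mentioned, the snippet below extracts the most frequent terms from customer feedback with nothing more than a stop-word list and a counter. Real pipelines add proper tokenisation, lemmatisation and sentiment models; the feedback strings are invented for the example.

```python
import re
from collections import Counter

# A deliberately tiny stop-word list for the illustration.
STOP_WORDS = {"the", "a", "is", "was", "and", "to", "my", "very", "but"}

def top_terms(feedback, n=3):
    """Return the n most frequent non-stop-word terms across all texts."""
    words = re.findall(r"[a-z']+", " ".join(feedback).lower())
    return Counter(w for w in words if w not in STOP_WORDS).most_common(n)

reviews = [
    "The delivery was slow and the packaging was damaged",
    "Slow delivery again, but support was helpful",
]
print(top_terms(reviews, 2))  # [('delivery', 2), ('slow', 2)]
```

Even this crude frequency count surfaces a trend (“slow delivery”) of exactly the kind customer-service teams mine feedback for.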
Unstructured data is information whose format and/or manner of organisation does not fit a typical relational database. It helps to contrast it with the structured transaction data in financial systems or other business applications: structured data has a rigid, uniform format, thereby allowing consistent search, processing, and analysis.
Unstructured data can be textual or non-textual. Emails are the typical case of textual unstructured data. Although the subject, date, sender and recipient details, etc. have specific formats, the email body is unstructured. Overall, the email format is not uniform.
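The email example can be made concrete with Python’s standard-library `email` module: the headers parse into named, structured fields, while the body remains free text that would need NLP-style processing. The message content is invented for the example.

```python
from email import message_from_string

raw = """\
From: customer@example.com
To: support@example.com
Subject: Invoice query

Hi, I think invoice 1042 was charged twice last month.
Could you check and refund the difference? Thanks!
"""

msg = message_from_string(raw)
print(msg["Subject"])     # structured header field: 'Invoice query'
body = msg.get_payload()  # unstructured free text
print("1042" in body)     # True, but only found by searching raw text
```

The invoice number sits in the unstructured body, so retrieving it requires text search or extraction, unlike the subject line, which is directly addressable by field name.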
The analysis of unstructured data, like log files from applications, servers, and websites, or sensor data, is becoming more and more of a necessity in the business environment. Customer analytics, sentiment analysis, predictive maintenance, etc. are methods drawing on NLP that help identify trends in customer behaviour, capacity limitations and performance bottlenecks, or deal more effectively with regulatory compliance.
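For the log-file case, a first analysis step is often turning loosely formatted lines into structured records. The log format and lines below are invented for the sketch; real deployments would match their own format.

```python
import re

# Hypothetical log format: "<timestamp> <LEVEL> <message>".
LOG_LINE = re.compile(r"(?P<ts>\S+) (?P<level>[A-Z]+) (?P<msg>.*)")

logs = [
    "2024-05-01T10:00:01 INFO request served in 120ms",
    "2024-05-01T10:00:02 ERROR database timeout",
    "2024-05-01T10:00:03 ERROR database timeout",
]

# From semi-structured text to queryable records.
records = [LOG_LINE.match(line).groupdict() for line in logs]
errors = [r for r in records if r["level"] == "ERROR"]
print(len(errors))  # 2 -> a repeated failure worth investigating
```

Once lines are records, the trend-spotting described above (performance bottlenecks, capacity limits) becomes ordinary filtering and counting.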