technology domain

Computing

Updated Mar 31, 2021

Guilherme Henrique @ Envisioning

All computing is based on the coordinated use of computer devices, called hardware, and the computer programs that drive them, called software; all software applications, in turn, are built from data and process specifications, called data structures and algorithms.

As a rule of thumb, computation is the process by which information is transformed in order to produce a desired result. Such processes are performed by hardware boxes which, if you look closely, are really 'smart rocks': machines built on a rock-and-metal paradigm. Miniaturized, low-cost components were made predictable by Moore's law, according to which the number of components on an integrated circuit doubles every two years. But the law is reaching its limits, pushing for a paradigm shift, whether material or architectural. As for the elemental part of the computer chip, the transistor's term is expiring, giving way to the memristor, a component that acts as a resistor but is capable of remembering the electric charge that has flowed through it, mimicking the memory of biological synapses. This is a major energy-saving breakthrough for computing.
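As a rough numerical illustration of the doubling rule (not a historical dataset), a minimal Python sketch, assuming a hypothetical 1971 baseline of 2,300 components and a perfectly smooth doubling every two years:

```python
# Illustrative only: project component counts under Moore's law,
# assuming a hypothetical baseline of 2,300 components in 1971
# and a perfectly regular doubling every two years.
def moores_law_count(year, base_year=1971, base_count=2_300):
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1991, 2011, 2021):
    print(year, f"{moores_law_count(year):,.0f} components")
```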

General Purpose Technology

As all institutions and relations of production rely on the support of a general-purpose computational infrastructure, the implications for energy consumption are not trivial. To get a sense of the scale, the largest data center in the world belongs to China Telecom, a high-tech behemoth that devours 150 MW of power, five times as much as the Fugaku supercomputer, itself an extremely electricity-hungry beast. The answer is smart, energy-efficient computation fueling those racks and the racks of newborn data enclaves, secure networks through which private data can be stored and disseminated. The same goes for synthetic media detection tools such as the Deep Fake Detection Tool, which counter a technology that may leave public trust in tatters. And on and on. The days of conventional computing are numbered, but there are alternatives.

Besides the energy aspect, to avoid the breakdown of our socio-technical system we can, and should, demand optimal computational power beyond the drawbacks of traditional computers. This points to quantum computing, a drastically different physical process that brings the principles of wave-particle duality and entanglement to 'computing as we know it', which runs on the combination of two separate states (ones and zeros), overlapping those states in units called qubits. Without going into engineering details, one cannot encode a qubit of information in a classical transistor, which operates sequentially; the quantum computer moves in the direction of parallelism.
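A minimal numerical sketch (assuming NumPy, and covering only superposition, not entanglement) of how a qubit differs from a classical bit: the state holds two amplitudes at once, and measurement probabilities come from their squared magnitudes.

```python
import numpy as np

# A classical bit is either 0 or 1; a qubit is a normalized vector of two
# complex amplitudes, so both basis states coexist until measurement.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Equal superposition of |0> and |1>.
qubit = (zero + one) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probabilities = np.abs(qubit) ** 2
print(probabilities)  # [0.5 0.5]

# Simulating 1,000 measurements: roughly half yield 0, half yield 1.
samples = np.random.choice([0, 1], size=1_000, p=probabilities)
print(np.bincount(samples))
```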

Not all is rosy, however. Quantum processors and their associated components need to be cooled to temperatures colder than outer space. And they are restricted, so far, to expensive and well-equipped laboratories funded by large enterprises such as Google, which today pursues quantum supremacy with its 53-qubit Sycamore processor.

Cultural Automation?

As anthropology sees it, there are three forms of intelligence: natural, artificial and cultural. The first two have already been automated, but what about the last? Would it be possible to automate knowledge processes? Historically, automation never replaced human cognition. But what if you bring more computing power into the equation? Hyperautomation, also known as cognitive automation, is a branch of AI (artificial intelligence) that warrants close attention. Confusion abounds around the area, as it is constantly associated with Robotic Process Automation (RPA), a solution built around intelligent software robots.

But what matters most in this nascent technological concept is how it can be pre-trained to automate not only specific business processes, but also processes of acquiring knowledge and understanding. If supplied with enough computational power, it could draw meaningful conclusions from unstructured data. This approach could be used, for example, in cultural analytics, a method for researching many types of visual media, such as comics, webcomics, video games, virtual worlds, video, films, cartoons, motion graphics, print magazines, paintings, and photographs.

What Lies Ahead

Bio-inspired Computing

When we evoke computing, what comes to mind immediately are digital computers, whose small calculation inaccuracies are forgiven because of their incredible speed. The problem is that they are unable to solve many problems in polynomial time, that is, with a viable operational cost and a feasible resolution time. Instead of neural machines such as Ising machines, scientists today turn to natural computing, building analog machines, and the most notable are, without a doubt, the 'electronic amoebas'. They are revolutionary in computational terms because the average time for a combination of live amoebas and electronic capacitors to find solutions to complex problems grows linearly, not exponentially as is typical of digital machines. Amoeba-inspired analog electronic computing systems can easily solve the many-city traveling salesman problem, that is, finding the shortest possible route that passes through a set of stopping points, from start to finish, without repeating any. This will have significant implications for several important areas, such as finance, pharmaceuticals, logistics and machine learning.
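To make the scaling argument concrete, here is a small brute-force Python sketch of the conventional digital approach on a made-up five-stop instance; the number of candidate routes it must examine grows factorially with the number of stops, which is precisely the cost the amoeba-based analog approach is reported to sidestep.

```python
from itertools import permutations
import math
import random

# Toy traveling salesman instance: random symmetric distances between 5 stops.
random.seed(0)
n = 5
dist = [[0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        dist[i][j] = dist[j][i] = random.randint(1, 20)

def route_length(route):
    return sum(dist[a][b] for a, b in zip(route, route[1:]))

# Exhaustive search over all orderings of the intermediate stops:
# the number of candidate routes grows factorially with n.
start, end = 0, n - 1
middle = range(1, n - 1)
best = min(((start, *perm, end) for perm in permutations(middle)), key=route_length)
print("best route:", best, "length:", route_length(best))
print("routes examined:", math.factorial(n - 2))
```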

Brain-inspired Computing

Still on the slippery ground of bio-computing, what would computing be like without computers? Neuromorphic computing and spiking neural networks (SNNs) come close to this next level, yet there are major differences between deep networks and brains. A toddler needs to see only one or two cats to recognize cats for the rest of his or her life, while a deep network needs to 'see' a million pictures of the animal to hopefully recognize it. 'The future of computing is polymorphic and elusive', declares Andrew Adamatzky in an editorial on what computation will look like in 2065. Asked if he could be less ambiguous, the British computer scientist replied without hesitation: 'with regard to what computers will do in the future, there will be no computers. Our brain is a powerful enough computer to do all necessary calculations'. This calls for a wait-and-see attitude. Time will tell.
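For contrast with deep networks, the sketch below simulates a single leaky integrate-and-fire neuron, the kind of unit spiking neural networks are built from; the parameter values are arbitrary and purely illustrative.

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential leaks
# toward rest, integrates input current, and emits a discrete spike
# whenever it crosses a threshold.
dt, tau, v_rest, v_thresh, v_reset = 1.0, 20.0, 0.0, 1.0, 0.0

def simulate(current, steps=100):
    v, spikes = v_rest, []
    for t in range(steps):
        v += dt / tau * (-(v - v_rest) + current)  # leak + integrate
        if v >= v_thresh:                          # fire and reset
            spikes.append(t)
            v = v_reset
    return spikes

print(simulate(current=1.5))  # spike times for a constant input current
```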


Methods

method
Maximum Likelihood Classification (MLC)

A method of image analysis and classification based on an algorithm that assigns each pixel in a raster to the class it most likely belongs to. This method is widely used in remote sensing. When satellites collect information from Earth, the data is organized in a raster, a matrix of cells with rows and columns containing values representing information. For instance, land-use and soil data can be crossed with continuous data such as temperature, elevation, or spectral data. After performing a maximum likelihood classification on a set of raster bands, a classified raster is created as output, based on the training data. This helps to improve multi-layer remote sensing imaging and supports spatial analysis.
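A rough Python sketch of the idea (not the code of any particular GIS package), assuming each class follows a multivariate Gaussian over the raster bands; the band values, class names, and training samples are made up.

```python
import numpy as np

# Each class is modeled as a multivariate Gaussian over the bands, and every
# pixel is assigned to the class under which its band values are most likely.
rng = np.random.default_rng(0)

# Hypothetical training samples for two classes over 3 spectral bands.
training = {
    "water":  rng.normal([0.1, 0.2, 0.6], 0.05, size=(50, 3)),
    "forest": rng.normal([0.3, 0.5, 0.2], 0.05, size=(50, 3)),
}

# Fit per-class mean vectors and covariance matrices from the training data.
stats = {c: (s.mean(axis=0), np.cov(s, rowvar=False)) for c, s in training.items()}

def log_likelihood(x, mean, cov):
    d = x - mean
    return -0.5 * (np.log(np.linalg.det(cov)) + d @ np.linalg.solve(cov, d))

def classify(pixel):
    return max(stats, key=lambda c: log_likelihood(pixel, *stats[c]))

print(classify(np.array([0.12, 0.22, 0.55])))  # -> 'water'
```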

method
Satellite Image Processing (SIP)

A data processing method that efficiently processes a large collection of satellite images to produce actionable information for decision-making. By consolidating spatial, temporal, spectral, and radiometric resolution data gathered from open-access satellites, this method aims to support land-use management in a timely and accurate manner. This is done by applying sequential image processing, extracting meaningful statistical information from agricultural fields, and storing it in a crop spectrotemporal signature library.
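A minimal sketch of the per-field statistics step, with synthetic placeholder rasters and a hypothetical field mask; a real pipeline would read actual satellite scenes and field boundaries.

```python
import numpy as np

# Build a toy "spectrotemporal signature": for each field, summarize a band
# (e.g., near-infrared reflectance) per acquisition date.
dates = ["2021-03-01", "2021-04-01", "2021-05-01"]
rng = np.random.default_rng(1)
scene_stack = rng.random((len(dates), 100, 100))   # one 100x100 band per date
field_mask = np.zeros((100, 100), dtype=bool)
field_mask[20:40, 30:60] = True                    # pixels belonging to field A

signature_library = {
    "field_A": {date: float(scene[field_mask].mean())
                for date, scene in zip(dates, scene_stack)}
}
print(signature_library["field_A"])
```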

method
Building Information Modeling (BIM)

A 3D model-based process, mounted on AI-assisted software, that enables simulations of physical structures. The 3D model enables the design, simulation, and operation of what-if scenarios through virtual representations. Building Information Modelling depends on the physical and functional characteristics of real assets to provide nearly real-time simulations. The data in the model defines the design elements and establishes behaviors and relationships between model components, so every time an element is changed, the visualization is updated. This allows testing different variables such as lighting, energy consumption, and structural integrity, and simulating changes to the design before it is actually built. It also supports greater cost predictability, reduces errors, improves timelines, and gives a better understanding of future operations and maintenance.

method
Yield Mapping

One of the most used ways to define, quantify, and characterize within-field variability in crop production. It is accomplished by combining geospatial data with real-time information from yield monitors mounted on combine harvesters. Variables such as mass, volume, and moisture are collected, making it possible to observe both spatial and temporal yield variation within a field and helping farmers decide which parts of the area need more water or fertilizer.
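A toy Python sketch of the aggregation step, with made-up monitor readings: GPS-tagged wet mass and moisture values are corrected and binned into grid cells to expose within-field variability.

```python
from collections import defaultdict

# Each reading: (x metres, y metres, wet mass kg, moisture fraction) -- fabricated values.
readings = [
    (12.0, 40.0, 3.1, 0.18), (14.0, 41.0, 2.9, 0.17),
    (55.0, 10.0, 1.2, 0.22), (57.0, 12.0, 1.4, 0.21),
]
cell_size = 10.0  # metres
cells = defaultdict(lambda: {"dry_mass": 0.0, "count": 0})

for x, y, mass, moisture in readings:
    key = (int(x // cell_size), int(y // cell_size))
    cells[key]["dry_mass"] += mass * (1 - moisture)  # correct mass for moisture
    cells[key]["count"] += 1

for key, c in sorted(cells.items()):
    print(key, round(c["dry_mass"], 2), "kg dry grain from", c["count"], "readings")
```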

method
Spatial Optimization

This computational and mathematical method uses geographic data to support decision-making processes in diverse geographic fields, including natural resource management, land-use science, political geography, transportation, retailing, and medical geography. Spatial optimization aims to maximize or minimize an objective linked to a problem of a geographic nature, such as location-allocation modeling, route selection, spatial sampling, and land-use allocation, among others. A spatial optimization problem is composed of decision variables, objective functions, and constraints. Geographic decision variables may be spatial or aspatial. The topological and non-topological relations between decision variables, such as distance, adjacency, patterns, shape, overlap, containment, and space partition, are reflected in the objective functions.
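A small illustrative example of one such problem, a brute-force p-median (location-allocation) instance with made-up coordinates: the decision variables are which candidate sites to open, the objective function sums nearest-facility distances, and the constraint is that exactly p sites are opened.

```python
from itertools import combinations

demand = [(1, 1), (2, 4), (8, 2), (9, 8), (4, 7)]   # demand point coordinates
candidates = [(2, 2), (8, 3), (5, 6), (9, 7)]       # candidate facility sites
p = 2                                                # constraint: open exactly p sites

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def total_cost(sites):
    # Objective: total distance from each demand point to its nearest open facility.
    return sum(min(dist(d, s) for s in sites) for d in demand)

best = min(combinations(candidates, p), key=total_cost)
print("open facilities at:", best, "cost:", round(total_cost(best), 2))
```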

method
Spatial Computing

Computing approach that allows machines to make sense of the real environment, translating concrete spatial data into the digital realm in real-time. By using machine learning-powered sensors, cameras, machine vision, GPS, and other elements, it is possible to digitize objects and spaces that connect via the cloud, allowing sensors and motors to react to one another, accurately representing the real world digitally. These capabilities are then combined with high-fidelity spatial mapping allowing a computer “coordinator” to control and track objects' movements and interactions as humans navigate through the digital or physical world. Spatial computing currently allows for seamless augmented, mixed and virtual experiences and is being used in medicine, mobility, logistics, mining, architecture, design, training, and so forth.

method
Information Pooling

A crowdsourcing additive aggregation method that integrates distributed information sent by contributors, such as votes, GPS location, assessments, predictions, and opinions through approaches such as summation, averaging, or visualization. Information pooling is useful for gathering location-based information, evaluating and selecting alternatives, eliciting and validating customer or citizen needs, forecasting, or market research.
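A minimal sketch of additive aggregation over a handful of fabricated contributions, combining summation (votes), averaging (forecasts), and simple location pooling:

```python
from statistics import mean
from collections import Counter

contributions = [
    {"vote": "site_b", "price_estimate": 104.0, "lat": -23.55, "lon": -46.63},
    {"vote": "site_a", "price_estimate": 98.0,  "lat": -23.56, "lon": -46.64},
    {"vote": "site_b", "price_estimate": 110.0, "lat": -23.55, "lon": -46.62},
]

pooled = {
    "votes": Counter(c["vote"] for c in contributions),                   # summation
    "price_forecast": mean(c["price_estimate"] for c in contributions),  # averaging
    "centroid": (mean(c["lat"] for c in contributions),
                 mean(c["lon"] for c in contributions)),                 # location pooling
}
print(pooled)
```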

method
Microtasking

A crowdsourcing approach that breaks a massive project into a series of smaller tasks that only take a few minutes for each contributor to complete, similar to a distributed task-force. This method functions as a scalable effort and time-efficient batch processing of highly repetitive tasks such as categorizing data and translating or correcting texts. Microtasking platforms work with pre-determined, qualitatively equivalent, and homogeneous contributions produced from highly repetitive tasks.

method
Neural Machine Translation

A machine translation method that applies a large artificial neural network to improve the speed and accuracy of probability predictions for a sequence of words, often in the form of sentences. Unlike statistical machine translation, neural machine translation trains its parts on an end-to-end basis, performing the analysis in two stages: encoding and decoding. In the encoding stage, source-language text is fed into the machine and transformed into a series of linguistic vectors. The decoding stage transfers these vectors into the target language.
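A toy encoder-decoder sketch (assuming PyTorch; real systems are far larger, use attention, and are trained on parallel corpora) showing the two stages: the encoder turns source tokens into vectors summarized in a hidden state, and the decoder transfers that state into target-language predictions.

```python
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, hidden=64):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, hidden)
        self.tgt_emb = nn.Embedding(tgt_vocab, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encoding: source tokens become vectors, summarized in the final hidden state.
        _, state = self.encoder(self.src_emb(src_ids))
        # Decoding: that state conditions the generation of target-language tokens.
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)
        return self.out(dec_out)  # logits over the target vocabulary

model = TinySeq2Seq(src_vocab=1000, tgt_vocab=1200)
src = torch.randint(0, 1000, (2, 7))   # a batch of 2 source sentences, 7 tokens each
tgt = torch.randint(0, 1200, (2, 9))   # shifted target sentences, 9 tokens each
print(model(src, tgt).shape)           # torch.Size([2, 9, 1200])
```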

method
3D Modeling

The process of creating a three-dimensional representation of any surface or object (inanimate or living) via specialized software. 3D modeling is achieved manually with specialized 3D production software that allows for creating new objects by manipulating and deforming polygons, edges, and vertices or by scanning real-world subjects into a set of data points used for a digital representation. This process is widely used in various industries like film, animation, and gaming, as well as interior design, architecture, and city planning.
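At its simplest, the underlying data structure is just vertices and faces; the fragment below builds a hypothetical square pyramid and 'deforms' it by transforming vertex coordinates, which is essentially what modeling tools do at much larger scale.

```python
# Minimal polygon-mesh representation: vertices (points in space) and faces
# (triangles referencing vertex indices). Deforming the model means
# transforming vertex coordinates; the face topology stays the same.
vertices = [
    (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0),
    (0.5, 0.5, 1.0),                                   # apex of a square pyramid
]
faces = [(0, 1, 4), (1, 2, 4), (2, 3, 4), (3, 0, 4), (0, 2, 1), (0, 3, 2)]

def scale(verts, factor):
    return [(x * factor, y * factor, z * factor) for x, y, z in verts]

def translate(verts, dx, dy, dz):
    return [(x + dx, y + dy, z + dz) for x, y, z in verts]

deformed = translate(scale(vertices, 2.0), 0.0, 0.0, 1.0)
print(deformed[4])  # the apex after scaling and moving the model
```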

method
Honeypot-based Social Engineering Defense

A cyber-defense mechanism where fake persona decoys are used to entrap and deceive attackers who use social engineering as a strategy for cyberhacking. A honeypot is a computer security tool that imitates existing infrastructure to trap attackers and learn their behavior. For social engineering defense, a so-called “social honeypot” involves creating fake users on social media websites or fake employees in the company database. By acting as decoy users, they try to entrap attackers. Since all communication with the honeypot is entirely unsolicited, there is a high chance that the initial contact is spam or an attack. When the honeypot receives a message, the sender's profile and social details are harvested, and machine learning with statistical analysis is used to classify whether the sender is malicious or benign. The result of this classification is automatically propagated to the devices of all real employees, which then automatically block further communication attempts from the offending party.
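A highly simplified sketch of the classification step (assuming scikit-learn; the features, figures, and labels are fabricated for illustration):

```python
from sklearn.linear_model import LogisticRegression

# Toy features for senders who contacted the honeypot:
# [links in message, account age in days, messages sent in first hour]
X_train = [
    [3, 2, 40], [5, 1, 60], [4, 3, 25],       # known-malicious contacts
    [0, 900, 1], [1, 400, 2], [0, 1200, 1],   # benign contacts (rare for a decoy)
]
y_train = [1, 1, 1, 0, 0, 0]  # 1 = malicious, 0 = benign

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

incoming = [[4, 5, 30]]  # new sender who contacted the fake employee
if model.predict(incoming)[0] == 1:
    print("flag sender and propagate block rule to real employees' devices")
else:
    print("treat sender as benign")
```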

method
Edge Computing

A distributed computing method that processes and analyzes data along a network edge, placing the workload closer to where the data is gathered. With edge computing, data does not need to be sent to the cloud to be processed and sent back to the device. Edge devices can run machine learning algorithms, enabling decision-making without relying entirely on cloud computing. By allowing data to be processed locally, this approach offers low-latency responses while saving bandwidth and improving security, all necessary attributes for Industry 4.0 applications such as factory machines, autonomous vehicles, and overall IoT ecosystems.
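A minimal sketch of the pattern, with a hypothetical vibration threshold for a factory machine: the device analyzes raw readings locally and only forwards a compact summary or an alert.

```python
import statistics

VIBRATION_LIMIT = 0.8  # hypothetical threshold for a factory machine

def process_on_edge(readings):
    # Analyze raw sensor data on the device instead of shipping it all upstream.
    summary = {"mean": statistics.mean(readings), "max": max(readings)}
    if summary["max"] > VIBRATION_LIMIT:
        return {"action": "stop_machine_now", **summary}   # low-latency local decision
    return {"action": "send_summary_to_cloud", **summary}  # bandwidth-friendly upload

print(process_on_edge([0.2, 0.3, 0.25, 0.9]))
```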

method
Natural Language Processing (NLP)

At the intersection of artificial intelligence, computer science, and linguistics, this method makes machines understand human speech in text or audio format by evaluating the meaning and significance of words while completing tasks involving syntax, morphology, semantics, and discourse. By using statistical inference algorithms, the machine can automatically learn rules through the analysis of large sets of documents. Since human speech is not always precise and is often ambiguous, NLP is key to the progress of human-machine interaction applications such as virtual assistants, automatic speech recognition, machine translation, question answering, and automatic text summarization.
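A tiny sketch of the statistical-inference idea, in the spirit of a naive Bayes classifier: word frequencies per intent are learned from a handful of made-up labeled utterances and used to score a new sentence.

```python
from collections import Counter
import math

# Made-up labeled utterances for two intents.
corpus = {
    "weather": ["will it rain today", "what is the weather tomorrow"],
    "music":   ["play some jazz", "play my running playlist"],
}
counts = {intent: Counter(w for s in sents for w in s.split())
          for intent, sents in corpus.items()}

def score(sentence, intent):
    c, total = counts[intent], sum(counts[intent].values())
    vocab = len(set(w for ctr in counts.values() for w in ctr))
    # Log-probability with add-one smoothing for unseen words.
    return sum(math.log((c[w] + 1) / (total + vocab)) for w in sentence.split())

query = "play rain sounds"
print(max(counts, key=lambda intent: score(query, intent)))
```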

method
User-Defined Data Sharing

A method that restricts websites to using data only in ways the user explicitly approves. An intermediary is situated between the user and the web browser, intercepting all connections to ensure that the predefined policies are respected and that the user only shares information with their explicit consent. Using a simple policy language, users define restrictions on how a remote service may process and store sensitive data. If the website meets these policies, the proxy releases the user’s data.
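A minimal sketch of the enforcement logic; the policy fields, site names, and rule syntax below are invented for illustration and do not correspond to any specific policy language.

```python
# The user writes simple policy rules; every outgoing request is checked
# before data is released to the website.
user_policy = {
    "location": {"allow": ["maps.example.com"], "retention_days": 1},
    "email":    {"allow": ["shop.example.com"], "retention_days": 30},
    "contacts": {"allow": [], "retention_days": 0},  # never shared
}

def proxy_release(site, data_type, site_retention_days):
    rule = user_policy.get(data_type, {"allow": [], "retention_days": 0})
    if site in rule["allow"] and site_retention_days <= rule["retention_days"]:
        return "release data"
    return "block request"

print(proxy_release("maps.example.com", "location", site_retention_days=1))  # release
print(proxy_release("ads.example.com", "location", site_retention_days=1))   # block
```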

method
Real-Time Mapping

A mapping method that geographically pinpoints and stores where certain events are occurring in real-time based on crowdsourced data collected from Mobile Crowdsensing Platforms. The crowd-sourced mapping approach is processed through a website or a mobile application that does not require registration to report the event. This method is used to map happenings like weather changes, maintenance problems, accidents, fraud, cyber-attacks, sexual harassment, and other crimes. Also, the crowdsourced data fed into the platform allows processing, modeling, evaluation, and, finally, prediction of events in specific areas, thus supporting decision-making processes for citizens, city planners, and policymakers.
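A small sketch of the aggregation behind such a map, with fabricated reports: each report carries a category and coordinates, and reports are binned into grid cells so clusters can be displayed or fed into predictive models.

```python
from collections import Counter

reports = [
    {"type": "flooding", "lat": -23.561, "lon": -46.655},
    {"type": "flooding", "lat": -23.562, "lon": -46.654},
    {"type": "pothole",  "lat": -23.530, "lon": -46.620},
]

def cell(lat, lon, size=0.01):
    # Snap coordinates to a coarse grid cell (roughly 1 km at these latitudes).
    return (round(lat / size) * size, round(lon / size) * size)

heatmap = Counter((r["type"], cell(r["lat"], r["lon"])) for r in reports)
for (kind, c), n in heatmap.items():
    print(kind, "at cell", c, "-", n, "report(s)")
```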

Future Applications

Associated technology applications with TRL lower than or equal to 6. Future applications cover concept, proof-of-concept, validation, and prototype stages and are less technically developed compared to Current Applications.