AI Helps Predict ‘Cytokine Storms’ in COVID-19 Patients

COVID-19 not only has a range of symptoms, but also varying impact on patients. In its early stages, doctors are unable to predict the progression of the disease because it depends on the interaction between the viral infection, the patient’s response, and the development of cardiovascular inflammation. To address that problem, a cross-disciplinary program spearheaded by EPFL seeks to empower both patients and doctors with an assistive, predictive, and personalized healthcare tool called Digipredict.

Digipredict is a pan-European research program involving universities, hospitals, and startups. It aims to develop a digital twin that can detect serious complications in COVID-19 patients by using breakthrough technology in the fields of artificial intelligence, smart patches, and organs-on-chips. Playing a key role in the project are EcoCloud faculty members David Atienza and Martin Jaggi, who also head the Embedded Systems Laboratory and the Machine Learning and Optimization Laboratory respectively. The two other EPFL laboratories involved in the project are the Nanoelectronic Devices Laboratory and the Laboratory of Movement Analysis and Measurement.

Digipredict detects the first signs of a ‘cytokine storm’ in high-risk COVID-19 patients, thus allowing doctors to act before it causes serious damage to the cardiovascular system. Cytokines are proteins that play an important role in normal immune responses, but having a large amount of them released in the body all at once can be harmful.

Digipredict predicts disease progression by using a smart patch with integrated technology for collecting crucial medical data such as blood oxygen levels, breathing rate, and body temperature. Nanosensors linked to an AI program track specific biomarkers that indicate any possibility of a cytokine storm. The data collected and analyzed by Digipredict allow doctors to make informed and timely decisions about the course of treatment.

Apart from EPFL, other institutions involved in the project are the University of Twente, ETH Zurich, IMEC in Belgium, Stichting Imec in the Netherlands, the Charité university hospital in Berlin, the University of Bern (through Inselspital), and three startups (Ascilion, EPOS-IASIS, and SCIPROM). The Center for Intelligent Systems (CIS) will be responsible for promoting Digipredict and disseminating its findings.



Read More

Pascal Frossard Named as Full Professor (STI)

The Board of the Swiss Federal Institutes of Technology has announced the appointment of Pascal Frossard as Full Professor of Electrical Engineering and Electronics in the School of Engineering (STI). Currently an Associate Professor at EPFL, he joined the EcoCloud faculty in 2018 to help the research centre drive its cloud computing programs.

Before joining EPFL in 2003, Professor Frossard was a member of the research staff at the IBM TJ Watson Research Center at Yorktown Heights, NY, USA. His core research areas include interpretable machine learning, data science, graph signal processing, image representation and analysis, computer vision and immersive communication systems. His research contributions include the analysis of geometric properties of deep networks, deep nets robustness analysis, and representation learning for graph signals.

Professor Frossard has won international acclaim for his expertise in signal and image processing and its applications in intelligent systems and biomedicine. He is known for an innovative research approach that combines different disciplines in the natural and engineering sciences and facilitates essential academic and industrial partnerships. Over the years, he has won many distinctions, including the Swiss National Science Foundation Professorship Award in 2003, an IBM Faculty Award in 2005, the IBM Exploratory Stream Analytics Innovation Award in 2008, a Google Faculty Award in 2017, the IEEE Transactions on Multimedia Best Paper Award in 2011, and the IEEE Signal Processing Magazine Best Paper Award in 2016. He is a Fellow of the IEEE, and his work has been widely published in leading journals. In his most recent publication,* the authors introduce a representation learning algorithm for graphs, which simultaneously learns a low-dimensional space and coordinates for the nodes in that space.

Professor Frossard’s appointment as Full Professor will undoubtedly strengthen EPFL’s key strategic research areas.

* Simou, Effrosyni, Dorina Thanou, and Pascal Frossard (2020). “node2coords: Graph Representation Learning with Wasserstein Barycenters.” IEEE Transactions on Signal and Information Processing over Networks. doi: 10.1109/TSIPN.2020.3041940.



Read More

New “Sight and Sound” Algorithms to Identify COVID-19 Patterns

About a year ago, when the novel coronavirus broke out, medical science failed not only to arrest its spread but also to properly identify the developmental stages of the disease. Many casualties resulted because the progression of the disease was an enigma. In the latter part of the year, there were nascent attempts to harness AI for COVID-19 diagnosis, treatment, and monitoring. A giant step in that direction has recently been taken by researchers at EPFL; they have developed algorithms that can practically see and hear COVID in a patient’s lungs.

The new deep learning algorithms DeepChest and DeepBreath have been developed by the team of Dr. Mary-Anne Hartley at EPFL’s intelligent Global Health group (iGH), based in the Machine Learning and Optimization Laboratory of Professor Martin Jaggi. DeepChest uses lung ultrasound images, while DeepBreath utilizes breath sounds from a digital stethoscope. The algorithms can accurately diagnose the novel coronavirus in patients and predict how severely they will be affected.

Two nearby Swiss university hospitals are involved in developing the algorithms. At HUG, the Geneva University Hospitals, Professor Alain Gervaix has been collecting breath sounds since 2017 to develop an intelligent digital stethoscope called the “Pneumoscope” to diagnose pneumonia. The recordings have now helped develop the DeepBreath algorithm at EPFL. Initial trials show that DeepBreath can detect even asymptomatic COVID by identifying changes in lung tissue before the patient becomes aware of them.

The clinical aspects of the DeepChest project are being conducted at CHUV, Lausanne’s University Hospital. Thousands of lung ultrasound images are being collected from patients admitted to the Emergency Department with COVID-19 symptoms. Although the image collection process started last year, it has since focused on COVID-19.

The algorithms are available on the EPFL website, but they are very much a work in progress. Efforts are underway to refine and validate them further by enlisting coders from around the world, including through a year-long hackathon called ‘CODED-19’ organized by the EPFL community. As Professor Jaggi explains, “This AI is helping us to better understand complex patterns in these fundamental clinical exams. So far, results are highly promising.” At a later stage, iGH plans to develop a mobile application for the deep learning algorithms and make them available far and wide.

While the current effort is to specifically meet the COVID-19 challenge, its preliminary success has amply demonstrated how large-scale AI research can be used to remove some of the roadblocks for medical science.


Read More

EPFL Scientists Spearhead Deep Learning Efforts to Clear Space Debris


The space around the Earth is home to abandoned satellites, rocket boosters, and other bits of space debris that have accumulated over the years. It is estimated that there are tens of thousands of pieces of space debris in low Earth orbit (LEO). At least 34,000 orbiting pieces of junk are larger than 10 cm each, which makes them a major threat because they are orbiting the Earth at very high speeds. There were two recent near collisions that could have been disastrous: in September, the ISS was forced to conduct a maneuver to avert a possible collision with an unknown piece of space debris, and in October there was a high chance of a Chinese ChangZheng-4c rocket crashing into an old Soviet Parus navigation satellite.

With numerous missions being conducted by several space organizations, space must be made a safer place for all future explorations. Toward that objective, technologies are being developed to capture and deorbit space debris. One such pioneering mission is ClearSpace, a spin-off from the EPFL Space Center (eSpace). The project’s first mission, ClearSpace-1, scheduled for 2025, is to recover the now-obsolete Vespa Upper Part, a payload adapter orbiting 660 kilometers above the Earth that was once part of the European Space Agency’s Vega rocket.

At least three EPFL laboratories are working in tandem on the project: Computer Vision Laboratory led by Professor Pascal Fua, Realistic Graphics Lab led by Assistant Professor Wenzel Jakob, and Embedded Systems Lab spearheaded by Professor David Atienza (also a faculty member at EcoCloud).

ClearSpace-1 aims to extricate the debris by enabling the robotic arms of a capture spacecraft to approach the Vespa from the correct angle, grasp the object, and pull it back into the atmosphere. That calls for the development of deep learning algorithms that estimate the 6D pose (3 rotations and 3 translations) of the target from video sequences. That’s where the Computer Vision Laboratory comes in, with Mathieu Salzmann as the lead scientist working on the project.
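A 6D pose is simply a rigid-body transform: three rotation angles plus a three-component translation. As a minimal illustration (plain Python with made-up numbers, not the project’s code), here is one rotation composed with a translation and applied to a 3D point; the full three rotations would be obtained by multiplying rotation matrices about each axis:

```python
import math

def rotation_z(a):
    """Rotation matrix about the z-axis by angle a (radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def apply_pose(R, t, p):
    """Apply the rigid transform (R, t) to point p: R @ p + t."""
    return [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]

# Hypothetical pose: 90-degree rotation about z, plus a translation.
R = rotation_z(math.pi / 2)
t = [1.0, 0.0, -2.0]

# A point on the x-axis rotates onto the y-axis, then shifts by t.
print(apply_pose(R, t, [1.0, 0.0, 0.0]))
```

Estimating these six numbers from video is, of course, the hard part; the deep networks regress them from images rather than composing them from known angles.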

However, the scientists must allow for physical changes in the Vespa over its seven years of space wanderings. To help Salzmann’s team estimate its current appearance, the Realistic Graphics Lab is generating a database of synthetic images of the target object, which includes a detailed 3D model of the Vespa upper stage.

The nature of the project requires the scientists to ensure completion of the mission in space with limited computing power onboard. The Embedded Systems Lab is leading that segment of the work of transferring the deep learning algorithms to a dedicated hardware platform. As Professor Atienza explains, “Since motion in space is well behaved, the pose estimation algorithms can fill the gaps between recognitions spaced one second apart, alleviating the computational pressure.” However, he adds, “…the algorithms are so complex that their implementation requires squeezing out all the performance from the platform resources.”

Therefore, the key to the success of the mission is to design algorithms that are 100% reliable in space. And therein lies the challenge, as well as the inherent attraction, of the entire project for the scientists at EPFL.




Read More

Anne-Marie Kermarrec Joins EcoCloud

EcoCloud is happy to announce that Anne-Marie Kermarrec, Professor in the School of Computer and Communication Sciences at EPFL, has now come aboard the EcoCloud faculty.

Anne-Marie Kermarrec is a widely acclaimed computer scientist with deep academic and business experience. After completing her PhD thesis at Rennes (France) in 1996, she worked with the computer science group at Vrije Universiteit in Amsterdam for a short while before taking up the position of Assistant Professor at the University of Rennes. From 2000 to 2004, Anne-Marie was part of Microsoft Research in Cambridge. For the following decade, she worked as a Research Director at INRIA in Rennes. In 2015, she founded Mediego, a startup for online personalized predictive services that directly leveraged her research work.

Anne-Marie has won several honors and awards. She received an ERC grant for GOSSPLE (2008–2013) and an ERC Proof of Concept grant for the AllYours recommender (2013). She received the Michel Monpetit Award (2011) and the Innovation Award (2017) from the French Academy of Sciences. She was elected to Academia Europaea in 2013 and named an ACM Fellow in 2016. She has also served on the program committees of many workshops and conferences and chaired the ACM Software System Award committee for a number of years.

Since January 2020, Anne-Marie has made a significant academic contribution as Professor at EPFL. Her research investigates large-scale distributed systems and more precisely P2P systems, epidemic algorithms, distributed infrastructures for machine learning, and privacy-aware personalization systems.

While welcoming Anne-Marie Kermarrec to the EcoCloud faculty, we are sure that her international stature as a seasoned researcher will amplify the center’s activities and its leading academic role in the global arena.


Read More

Hédi Fendri Wins OMEGA Prize Student Award

Each year, a student of the Microengineering Section of EPFL is awarded the prestigious OMEGA Student Award for contributing to scientific and technological advances in the disciplines of Microengineering, Micro- and Nanotechnologies, and Chronometry. This year, the Board of the Foundation of the OMEGA Prize decided to honor Mohammed Hédi Fendri, a Master’s student at the Institute of Microengineering in EPFL’s School of Engineering.

Hédi’s Master’s thesis, “ML-Based Side-Channel Analysis and Disassembly of Hardware Root of Trust,” is based on his findings in industry. His research crossed paths with the Kudelski Group, a Swiss company that has established itself as a global leader in digital security and convergent media solutions for the delivery of digital and interactive content. At Kudelski, Hédi was supervised by Jérôme Perrine and Marco Macchetti, Director of Engineering and Principal Engineer, respectively. Hédi’s academic supervisor was Mirjana Stojilovic, a faculty member at EcoCloud.

In his thesis, Hédi devised a methodology for assessing the side-channel vulnerability of a device at design time and demonstrated its efficiency on a RISC-V CPU design, simulating the execution of AES-128 encryption. The thesis also presents a novel side-channel analysis disassembler based on dictionary learning and sparse coding. Unlike previous works, which targeted only 8-bit microcontrollers running at frequencies up to 16 MHz, the proposed ML-based method can successfully recognize all instructions of a 32-bit RISC-V CPU running at frequencies as high as 100 MHz.
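To give a flavor of how a side-channel disassembler labels instructions from power measurements, here is a toy template-matching sketch (a deliberately simpler approach than the thesis’s dictionary-learning-and-sparse-coding method; all instruction names and trace values are invented):

```python
# Toy side-channel "disassembler": each instruction is assumed to leave a
# characteristic power signature (template); an observed trace segment is
# labelled with the nearest template by squared Euclidean distance.
# Templates and traces below are invented for illustration only.

TEMPLATES = {
    "add":  [0.2, 0.9, 0.4, 0.1],
    "load": [0.8, 0.3, 0.7, 0.6],
    "xor":  [0.1, 0.2, 0.9, 0.3],
}

def classify(segment):
    """Return the instruction whose template best matches the trace segment."""
    def dist(tpl):
        return sum((a - b) ** 2 for a, b in zip(tpl, segment))
    return min(TEMPLATES, key=lambda name: dist(TEMPLATES[name]))

# A noisy observation of an "add" should still be recognised.
print(classify([0.25, 0.85, 0.35, 0.15]))  # add
```

Dictionary learning generalizes this idea: instead of one fixed template per instruction, it learns a basis of signal atoms and represents each trace as a sparse combination of them, which is what makes the method robust at higher clock frequencies.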

Hédi’s findings are both important and timely. With the rise of the IoT, smart home appliances, embedded medical devices, and numerous other connected devices have become an attractive target for adversaries. Hédi’s findings enable a significant reduction in the cost and time necessary to accurately assess the side-channel vulnerability of an IoT device well before its deployment.

Laureates of the OMEGA Student Awards receive 1000 Swiss francs, usually in addition to an OMEGA watch. The award ceremony is held concurrently with the graduation ceremony for Master’s degrees in Microengineering at EPFL. The Board determines the number of awards every year; Hédi is the sole winner in 2020. We congratulate Hédi on receiving this award and wish him every success in his future endeavors.

Read More

Award Ceremony: ICCAD 2020 Ten Year Retrospective Most Influential Paper Award

The ICCAD Executive Committee has recognized “3D-ICE: Fast Compact Transient Thermal Modeling for 3D ICs with Inter-Tier Liquid Cooling” as “the most influential on research and industrial practice in computer-aided design of integrated circuits over the ten years since its original appearance at ICCAD.” The authors of the paper are Arvind Sridhar, Alessandro Vincenzi, Martino Ruggiero, David Atienza (all from the Embedded Systems Laboratory – ESL at EPFL), and Thomas Brunschwiler (IBM Zurich Research Laboratory).



Read More

Fighting Digital Disinformation: Is it a Case of “Too Little Too Late”?

Fake news on social media has been around for a long time, but it came to the fore in the wake of the 2016 US presidential elections. The alleged interference by Russia in driving social media toward the candidature of Donald Trump showed how so-called ‘news’ on Facebook and other social media platforms can be dangerously twisted to malign candidates and influence voters. With the 2020 elections just weeks away, the ugly specter of digital fake news figured prominently in a recent conference held by EPFL’s Center for Digital Trust (C4DT).

Experts on cyber security, fake news, and democracy called for an awakening among citizens and governments so that urgent heed is paid to this growing menace, which threatens the very essence of democracy. Dr. Rebekah Overdorf, a Postdoctoral Research Associate at EPFL’s Distributed Information Systems Laboratory, raised concerns about the manipulation of social media by vested interests. However, she emphasized that the problem is not US- or West-centric; rather, it is a global phenomenon. This is borne out by her ongoing research on cyber disinformation and election interference in Kyrgyzstan, sometimes referred to as ‘the island of democracy.’ Following elections on October 4, there was a public outcry against widespread manipulation and voter suppression, which ultimately forced the prime minister to step down. Collaborating with local journalists, students, and activists, Overdorf identified three methods of information manipulation: creating noise through fake social media accounts to drown out opposition, spreading disinformation through fake news to destroy trust, and performing malicious personal attacks.

Karl Aberer, faculty member of EcoCloud and Full Professor at the Distributed Information Systems Laboratory, went a step further by addressing the root of the problem: insufficient quality information and a surfeit of free, but often manipulated, news on social media. Digital literacy is the need of the hour, but time may be running out. Drawing a parallel with climate change, Professor Aberer said that we might soon be reaching a “point of no return” when democracies might be beyond redemption.

These alarming thoughts were echoed by Steve El-Sharawy, head of Insights at EzyInsights. He said that global manipulation is already beyond control. Although Facebook has deleted billions of fake accounts, it could only be the proverbial tip of the iceberg with many fake accounts still propagating misinformation on social media.

The conference, “Manipulating elections in cyberspace: Are democracies in danger?”, was a joint C4DT–CyberPeace Institute–CTEI event and can be viewed at https://youtu.be/PVbwfO-qri4

Read More

Celebrating the Woman Researcher: Anastasia Ailamaki Receives VLDB Women in Database Research Award 2020

In a major recognition of Swiss innovation and excellence in database research, the VLDB Endowment has conferred the prestigious VLDB Women in Database Research Award on Anastasia Ailamaki, EPFL professor and co-founder of Raw Labs. In announcing the award, VLDB acknowledged Ailamaki’s “pioneering research on the interaction between hardware micro-architecture and database engine performance.”

Even as a budding researcher during her undergraduate days at a computer engineering school in Greece, Anastasia Ailamaki inspired gender diversity in academia. She was one of only 9 female students in a class of 154. Although she never encountered gender discrimination in those days, she admits that it was difficult for her to find a job as a system developer or network manager. That turned out to be a blessing in disguise because it encouraged her to pursue her PhD. Today, she is a leading figure both as head of EPFL’s Data Intensive Applications and Systems Laboratory (DIAS) and for her excellence in database research. However, she calls on women researchers not to let thoughts of gender predominate because it’s not about “women or men in science, but just great scientists.”

At EPFL, Ailamaki has worked on data-intensive systems and applications, particularly the interaction between database software and emerging hardware and I/O devices, and automation of data management to support computationally-demanding, data-intensive scientific applications. She is currently working on developing real-time analytics infrastructures (or real-time intelligent systems) that incorporate change as a core premise.

Reacting to the VLDB award, Ailamaki said that her students are her biggest achievement:

“It’s always humbling and fabulous to win awards and I have been fortunate to have my work recognized often, but I feel that my biggest achievement, and most important contribution to my research field, are my students and post-docs. I enjoy working with young people very much and I learn from them as much as I hope that they learn from me.”

For Ailamaki, life is often an extension of her computer science approach, and she applies her systematic computational and algorithmic thinking processes to make decisions in her everyday life. She believes that “when you make a decision with all the data at hand, there are no regrets.”

September had yet another achievement in store for Ailamaki; she was elected as a member of Academia Europaea, an association that aims to advance excellence in science and research for public benefit and education.

Professor Ailamaki’s achievement is yet another distinguished addition to EPFL’s long list of impressive accolades.


Read More

Data Centers Need to Consider their Carbon Footprint

Digital technology is running up against its physical limits. One solution is to build more data centers – but that needs to go hand in hand with a reduction in their carbon footprint.

For reasons you can imagine, much of what we used to do in the physical world is now being done virtually. That’s having an effect on energy-related pollution – CO2 emissions from manufacturing and transportation have fallen drastically, for example. But there has been a concomitant increase in energy use for digital services. Exact numbers aren’t yet available, but according to Babak Falsafi, the director of EPFL’s EcoCloud center, the trend is clear. “Behind every digital service we use lies a data center. And we’re heading towards a world where everything is done digitally, across all markets and industries.”

Falsafi continues: “A lot of business activities have been shifted online because of the pandemic, causing a huge surge in demand, mainly for video. Non-work-related demand for streaming has also exploded. What’s more, today’s ultra-high-resolution screens use up a lot of energy. People don’t understand everything that’s involved in watching a movie in 8k – a lot of power is needed for all that data processing and transfer. You put that all together and it’s huge!”

Relentless rise in demand

The current situation is set to last for a while longer: it’ll be weeks, or probably months, before a vaccine is ready – and that’s without factoring in a second wave. Many organizations, including schools and universities, have announced that they will keep holding classes online, at least partly. But the issue of data-center-related emissions was already a pressing one before the pandemic. “New technology like the internet of things, artificial intelligence, 5G and 4k televisions – which are now going to 8k – has pumped up demand, and therefore energy use,” says Falsafi. According to an article in MIT Technology Review last year, training a single Transformer artificial intelligence model can generate as much carbon as five American cars emit over their lifetimes. In another example, Netflix announced that its electricity use jumped 84% in 2019 to 451,000 megawatt-hours – enough to power 40,000 homes in the United States for a year.

Some studies predict that digital technology will account for 8% of global electricity use by 2030 – up from 3–5% today – and 4% of CO2 emissions. This includes data centers, those huge buildings that house the servers we use to store, process, analyze and transfer data on the cloud (the biggest data centers already consume hundreds of megawatts). It also includes, in equal measure, the telecommunication systems that transport those data. Consumer electronics and the energy used to build computing facilities also play a role, albeit a smaller one.

The end of Moore’s law

While demand is skyrocketing, supply is bumping up against a ceiling. Moore’s law, which states that the number of transistors on a silicon chip doubles roughly every two years, has pretty much expired. We can’t keep packing more computing power onto chips like we’ve been doing over the past 50 years. The two options currently available are to build new data centers or expand existing ones. The data-center construction market is expected to swell to 57 billion dollars over the next five years.

Who’s responsible for keeping a lid on digital-related emissions? “Nobody!” replies Falsafi. “Nobody is being held accountable for those emissions. We pay telecom operators for our internet connections, for example, but the services we use on the internet – like Google and Facebook – are free. And those service providers aim to collect as much data about us as possible to improve their services. At no point in this arrangement are carbon emissions taken into account, since power use is measured at data centers and not on telecom networks.” Edouard Bugnion, EPFL’s Vice President for Information Systems, adds: “Data centers are basically technological advancement wrapped up in a consumable format. They are the vector by which cyberspace can develop. Google wouldn’t exist without data centers. Neither would much of the research conducted at EPFL.”

Towards more sustainable data centers

Engineers at EPFL’s EcoCloud have been working since 2011 to find a way around the supply ceiling. Their approach involves not only making data centers more efficient, but also reducing their carbon footprint. Nobody cared much about this latter aspect when the centers were first being built – but times have changed. “There are three things that need to be factored into the equation. First, the energy efficiency of data centers, which needs to be improved. Second, the CO2 that’s emitted to run them, which can be reduced by switching to renewable energy. And third, the energy the servers give off in the form of heat – can’t we do more with that heat than open a window and warm up the parking lot?” says Bugnion.

EcoCloud, along with the other members of the Swiss Datacenter Efficiency Association (SDEA), introduced an energy-efficiency certification system in early 2020. Their program – called the SDEA Label (see article) – quantifies how much CO2 per kWh data centers emit, with the goal of encouraging operators to use renewable energy. EcoCloud also scouts opportunities for EPFL labs to work with businesses to develop advanced systems for cooling, energy management, energy storage and power generation – all within a local innovation ecosystem designed to help data center operators shrink their carbon footprint.

New Certification System Encourages Greener Data Centers


EPFL, along with other members of a tech-industry consortium, introduced the world’s first energy-efficiency certification system for data centers in January.

“When we created EcoCloud in 2011,” says director Babak Falsafi, “the goal was to cut data centers’ energy use and CO2 emissions – at the time, IT industry heavyweights cared only about the financial and business aspects. We developed pioneering technology that brought renewable energy into the data center ecosystem.” His research center aims to spur innovation across the ICT sector – from algorithms to infrastructure – to help meet today’s major challenges.

And data centers will play a growing role in those challenges as people rely more and more on digital technology. The amount of power consumed by data centers is set to expand rapidly, and by 2030 could account for 8% of global electricity use.

Making data centers carbon-free

To help keep that electricity use in check, a consortium of Swiss tech-industry organizations created the Swiss Datacenter Efficiency Association (SDEA). The initiative was spearheaded by digitalswitzerland and Hewlett Packard Enterprise (HPE); members include EcoCloud, HPE, Green IT Switzerland, the Lucerne University of Applied Sciences and Arts (HSLU), the Swiss datacenter association (Vigiswiss) and the Swiss telecom industry association (ASUT). The initiative is also being supported by the Swiss Federal Office of Energy (SFOE) through its SwissEnergy program.

In January, the SDEA introduced a green certification system specifically for data centers. The system involves calculating data centers’ carbon footprint based on the energy efficiency of the building and IT equipment, as well as the IT equipment’s power load. “Until now, there was no way to measure data centers’ impact on CO2 emissions,” says Falsafi. “Our certification system is unique because it also factors in the source of the power used and how well heat is recovered. Everything is connected – if a data center uses renewable energy, its performance improves.”
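To see why both efficiency and power source matter, consider a back-of-the-envelope estimate (a simplified formula with hypothetical numbers, not the SDEA’s actual methodology) that combines facility overhead, expressed as PUE, with the grid’s carbon intensity:

```python
def annual_co2_kg(it_load_kw, pue, grid_kg_per_kwh, hours=8760):
    """Rough yearly CO2 estimate: IT load scaled by PUE (facility
    overhead), times the carbon intensity of the electricity used."""
    return it_load_kw * pue * hours * grid_kg_per_kwh

# Same 1 MW IT load: a lean facility on low-carbon power vs. an
# inefficient one on a fossil-heavy grid (all figures hypothetical).
green = annual_co2_kg(1000, pue=1.2, grid_kg_per_kwh=0.03)
dirty = annual_co2_kg(1000, pue=1.8, grid_kg_per_kwh=0.50)
print(f"{green:,.0f} kg vs {dirty:,.0f} kg of CO2 per year")
```

The toy model makes the certification’s point concrete: the same computing load can differ by more than an order of magnitude in emissions depending on efficiency and energy source, which is why the label factors in both.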

The SDEA uses three certification levels (bronze, silver and gold) to encourage data center operators to cut their power consumption. Pilot tests at ten sites in Switzerland show that the SDEA’s “toolkit” effectively takes into account their efforts to shift in full or in part to renewable energy.

Arriving at just the right time

Since computer processors are reaching their maximum physical capacity, the only solution for managing the surging amount of data is to build more data centers. “Our certification system comes at just the right time,” says Falsafi. “We hope that it will encourage data center operators to build facilities that run on renewable energy, and stimulate innovation and investment in this field.”

Servers Designed to Work like Humans


David Atienza believes that when it comes to IT systems, everything that can be done locally, should be. That includes processing data where they are generated – thereby substantially reducing the amount of power required.

EPFL’s Embedded Systems Laboratory (ESL) is studying two major energy-related problems with servers. The first is that they aren’t being used anywhere near their maximum capacity. Actual use is closer to 60%, according to ESL head David Atienza. “Servers are designed for tasks that require a lot of processing power – such as running neural networks – but they’re being used mainly for watching movies or sending pictures via chat,” he says. As a result, they overheat. “It’s like driving a Ferrari at 40 km/h – it would burn up a lot more energy at that speed than a Twingo would,” he adds.

The problem is that even if the servers are the only equipment that overheats in a data center, operators still have to cool the whole center. To help find a solution, Atienza is working on the Compusapien research project, for which he received an ERC Consolidator Grant in 2016. His team found that cooling servers locally can cut a data center’s power requirement by 40%. They worked with engineers from IBM to develop a system where cooling water is used to lower the temperature of individual servers, as opposed to running fans to cool the entire room. With this system, heat is recovered in the cooling water and reused. The water runs through microfluidic channels that are just 50–100 µm high and sandwiched between two layers on a cooling plate. As the water absorbs heat from the servers, it transfers it to microfluidic fuel cells where it’s converted into electricity. The electricity can then be fed back to the servers as power, reducing the amount of power that the data center draws from the grid.

Processing data locally

“The human brain works the same way. Blood carries nutrients to the brain and cools it. It’s just that with servers, the process is a little more complicated!” says Atienza. While a lot of data centers already use cooling water, his is the first system to use microfluidic fuel cells to recover heat and turn it into electricity. The technology – nicknamed “electronic blood” – was tested in a 3D prototype developed in association with IBM, and proved to be feasible from a technical point of view. Now ESL engineers are building a 3D version of the integrated system and plan to develop a full server through a joint project with another company.

The second component of Atienza’s approach is to process data as locally as possible since transmitting them takes up a lot of energy. One example of this approach is a next-generation Nespresso machine that the ESL team developed. Their machine uses an embedded artificial intelligence system to manage maintenance and restocking completely on its own. “More and more applications – especially those for smartphones – operate locally and don’t go through data centers,” says Atienza. “That’s a lot like the human body. Our bodies have lots of tiny modules that can carry out two or three simple functions; the brain gets involved only when important decisions need to be made. That’s a lot more efficient than today’s data centers where everything runs all the time.”

Using Mathematics to Manage Big Data

VLDB Women in Database Research Award 2020

Most organizations run a huge variety of computer software and hardware, which bogs down their IT systems and wastes energy. But Anastasia Ailamaki has found a solution that works by giving all the different system components a common language.

“Sustainability means coming up with solutions for problems as efficiently as possible while using fewer resources,” says Anastasia Ailamaki, a professor at EPFL’s Data Intensive Applications & Systems Laboratory (DIAS) and founder of local startup RAW Labs SA. “And so we can say that our research is directly related to sustainability.” The engineers at DIAS are developing a data management system that makes as much use as possible of an organization’s existing hardware and software – a real challenge given the wide variety of hardware and software out there.

“Hardware that’s turned on but not used is a waste of energy,” says Ailamaki. In the same way that our bodies burn calories even when we’re just sitting there, computers burn up a considerable amount of energy even when they’re idling. “Most computers are used to only 20% of their potential. It’s like if you filled up your fridge with food, let it sit there until it goes bad, and then complained that you don’t have anything to eat,” says Ailamaki. The same holds true for software. She explains: “Today most people use only 10% of the data they store. But before data can be stored, they have to be saved onto a server and standardized – and you generally need to know what you want to do with them afterwards.” She offers this example: “Suppose you have a series of interviews saved in different Word files, along with an Excel file listing all the companies that invest in EPFL startups. Imagine you wanted to search all the files and find the names of people who mention sustainability in their interview and who have invested in an EPFL startup. You couldn’t do it. The data would have to be stored in a database for that kind of query. And because the data are stored in two different formats – text and table – you’d have to standardize them before you could save them in the database.”
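Ailamaki’s interviews-and-spreadsheet example can be made concrete. Below is a minimal Python sketch (all names and sample data are hypothetical) showing why the two formats must first be brought into a common representation before the cross-file query can run:

```python
import csv
import io

# Hypothetical interview transcripts, as they might be extracted from Word files.
interviews = {
    "Alice": "We think sustainability is central to our roadmap.",
    "Bob": "Our focus is mainly on latency and throughput.",
}

# Hypothetical Excel sheet of EPFL-startup investors, flattened to CSV.
investors_csv = "name,startup\nAlice,GreenGrid\nCarol,DataFlow\n"

# Step 1: standardize both sources into one queryable structure (sets).
mentions_sustainability = {
    name for name, text in interviews.items() if "sustainability" in text.lower()
}
investors = {row["name"] for row in csv.DictReader(io.StringIO(investors_csv))}

# Step 2: only now can the cross-source query run – here as a set intersection.
result = sorted(mentions_sustainability & investors)
print(result)  # ['Alice']
```

The point of the sketch is step 1: neither source is queryable against the other until both have been converted into a shared structure, which is exactly the standardization step Ailamaki describes.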

Ordering data à la carte

Ailamaki’s approach involves formatting the data even before this process begins. “Instead of standardizing the data, our system recognizes what kind they are and gives them a mathematical format based on how they will be searched. Then when it comes time to search the data, our program generates the exact code needed to execute the query – one query at a time,” she says. However, one potential drawback to this approach is that it’s significantly slower than conventional methods where data are already saved in a database and searches are done directly. But Ailamaki’s team found a solution for that, too. “Our system uses artificial intelligence and machine learning to remember the kinds of queries performed. It stores all the work done previously – much like a cache that lets programs respond much more quickly to queries of the same dataset,” she says.
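The idea of generating query-specific code and caching it for repeat queries can be illustrated with a toy stand-in for the DIAS approach. This Python sketch (all names are hypothetical, and it greatly simplifies the real system) generates and compiles code the first time a query shape is seen, then serves later queries of the same shape from the cache:

```python
# Toy just-in-time query engine: generate code per query shape, cache compilations.
_cache = {}

def compile_filter(field, op):
    """Generate and compile a filter function for one query shape."""
    key = (field, op)
    if key in _cache:                    # repeat query shape: reuse compiled code
        return _cache[key]
    # "Just-in-time" step: emit exactly the code this query needs, nothing more.
    src = f"def _q(rows, value):\n    return [r for r in rows if r[{field!r}] {op} value]"
    namespace = {}
    exec(src, namespace)
    _cache[key] = namespace["_q"]
    return _cache[key]

rows = [{"name": "Alice", "age": 34}, {"name": "Bob", "age": 29}]
q = compile_filter("age", ">")
print(q(rows, 30))                       # [{'name': 'Alice', 'age': 34}]
print(compile_filter("age", ">") is q)   # True: second call hits the cache
```

The first call pays the cost of generating and compiling the query code; the second call with the same shape returns the already-compiled function, which mirrors the cache-like speedup Ailamaki describes for repeat queries over the same dataset.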

The system, with its just-in-time approach, allows users to search any type of data in any way, combine data from any source and create a cache of the most frequently used data. It’s similar to ordering à la carte. And the system can be used with any kind of data, hardware or application since it doesn’t rely on code, but mathematics. RAW Labs used this approach to develop its RAW technology for combining different types of data on the fly and generating important information in a ready-to-use format for both businesses and consumers.

When it comes to hardware, the DIAS engineers are taking the same approach. Their just-in-time code-generation method can develop a program for mapping out hardware’s properties on the fly and helping organizations run their IT equipment more efficiently. “With our system, organizations can use their computers at up to 80% of their potential,” says Ailamaki.


EPFL gets a new data center

EPFL is building a new data center as part of the upgrade to its thermal power station. The high-density center, which will eventually have 3 megawatts of capacity, will be used to store, manage and process data collected by EPFL scientists during their lab experiments. Its sides and roof will be covered entirely in solar panels, and the heat generated by its servers will be recovered and used in the new power station. It’s scheduled to go into service in the second half of 2021.

Source: EPFL