The team of more than 30 Omdena AI engineers collaborated with Norwegian cleantech startup Glint Solar to use AI to augment their solar site assessment tool for floating solar panels.
The project goal was to apply remote sensing techniques to satellite imagery to infer the depth of inland water bodies. This information can be added to Glint Solar's solar site assessment tool to identify suitable sites and accelerate the green energy revolution.
As global warming continues, ways to counter it are urgently needed. One such method is to install solar panels and harness the energy of the sun.
Due to the limited availability of land for solar installations, a growing number of installers are turning to inland water bodies and deploying floating panels on the water surface. Promising locations include drinking water reservoirs, water treatment facilities, and hydropower reservoirs.
Bathymetry (depth mapping) and vertical water level variation over time are essential to evaluate when choosing the best locations for solar installations, as these have a significant impact on the number of panels that can be installed in a given area, as well as the overall cost. Bathymetry can be derived using multispectral satellite images, but all commonly used techniques require calibration data.
Furthermore, today’s techniques are susceptible to noise from varying bottom conditions and particles in the water (such as silt and algae). In inland water bodies, calibration data is seldom available, and the water often has a high degree of particles.
The project outcomes
The three main objectives of the project were as follows:
Identifying the preprocessing steps to denoise the data for better model performance,
Building AI models to infer the depth of inland water bodies, and
Integrating the most suitable model into Glint Solar's solar site assessment tool
The datasets collected in this challenge include the Water Bodies Dataset (Sample_1), the Bathybase Dataset, and the Global Reservoir Dataset. Several other available datasets were also identified for Glint Solar to consider further.
Three preprocessing steps were implemented to denoise the satellite data: a general pipeline that converts raster data from the Bathybase dataset into a modeling-ready format, a cloud cover removal process using Sentinel-2 Level-2A images, and an algal bloom detection process using MODIS data. While these processes performed well in removing noise from the satellite imagery, the algal bloom detection process was not integrated into the pipeline because it relies on a different image source; a proof of concept was completed for further development.
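To give a flavor of the cloud removal step: Sentinel-2 Level-2A products ship with a Scene Classification Layer (SCL) in which class codes 8 (cloud, medium probability), 9 (cloud, high probability), and 10 (thin cirrus) mark cloudy pixels. A minimal masking sketch, using a tiny hypothetical SCL patch rather than the project's actual pipeline, might look like this:

```python
import numpy as np

# Sentinel-2 L2A SCL class codes that indicate cloud contamination.
CLOUD_CLASSES = [8, 9, 10]

def cloud_mask(scl: np.ndarray) -> np.ndarray:
    """Return a boolean mask that is True for cloud-free pixels."""
    return ~np.isin(scl, CLOUD_CLASSES)

def mask_band(band: np.ndarray, scl: np.ndarray, fill=np.nan) -> np.ndarray:
    """Replace cloudy pixels in a reflectance band with `fill`."""
    out = band.astype(float).copy()
    out[~cloud_mask(scl)] = fill
    return out

# Hypothetical 2x2 patch: codes 4 (vegetation) and 6 (water) are clear,
# 8 and 9 are cloud; the band values are made-up reflectances.
scl = np.array([[4, 8], [9, 6]])
band = np.array([[0.02, 0.30], [0.28, 0.03]])
clean = mask_band(band, scl)
```

In a real pipeline the SCL and reflectance arrays would be read from the product's raster files, but the masking logic stays the same.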
The models and deployment phase
Several models were tested, and the best-performing model was identified.
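The article does not name the winning model, but a classic baseline for depth-from-multispectral-imagery that needs exactly the kind of calibration data discussed above is the band-ratio method of Stumpf et al. (2003): the ratio of log-transformed blue and green reflectances varies roughly linearly with depth, so two coefficients can be fit from surveyed calibration points. A sketch with made-up reflectances:

```python
import numpy as np

def band_ratio(blue, green, n=1000.0):
    """Stumpf log-ratio predictor: ln(n*Rblue) / ln(n*Rgreen)."""
    return np.log(n * np.asarray(blue)) / np.log(n * np.asarray(green))

# Hypothetical calibration samples: reflectances plus surveyed depths (m).
blue = np.array([0.08, 0.06, 0.05, 0.04])
green = np.array([0.10, 0.09, 0.08, 0.07])
depth = np.array([2.0, 4.0, 6.0, 8.0])

# Fit depth ~ m1 * ratio + m0 by ordinary least squares.
x = band_ratio(blue, green)
A = np.vstack([x, np.ones_like(x)]).T
m1, m0 = np.linalg.lstsq(A, depth, rcond=None)[0]

predicted = m1 * band_ratio(blue, green) + m0
```

In these synthetic samples blue reflectance drops faster than green as depth increases, so the fitted slope is negative; with real imagery the coefficients would be calibrated per water body, which is exactly why the scarcity of calibration data for inland lakes is a limitation.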
For deployment, the code was organized into modular Python scripts for production, all required dependencies were included, and all files were stored in a DAGsHub repository. The challenge successfully identified preprocessing steps to denoise the satellite data and a well-performing model to predict the depth of water bodies from satellite images. The current modeling process is based on a single lake or water body, but it can be extended to accept data from multiple water bodies. The result of this challenge is a preliminary preprocessing and modeling pipeline, a minimum viable product that will be further developed and scaled up for integration into Glint Solar's solar site assessment tool.
In partnership with the Norwegian company Think Outside, the team developed a solution to forecast the water availability flowing from rivers and snowmelt into reservoir lakes, as well as electricity prices, to optimize renewable energy production in Scandinavia.
Think Outside is a Norwegian company currently focused on providing clients with “constant access to accurate and reliable snow data … to make projections you can be confident in”.
They use radar systems to image snowpacks and provide their clients in the hydropower industry with data about snowpack density. Hydropower companies can use this data to better predict the volume of water that will flow into reservoirs as the snow melts, and from this, more accurately estimate the amount of energy they can generate as that water passes through the hydropower turbines.
Think Outside wished to expand its current data and forecasting offerings to provide additional value to its clients in the hydropower industry. The goal of this project was to develop a machine learning pipeline that predicts both water inflow into reservoirs and lakes and future electricity prices.
The project outcomes
Due to the different requirements of water inflow and electricity price modeling, the project was divided into two sub-projects: (1) water inflow prediction and (2) electricity price prediction. For both sub-projects, we defined the task objectives relating to the project goals specified by Think Outside, explored numerous data sources, downloaded and processed data (in an automated manner where possible), and then cleaned, explored, and preprocessed the data.
Finally, machine learning models were created and trained on the input datasets. These models can be used to predict future water inflow at the level of individual reservoirs and catchment areas, and future electricity prices for the different energy bidding zones within Norway. The performance of the models was analyzed using several error metrics as well as visual representations of the predictions.
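As an illustration of how such a forecasting task can be framed (a generic sketch on synthetic data, not Think Outside's actual pipeline), a daily inflow series can be turned into lagged feature rows so that a standard regressor learns to predict the next value from the recent past:

```python
import numpy as np

# Synthetic stand-in for a daily inflow series: a seasonal cycle plus noise.
rng = np.random.default_rng(0)
t = np.arange(400)
inflow = 50 + 20 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, t.size)

def make_lagged(series, n_lags=7):
    """Each row holds the previous `n_lags` values; the target is the next value."""
    X = np.column_stack([series[i:i - n_lags] for i in range(n_lags)])
    y = series[n_lags:]
    return X, y

X, y = make_lagged(inflow)
split = 300  # train on the first 300 rows, evaluate on the rest

# Ordinary least squares with an intercept column.
A_train = np.column_stack([X[:split], np.ones(split)])
coef, *_ = np.linalg.lstsq(A_train, y[:split], rcond=None)

A_test = np.column_stack([X[split:], np.ones(X.shape[0] - split)])
preds = A_test @ coef
mae = np.mean(np.abs(preds - y[split:]))
```

The same framing carries over to electricity prices; a production system would add exogenous features (weather, snowpack, demand) alongside the lags.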
Within two months, the team built two risk scoring systems, using Analytic Hierarchy Process (AHP) models for predicting climate and geopolitical risks to help financing institutes make more informed decisions.
Finz focuses on helping global decision-makers, such as government entities, banks, and foundations, finance autonomous, lightweight units for water- and energy-efficient access in urban and rural contexts. It is now widely accepted that climate change and geopolitical issues pose serious threats to the availability of essential-to-life resources such as water and energy.
In emerging regions such as Africa and Southeast Asia, access to water for human and animal consumption, and for the agriculture needed to secure food, is key. In developed countries, climate change already threatens the availability of water (e.g., in California) and will become a major issue for all inhabitants of Earth.
We believe that helping decision-makers act with accuracy, by predicting the risk of damage to or destruction of assets and populations, can be a catalyst for aligning decisions with essential-to-life needs.
The Project Results
More than 40 technology changemakers worked with publicly available datasets, such as satellite imagery, land cover data, and news outlets, to extract relevant data points, transform them into useful information, and load them into newly created models that compute climate-change-driven and geopolitically-driven risk scores.
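At the heart of the Analytic Hierarchy Process mentioned above is a pairwise comparison matrix: criterion weights are taken from its principal eigenvector, and a consistency ratio checks that the expert judgments do not contradict each other. A minimal sketch with a made-up 3-criterion matrix (the criteria names and numbers are illustrative, not Finz's actual model):

```python
import numpy as np

# Pairwise comparisons for three hypothetical risk criteria, e.g.
# drought exposure vs. flood exposure vs. political instability.
# A[i, j] states how much more important criterion i is than j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# Weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Saaty's consistency check: CI / RI, with random index RI = 0.58 for n = 3.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58  # values below 0.1 are conventionally acceptable

# A site's risk score: weighted sum of its normalized criterion scores.
site_scores = np.array([0.8, 0.4, 0.2])  # hypothetical, each in [0, 1]
risk_score = float(weights @ site_scores)
```

Extending this to the project's setting means building one such matrix per risk family (climate, geopolitical) and feeding in criterion scores derived from the satellite, land cover, and news data.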
NeedEnergy is an energy-tech startup providing sustainable and clean energy solutions. In this two-month Omdena Challenge, 50 technology changemakers collaborated to develop predictive models for designing solar rooftop installations and gas pay-as-you-go reticulation services.
Sub-Saharan Africa has over 600 million people without access to electricity, and electricity demand is growing at an annual rate of 11%, the highest of any region worldwide. The number grows to over 700 million if clean cooking energy sources are considered, as most people still rely on firewood and charcoal for their day-to-day cooking. These are just a few of the many additional challenges:
The grid is aging, resulting in increased maintenance and operating costs.
The cost of unplanned maintenance and unforeseen faults is a pain point for utilities and results in lost revenue.
The grid has not fully migrated to the edge or cloud to benefit from Industry 4.0.
Data is abundant, but most of it goes unused; putting it to work is a potential starting point for solving the issues above.
Electricity demand for commercial spaces will grow to 390 TWh by 2040, and 70% of this demand will be covered by renewable solar PV energy. This sector will experience one of the biggest energy transitions, presenting an opportunity for a more modern architecture for the grid of the future.
The project outcomes
NeedEnergy intends to use predictive analytics for designing solar solutions or clean energy solutions for clients based on their projected energy usage/profile. This will help to increase energy adoption where it is most needed.
The team helped accomplish this by leveraging NeedEnergy's network of smart energy monitors for both electricity and gas. This supports decision-making for Commercial and Industrial (C&I) clients who are transitioning to renewable energy. The analytics insights will also be used by energy suppliers; for example, gas suppliers can better plan deliveries and inventory based on the data.
The project also includes predictive models to detect anomalies in the operation of installed solar assets. An integration with IBM Deep Thunder would be ideal so that weather influences on the installation can be put into perspective when designing or operating the solar installation.
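A simple way to flag anomalous readings from an installed solar asset (an illustrative sketch on synthetic data, not NeedEnergy's production model) is a rolling z-score: a reading is suspicious when it deviates strongly from the statistics of the trailing measurement window.

```python
import numpy as np

def rolling_anomalies(power, window=24, z_thresh=3.0):
    """Flag readings that deviate strongly from the trailing window."""
    flags = np.zeros(power.size, dtype=bool)
    for i in range(window, power.size):
        w = power[i - window:i]
        mu, sd = w.mean(), w.std()
        if sd > 0 and abs(power[i] - mu) > z_thresh * sd:
            flags[i] = True
    return flags

# Synthetic power output (kW) with one injected fault: a sudden drop.
rng = np.random.default_rng(1)
power = rng.normal(5.0, 0.2, 200)
power[150] = 0.5
flags = rolling_anomalies(power)
```

A weather feed (such as the Deep Thunder integration mentioned above) would let the system distinguish genuine faults from dips explained by cloud cover.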
For the project, the data is classified into two main buckets, which we will use to varying degrees depending on how the project unfolds:
Historical Data (realized data) – This information has the highest signal-to-noise ratio and high relevance, but it is expensive to collect in both money and time.
User data obtained from smart meters on-site.
Demographic information obtained from public entities like the local utility and Regional Power Trading Data (research paper to be shared)
Forward-Looking Information – This information is used to provide a broader context for prediction purposes and improve accuracy when dealing with new/unseen situations. It takes into account things that may not appear in historical data sets.
The Omdena team built internal databases (relational and time series) to store this information and also developed an API to allow easy access in production and for research purposes.
Streamlit interactive dashboard showing short-term and long-term energy demand – Source: Omdena
You can view and explore the dashboard using this link. To read more about everything from how the data was collected to how the dashboard was built, please check the articles below.
In this two-month Omdena Challenge, 50 technology changemakers have built a recommender engine to automate and simplify facility energy system upgrades. This will help to reduce energy wastage and carbon dioxide emissions while making buildings more efficient and sustainable.
Existing buildings are the top consumers of energy and the top emitters of carbon dioxide, and it takes 30-50 years for new energy-efficiency solutions like LED lighting to fully penetrate our existing building stock. One of the biggest challenges in upgrading existing buildings with energy-efficient technology is identifying cost-effective solutions for individual buildings and asset portfolios.
The current method is both time-consuming and expensive: sales reps find customers, engineers manually inspect the building's existing energy systems and design energy-efficient retrofit solutions, and then a long sales cycle follows, with low customer conversion and no long-term tracking of the retrofit project's success.
The main problem this project solved is the manual, time-consuming process of identifying, prioritizing, and recommending energy-efficient solutions for a particular building and customer. As a result, building owners spend less on energy and on their building upgrades while emitting less carbon dioxide. Future iterations of the recommendation engine could also optimize for solutions that improve occupant health and productivity, which has social impact, or for the adoption of renewable energy, microgrids, water conservation, and other environmental solutions.
The project outcomes
Retrolux has built a platform called Smart Energy scaleOS™ to automate and simplify facility energy system upgrades. The team built a project recommendation engine for the scaleOS™ platform that automatically analyzes individual buildings and asset portfolios to identify, prioritize, and recommend specific energy efficiency and other clean energy solutions that meet the facility owner's project approval criteria.
The recommendation engine may analyze the building’s physical and energy characteristics, local and regional environment, utility tariffs and incentives, government regulations and incentives, current energy technology costs and performance specifications, and other available information.
The recommendation engine can then identify specific energy-efficient and clean energy solutions for individual buildings, provide means and methods to prioritize the solutions, and ultimately recommend cost-effective solutions for building owner approval based on the owner’s minimum return on investment, carbon reduction, or other goals.
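The filter-and-rank step described above could be sketched roughly as follows; the retrofit measures, the numbers, and the simple-ROI metric are illustrative placeholders, not Retrolux's actual scoring model.

```python
from dataclasses import dataclass

@dataclass
class Retrofit:
    name: str
    install_cost: float       # upfront cost, USD
    annual_savings: float     # energy cost saved per year, USD
    annual_co2_tons: float    # CO2 avoided per year, metric tons

    @property
    def simple_roi(self) -> float:
        """Annual savings as a fraction of install cost."""
        return self.annual_savings / self.install_cost

def recommend(candidates, min_roi=0.15):
    """Drop measures below the owner's minimum ROI, rank the rest by ROI."""
    eligible = [c for c in candidates if c.simple_roi >= min_roi]
    return sorted(eligible, key=lambda c: c.simple_roi, reverse=True)

candidates = [
    Retrofit("LED lighting", 20_000, 6_000, 12.0),
    Retrofit("Rooftop unit controls", 15_000, 3_000, 8.0),
    Retrofit("Smart motors", 40_000, 4_000, 10.0),
]
ranked = recommend(candidates)
```

A real engine would fold in the tariff, incentive, and regulatory inputs listed above, and could swap the sort key for carbon reduction or a blended objective depending on the owner's goals.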
Our goal for this project was a robust overall recommendation engine architecture, built specifically for at least one energy-efficiency solution such as LED lighting, lighting controls, rooftop-unit optimization controls, smart motors, or super-efficient warehouse freezer doors.