Deployments in Omdena AI Innovation Challenges

From raw data to visualization

Omdena teams spend significant time on cleaning and wrangling data in order to extract valuable insights. Next, our teams build highly contextual dashboards to visualize insights and drive value.
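As a minimal sketch of that wrangling step, here is a pandas pass over a hypothetical survey table (the column names and values are invented for illustration):

```python
import pandas as pd

# Hypothetical raw survey data: the kind of messy input teams typically
# clean before visualization (duplicates, missing keys, mixed types).
raw = pd.DataFrame({
    "region": ["North", "North", "South", None, "East"],
    "yield_t_ha": ["2.1", "2.1", "1.4", "3.0", "n/a"],
})

def clean(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicates and rows with missing keys, coerce numerics."""
    out = df.drop_duplicates().dropna(subset=["region"]).copy()
    out["yield_t_ha"] = pd.to_numeric(out["yield_t_ha"], errors="coerce")
    return out.dropna(subset=["yield_t_ha"])

tidy = clean(raw)
```

Only the two fully valid rows survive; everything downstream (dashboards, models) then works from the tidy frame.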

Streamlit Applications

Interactive dashboards deployed on the web, combining data analysis, map visualizations, and multiple views. Models and predictions appear live on the site.

Tableau Viz

Visualizing data and model predictions in Tableau, with different views and slides.

Mobile Applications

Deploying models in Android mobile applications backed by Flask, e.g. showing pathologies in ultrasound images offline.

Power BI dashboards

Showing the collected data, analyses, and model results in an interactive way.

Project-specific dashboards

Omdena dashboards take data visualization to the next level by creating context around each of our projects.

The end product is deployed according to each project's circumstances; for example, in some African countries where internet access can be unstable, we deployed offline mobile applications.

On Cloud

Web applications

Android mobile applications

Chatbots & Translators 

Our Recent Work

In Agriculture, Energy, Education, Infrastructure, and Climate Change.

Deployment & Dashboards

Classifying Rooftops Through Neural Networks to Eliminate Energy Waste 

A Streamlit dashboard that visualizes rooftop locations suitable for clean energy installations: users enter locations and coordinates, and the app surfaces rooftops identified through machine learning on satellite images.

The team annotated a dataset of 2,924 articles in doccano, then built a Streamlit web application that detects bias in the articles using NLP. The app allowed QA reviewers to choose which annotated segment "wins," or to send a segment back for re-annotation.

Demo video

Using predictive analytics to design solar or other clean energy solutions for clients based on their projected energy usage profile.
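One simple way to read that idea: fit a trend to past usage, project demand forward, and back out a rough system size. This sketch uses invented monthly figures, and the peak-sun-hours and efficiency numbers are assumptions, not project values:

```python
import numpy as np

# Hypothetical monthly energy usage (kWh) for one client; a real system
# would pull this from utility bills or smart-meter data.
usage = np.array([310, 295, 330, 360, 402, 451, 480, 470, 410, 365, 330, 315])

# Fit a simple linear trend to project next year's monthly demand.
months = np.arange(len(usage))
slope, intercept = np.polyfit(months, usage, 1)
projected = slope * np.arange(12, 24) + intercept

# Rough PV sizing: assumed 4.5 peak-sun-hours/day and 80% system efficiency.
kwh_per_month_per_kw = 4.5 * 30 * 0.80
system_kw = projected.mean() / kwh_per_month_per_kw
```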

A simple user interface was developed with Flask to help caseworkers do their work more easily and quickly. Forms were built with WTForms, graphs with Plotly, and Bootstrap handled the overall styling of the UI.

JavaScript code calling the Google Translate API was incorporated into the HTML templates, enabling translation of any page within the app into 108 languages.

For the database we used PostgreSQL. Our dataset, excluding the five confidential case files initially provided by the partner, was seeded into the database, which was then hosted on Amazon RDS.
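The seeding step can be sketched as follows. SQLite stands in for PostgreSQL on RDS so the example is self-contained, and the schema and rows are invented:

```python
import sqlite3

# SQLite stand-in for the project's PostgreSQL/RDS database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE cases (
        id INTEGER PRIMARY KEY,
        summary TEXT NOT NULL,
        confidential INTEGER NOT NULL DEFAULT 0
    )
""")

rows = [
    (1, "public case A", 0),
    (2, "confidential case", 1),   # excluded, like the partner's files
    (3, "public case B", 0),
]
# Seed only the non-confidential rows, mirroring the exclusion above.
conn.executemany(
    "INSERT INTO cases VALUES (?, ?, ?)",
    [r for r in rows if r[2] == 0],
)
conn.commit()
seeded = conn.execute("SELECT COUNT(*) FROM cases").fetchone()[0]
```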

A Tableau dashboard was also developed.

Deploying a Flask model that detects pathologies in ultrasound images, packaging it in a Docker container so it can easily be deployed in the cloud, and building an offline Android pathology app so the solution can be used in places without a reliable internet connection.
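A stripped-down version of such a Flask inference endpoint might look like this; the real model is replaced by a placeholder scorer, since the actual weights and preprocessing are not described here:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def predict_pathology(image_bytes: bytes) -> dict:
    # Stand-in for the real ultrasound model: derives a fake score from
    # the payload size so the endpoint runs without model weights.
    score = min(len(image_bytes) % 100 / 100, 1.0)
    return {"pathology_detected": score > 0.5, "score": score}

@app.route("/predict", methods=["POST"])
def predict():
    # Clients POST raw image bytes; the response is a JSON verdict.
    return jsonify(predict_pathology(request.get_data()))
```

Containerizing this is then a small Dockerfile that installs Flask and launches the app behind a WSGI server such as gunicorn.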

The project's objective was to take a closer look at the parameters and factors that affect the future outcomes of students from various backgrounds. By combining data scraped from college alumni statistics with information from surveys and annual debt data, the team performed analysis to connect the dots and develop a clearer understanding of what makes one profile stronger and more adaptable than another.

The demo stems from an Omdena project with the Global Partnership for Sustainable Development Data. A team of 30 AI engineers used Google Earth Engine (GEE) imagery and Jupyter to build an app for crop yield prediction in Senegal, improving agriculture and food security in the country.

To predict infrastructure needs in Africa, the team used different data science domains to build the datasets needed. They used natural language processing to scrape tweets and gain insight into what news outlets and citizens say about important topics such as economics, finance, and education.

Using route planning, they estimated distance scores to essential facilities such as hospitals. They then applied data analysis and machine learning models to predict population, a water stress index, and electricity access.
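The distance-score idea can be illustrated with a plain haversine computation; the coordinates below are hypothetical, not project data:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

# Hypothetical community and hospital coordinates.
community = (14.70, -17.45)
hospitals = [(14.72, -17.47), (14.90, -17.20)]

# Distance score: kilometers to the nearest essential facility
# (lower means better access).
nearest_km = min(haversine_km(*community, *h) for h in hospitals)
```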

Our team built a chatbot and a sentiment analysis model independently. The chatbot learned from more than 807,000 messages how to parse sentences and structure a proper response. The project employed message sentiment analysis to warn users of potentially risky conversations initiated by online predators, classifying each conversation as low, medium, or high risk.
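A toy version of that risk-banding step might look like this. The keyword lexicon and thresholds are invented for illustration; the real project used a trained sentiment model, not keyword matching:

```python
# Illustrative lexicon of grooming-related terms and weights (invented).
RISKY_TERMS = {"secret": 0.4, "alone": 0.3, "meet": 0.3}

def risk_score(message: str) -> float:
    """Score a message in [0, 1] by summing weights of risky terms."""
    words = message.lower().split()
    return min(sum(RISKY_TERMS.get(w, 0.0) for w in words), 1.0)

def risk_level(score: float) -> str:
    """Map a score to the low/medium/high bands shown to the user."""
    if score < 0.33:
        return "low"
    if score < 0.66:
        return "medium"
    return "high"
```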

A Tableau dashboard showing different predictive models of economic well-being in India, using satellite imagery with Census panel data as ground truth. The team collected data from GEE imagery and Census surveys (water sources, electricity access, household construction materials, etc.), then built deep learning models to predict socio-economic well-being at the district level.

The team built an interactive dashboard visualizing the distribution of grants after feature extraction and scraping of sources such as Twitter, Google, and 1,200 PDF files through automated APIs. This approach allowed us to gather and visualize several billion dollars of not-for-profit grant data across six countries for further NLP analysis.

During the pandemic, news and social media posts showed an increase in online and domestic violence.

In this viz, the team shows geographical data and the results of NLP textual analysis and topic modeling, broken down by region and online platform.

The dashboard consists of five major sections of results. Users navigate between sections with the pull-down menu in the left sidebar and use other sidebar controls to select the content they want to see. The dashboard offers several views: land cover, temperature, geospatial, and statistical analysis.

Build, Deploy, and Scale AI Solutions in Eight Weeks