Data and Analytics
Get in touch with us to find out more about the approach and framework we have developed for establishing proper Data Governance that the whole company embraces.
Is your company able to utilise and leverage all the benefits from the data you generate, possess, and acquire? Can you generate actionable insights, integrated into your custom processes, so that you can gain a competitive advantage? Let us understand your business challenges so that we can help you become a more efficient and effective organisation.
Some organisations have already realised the hidden potential for achieving competitive advantage by turning their data into actionable insights. Do you want to be a front-runner, or would you rather play catch-up? Get in touch with us and discover more.
Siloed data, residing in fragmented legacy systems and local databases. Sound familiar?
A coherent data strategy is essential for the success of any forward-looking company. It links your business objectives with the appropriate data and technology enablers and creates solid data foundations for achieving your strategic goals.
Getting ahead of the competition means having the information needed for making the right decisions readily available. Rapid changes in the current volatile, uncertain, complex and ambiguous environment may force you into short-term shifts in your value proposition.
Our strategic framework covers all aspects of your organisation and delivers an actionable roadmap you can execute to become insights-driven: leveraging cloud platforms and agile advanced analytics, and building a transformative data culture.
We provide data architecture services for cloud-based, on-premises or hybrid technologies. As part of this, we design solutions that allow you to drive business value from your data in an efficient, secure and cost-effective way. With our experience with both structured and unstructured data types, and with use cases involving the modernisation of complex legacy IT and data landscapes, we achieve impressive results throughout the whole data lifecycle. You can trust us to ensure the right data modelling, application of standards and definition of processes.
How do you create a best-in-class data architecture? Depending on your business case and current technical architecture, we define a data architecture tailored to your business needs, covering all your functional and non-functional requirements.
We have created architectures that can cover multiple data sources, formats and types, including internal ERP data, CRM data, other downstream systems’ data, sensors stream data, as well as external data.
Our Data Engineers are highly skilled and experienced in creating data pipelines for various data sources. We master the extract, transform and load (ETL/ELT) processes, and we are up to date with the latest tools and technologies. Depending on your requirements and needs, we can leverage AWS platform services such as AWS Glue, AWS Data Pipeline, Amazon Managed Workflows for Apache Airflow (MWAA) and Amazon Redshift to speed up data ingestion and transformation, delivering a robust and scalable solution while reducing development and maintenance costs. Alternatively, we can build the required pipelines from scratch.
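As an illustration of the pattern, the extract, transform and load steps can be sketched in plain Python, independently of the orchestration tool. The source data, table and field names below are invented for the example:

```python
import csv
import io
import sqlite3

# Hypothetical raw export from a source system (e.g. an ERP extract).
RAW_CSV = """order_id,amount_eur,order_date
1001,250.00,2024-03-01
1002,,2024-03-02
1003,99.50,2024-03-03
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV export into records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records: list[dict]) -> list[tuple]:
    """Transform: drop incomplete rows and cast fields to proper types."""
    rows = []
    for r in records:
        if not r["amount_eur"]:  # skip rows with a missing amount
            continue
        rows.append((int(r["order_id"]), float(r["amount_eur"]), r["order_date"]))
    return rows

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL, day TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 349.5) — one row was dropped for a missing amount
```

In a managed service such as AWS Glue the same three stages appear as crawlers and jobs over the Data Catalog; the structure of the pipeline stays the same.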
Data & Analytics Platform
We work as partners and advisors to your business. Our experience and knowledge of designing and building data & analytics platform solutions from the ground up using AWS cloud native solutions allow you to save development and maintenance costs. We apply the right mixture of custom code, Infrastructure-as-Code and automated deployments to create reliable, scalable and maintainable solutions.
We can help you transform your data into insightful, easy-to-digest and actionable dashboards. Depending on your functional and non-functional requirements, our BI experts and visualisation engineers choose the most suitable modern tools to deliver impactful dashboards and reports tailored to the various needs of your business stakeholders. Depending on the ask, we can, for example, use AWS native services such as Amazon QuickSight to get you moving quickly and cost-effectively, with no client install required.
Is your organisation able to derive value from its data?
Proper Data Governance is crucial for the success of any digital transformation. Often the hidden value of properly established Data Governance is not fully recognised by the leadership within organisations. In many cases, Data Governance is seen as bureaucracy dressed in policies, procedures and guidance. This makes most governance programs ineffective.
Data & Analytics Case Studies
Stock market data tool integrated with Bloomberg
Overview: Our client, a financial services company, approached us to improve their data analysis tools and automate their team's daily work. They wanted a solution that would help their surveillance organization gain more insight into potential scenarios. Our approach was to create a solution that would enable them to predict MTD/YTD prices.
Approach: We worked in close collaboration with the client to analyze multiple numerical methods and compose different types of feature vectors to improve the prediction model's accuracy. We developed a script that extracts information and an algorithm that calculates the analysis of historical price changes.
We integrated multiple data sources into the solution, including the Bloomberg service terminal, to ensure we had the most accurate and up-to-date data. We developed an API and a user-friendly interface for filtering and searching the information we extracted and analyzed. We used Keras to develop a neural-network stock prediction model that forecasts MTD/YTD prices and stock returns.
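The MTD/YTD figures the model targets can be illustrated with a plain-Python calculation over daily closing prices. The dates and prices below are invented; the production model composed richer feature vectors from such historical price changes:

```python
from datetime import date

# Hypothetical daily closing prices for one instrument.
closes = {
    date(2024, 1, 2): 100.0,   # first trading day of the year
    date(2024, 2, 1): 110.0,   # first trading day of the month
    date(2024, 2, 15): 121.0,  # valuation day
}

def simple_return(start_price: float, end_price: float) -> float:
    """Percentage change between two prices."""
    return (end_price - start_price) / start_price

as_of = date(2024, 2, 15)
mtd = simple_return(closes[date(2024, 2, 1)], closes[as_of])  # month-to-date
ytd = simple_return(closes[date(2024, 1, 2)], closes[as_of])  # year-to-date

print(f"MTD: {mtd:.1%}, YTD: {ytd:.1%}")  # MTD: 10.0%, YTD: 21.0%
```

Sequences of such returns over rolling windows are a typical input representation for a neural network like the one described above.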
Tech stack: Python, Django, Vue.js, AWS, Keras.
Results: The performance of our solution was excellent, and we optimized it to achieve more than 80% accuracy in predicting prices. As a result, the tool became an essential asset for our client, empowering their surveillance organization to understand market trends better and predict future prices with greater accuracy.
Image recognition and NLP for fraud detection
Challenge: Creating a fully functioning solution from scratch
Team: Data Engineers, Data Scientists, ML Engineers, DevOps
Solution: The project started with an investigation phase to analyse the existing data and test multiple solutions meeting the needs of both the client and the cloud providers. AWS was selected as the core service provider for data storage, combined with Elasticsearch. Our team built the full ETL process to start operations. Once the data structures and pipelines were set, we introduced two teams of Data Scientists: one specialised in semantic analysis (NLP) and the other experienced in image recognition. Our ML Engineers then took care of the machine learning part. The output was a dashboard of data visualisations built in Kibana to present the results.
Status: Successful completion of the data engineering phase with ongoing data science stages.
Results: The project significantly improved the client's ability to detect fraudulent behaviour and security breaches, which led to a reduction in risk.
Traffic analysis and prediction based on AI
Challenge: The client's request involved road and traffic analysis. The task included building a system for counting and classifying vehicles: differentiating between various vehicle types and producing a total count.
Approach: We started by developing custom video surveillance software that made it possible to remotely watch the traffic situation and collect data. Utilizing image recognition and data analytics techniques, we developed a POC model analysing the traffic data, which was then extended into a fully functioning tool for road safety and traffic analysis.
Through advanced deep learning, our team created a solution able to detect and alert on potential accidents, predict congestion, and support the planning of efficient roads and parking spaces. The solution is currently evolving towards autonomously monitoring traffic cameras and systems and generating real-time alerts when certain events of interest occur.
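The counting logic downstream of the detector can be sketched independently of the YOLO/SSD models: given per-frame detections (class label plus confidence), keep the confident ones and tally them per vehicle type. The detections and threshold below are mock values for illustration:

```python
from collections import Counter

# Mock detector output: (class_label, confidence) per detection.
# In the real pipeline these come from a YOLO/SSD model over video frames.
detections = [
    ("car", 0.92), ("car", 0.88), ("truck", 0.75),
    ("bus", 0.40),          # below threshold — ignored
    ("motorcycle", 0.81), ("car", 0.30),
]

CONFIDENCE_THRESHOLD = 0.5

def count_vehicles(dets, threshold=CONFIDENCE_THRESHOLD):
    """Count detections per vehicle type, keeping only confident ones."""
    counts = Counter(label for label, conf in dets if conf >= threshold)
    counts["total"] = sum(counts.values())
    return dict(counts)

print(count_vehicles(detections))
# {'car': 2, 'truck': 1, 'motorcycle': 1, 'total': 4}
```

In practice the same vehicle appears in many consecutive frames, so a production system pairs detection with object tracking before counting, so that each vehicle is counted once.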
Tech stack: Python, React.js, YOLO, SSD and OpenCV.
Results: The solution enables easy supervision of key crossings, risk spots and motorways, with the goal of reducing and preventing accidents. It provides remote supervision of motor-vehicle movement on streets and at motorway crossings, reducing the need for on-location visits.
AI model certification for cardiovascular medicine
Overview: Our client is a European MedTech start-up focused on the detection of cardiovascular anomalies. The start-up’s objective is to challenge the market for ECG signal-based detection of heart pathologies, such as atrial fibrillation, premature beats and supraventricular tachycardia, using signals obtained from affordable wearable devices such as the Polar H10. The company required ML-based model development and preparation for Class IIa medical device certification, in order to ensure that their system is safe and effective for patients to use at home.
Team: Our team of experts included a Data Engineer, Data Scientists, ML Engineers and a Project Manager
Our Data Scientists collaborated with the client in order to understand their existing ECG signal processing system, as well as their computation-based algorithms. We created an automated testing tool that assessed the performance of the current solution against a range of internal and external medical data sets. Based on this analysis, we provided recommendations to improve the scope of detectable medical conditions and identified areas for improvement.
We advised the client on the certification process and prepared an action plan for the ML-related work and documentation needed for Class IIa certification. We also provided improvement recommendations and execution support, which helped the client enhance their ECG signal processing system and algorithms. Our team completed the analysis and recommendations in less than 8 weeks, providing actionable advice that helped the client reduce the application processing time.
ML-based signal processing model development phase
The client provided the raw ECG signal data, including labelled data with annotations of R-peaks, pathologies, and ‘soft’ and ‘hard’ noise segments. This labelled data served as the training set for the models. The quality and accuracy of pathology detection heavily depend on how clean the ECG signal is, which is why robust, well-performing models that can accurately and precisely distinguish the clean parts of the ECG signal from the noisy ones are essential. Our Data Scientists developed ML-based models for ‘soft’ and ‘hard’ noise detection. Once the models passed the client’s strict performance acceptance criteria, our ML Engineers created the production pipelines for training and inference and integrated them into the client’s platform and business processes. Our collaboration continues with the development of further pathology detection ML-based algorithms.
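A much-simplified illustration of the noise-gating idea (not the client's actual models): label fixed-size ECG windows as noisy when their amplitude variance exceeds a threshold, so that only clean windows are passed on to pathology detection. The signal, window size and threshold below are synthetic; in practice the threshold is replaced by a trained model validated against the labelled noise segments:

```python
import statistics

# Synthetic ECG-like samples: a quiet segment followed by a noisy one.
signal = [0.0, 0.1, -0.1, 0.05, -0.05, 0.1,   # clean
          2.0, -1.8, 1.5, -2.2, 1.9, -2.0]    # noisy (e.g. motion artefact)

WINDOW = 6
VARIANCE_THRESHOLD = 0.5  # in practice, learned from labelled noise segments

def label_windows(samples, window=WINDOW, threshold=VARIANCE_THRESHOLD):
    """Label each non-overlapping window 'clean' or 'noisy' by amplitude variance."""
    labels = []
    for i in range(0, len(samples) - window + 1, window):
        var = statistics.pvariance(samples[i:i + window])
        labels.append("noisy" if var > threshold else "clean")
    return labels

print(label_windows(signal))  # ['clean', 'noisy']
```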
Results: Through our active collaboration with the client's team, we were able to complete the model certification preparation work in just 6 months, helping the client achieve their goal of obtaining Class IIa certification for their AI-powered medical device system. This certification allows the client to offer their product to a wider range of customers, enabling early detection and prevention of heart-related medical conditions.
IoT tool for electricity consumption analysis
Overview: Our client needed a tool to analyze electricity consumption based on the data from sensors integrated into electrical sockets, in real-time and historically, with the ability to track any critical moments. The client also required the tool to be scalable and applicable to other sensors in the market.
Solution: To ensure that the tool met the requirements, our team collaborated with the client's team to understand their needs and preferences. We analyzed the data collected by the sensors to develop algorithms that accurately track the electricity consumption of each socket. We then integrated the algorithms into a user-friendly interface that allowed users to easily track their electricity usage over time. We used Python and Django for the backend and Angular for the front end, ensuring a scalable solution that could be easily extended to support other sensors in the market.
Our team also implemented various features to enhance the user experience. We included notifications and snapshots for critical moments, such as extreme power consumption, electrical fault or a sudden surge in electricity usage. We also generated heat maps for each socket, providing users with visual information about their electricity usage over time. We ensured that the tool was responsive and could be used on mobile devices, enabling users to monitor their electricity usage on the go.
Tech stack: Python, Django, Angular, AWS.
Outcome: Our electricity consumption analysis tool provided the client with an innovative solution that accurately tracked the electricity consumption in real-time and historically. The tool's user-friendly interface, along with its ability to generate heat maps and send notifications, made it easy for users to track their electricity usage and identify areas where they could reduce consumption. The tool is successfully used in facility management and manufacturing.
Let's work together