The value in your untapped data to support data transformation

About this event

CHASEY DAVIES-WRIGLEY : Welcome everyone to this webinar about the practical potential of the extensive data reservoirs that exist within the UK public sector.
My name is Chasey Davies-Wrigley, and I am a Principal Data Engineer at Made Tech. I am a technology leader with a highly technical background including an MSc in Computer Science. I have over 22 years of experience across both private and public sectors.
I have been responsible for distributed and multi-disciplinary teams; building, architecting, delivering and maintaining highly reliable and scalable systems across a whole variety of platforms and technologies.
Here at Made Tech I am focused on empowering the public sector to deliver and continuously improve digital services that are user-centric, data driven and free from legacy technology.
Our agenda today will see us taking a look at all of this untapped data value and how we find it. We’ll take a look at the ways we can assess our current state, and then we will explore just where we might want to be.
Then we’ll take a look at ways to get there; the potential challenges that we might be faced with, before diving into a hypothetical case concerning a health department. Finally, I will wrap up with some key takeaways.
Okay then, let’s make a start. So, what do we mean by ‘untapped data value’? Each department within the public sector generates and stores a substantial amount of data. The real question is, are we optimising its value?
Let’s start by understanding our existing data landscape. It’s a mix of rich historical data, a constant inflow of real-time data and a wide variety of data sources, and not all of these are optimised for meaningful insights.
We are in an age of data proliferation. Over the decades, the UK public sector has expanded its data collection exponentially. This trend reflects the rise of digitalisation and the increasing recognition of data’s importance. We have huge diversity in our data sources. Our public sector isn’t just a governance body. It is a treasure trove of latent insights from healthcare records to traffic patterns. The data we have amassed has real transformative potential if we can just tap into it.
However, there is a wide data utilisation spectrum. Not all departments are at the same stage in terms of data utilisation. Some, due to earlier investments in analytics or pressing needs, have progressed further than others. Recognising this spectrum helps us target our efforts where they are needed most.
We need to identify the value potential. A significant portion of our data remains under-utilised. This isn’t just a missed technical opportunity but is a missed chance to enhance public service delivery, policy formation and a lot more. Realising this potential should be an important aim and this has been recognised in the government’s national data strategy.
But there are challenges. The public sector isn’t a monolithic organisation. It is different departments, and they each face different and unique challenges from technical limitations to privacy concerns. I’ll discuss these in more detail as we go.
So, how do we find this potential value? We need to acknowledge that the age of merely collecting and storing data is behind us. Now, it’s about extracting relevant insights, focusing on quality and fostering an environment where data informs decision-making.
There are limitations of traditional metrics. Data maturity assessments, while beneficial, primarily offer a linear view. They track our progression in data management and usage, but they often don’t capture the nuances or the broader ecosystem’s interconnectedness.
Maturity models may not always factor in the unique challenges, or the diverse nature of data within the public sector.
We need a holistic data utilisation approach. Data isn’t just about collection, storage and periodic analysis. It is about creating a continuously adaptive, data-informed culture. This means recognising data’s potential to influence policy decisions, improve public service delivery, predict future trends and foster innovation.
By looking beyond maturity assessments, we can exploit data’s full spectrum of capabilities. This means prioritising outcomes over processes.
For instance, a department might achieve a high maturity rating but may not necessarily harness its data to make a tangible difference in citizens’ lives. Our ultimate goal is to ensure that data drives real-world beneficial change for UK citizens.
We want to be as ready as we can be for the future. The data landscape is ever evolving. New technologies, methodologies and challenges emerge regularly. By transcending traditional maturity metrics, we position ourselves to be more adaptive and responsive to future shifts, ensuring the public sector remains at the forefront of data-driven innovation.
Another area of focus should be inclusivity and collaboration. Data maturity can sometimes be department centric. The future of public sector data lies in collaboration, sharing insights, methodologies and best practices across departments.
By looking beyond maturity, we foster an environment where data becomes a collaborative tool, breaking down silos and enhancing inter-departmental synergy.
In essence, we need a paradigm shift from viewing data as a checkbox activity measured by maturity levels, to seeing it as an invaluable asset that when harnessed fully, can drive transformational change across the public sector.
How do we go about this paradigm shift? You could start by choosing a framework that encapsulates a roadmap for data utilisation transformation and involves the following key activities:
Diagnostic Analysis
It’s important to first get a clear understanding of our current data practices. Are we merely collecting data, or are we analysing it effectively? How integrated are our data sources across departments? We need to identify where we stand in terms of data collection, utilisation and integration.
Benchmarking
Comparing our current data practices against leading standards and identifying gaps. How do we measure up against leading practices both domestically and internationally? This helps us spot our strengths and areas that need attention.
Setting Objectives
We need a clear vision, clearly defining what we hope to achieve in terms of data extraction, analysis and application. This means setting SMART objectives: specific, measurable, achievable, relevant and time-bound goals that will guide our efforts.
Strategy Formulation
Once we know where we are and where we want to be, we can chart the course, outlining concrete steps to move from the current state to the future state. This might involve procuring new tools, enhancing collaboration across departments or investing in training for our staff.
Implementation and Iteration
The journey doesn’t end when we put our plans into action. The data landscape is ever evolving, and so must our strategies be. This calls for continuous assessment, learning and refinement of our methods. This means putting plans into action and continuously reassessing and refining our approaches based on the actual outcomes.
With this framework we are not just aiming for short-term wins. We are building a foundation for sustained excellence in data utilisation across the public sector.

Let’s take a deep dive into each of those key activities then. Before we can strategise, let’s evaluate. What data strategies are we using? How are we analysing this data, and how does it influence our decisions? Performing a data reservoir audit comes first: before we can harness our data, we must know its breadth and depth. This entails identifying every data source, whether it is a frequently used CRM system or a lesser-known departmental database. Knowing where our data lies is the first step in unlocking its potential.
Carrying out an analysis proficiency assessment. Understanding the proficiency of our teams is vital. This doesn’t just mean tool proficiency, but also the ability to draw actionable insights from the data. It’s about gauging both the technical and analytical skills currently present.
Measuring the impact our current data is making. Data for the sake of data serves little purpose. We need to assess how our current data practices influence policy-making, decision-making and service provision. Are we truly data-informed or are we missing out on key insights?
A thorough look at our data governance and compliance. In the realm of public data, trust is paramount. We need a thorough understanding of our governance policies, ensuring they are compliant with all legal and ethical standards, ensuring the integrity, privacy and security of all our data assets.
Then there is collaboration and integration. Data in silos is a lost opportunity. By understanding how well different departments share and integrate data, we can identify collaboration gaps and potential synergies. By understanding our strengths and areas of improvement, we can chart a more informed path forward.
So, where do we want to be? We want to aim for data-informed decision-making. As we move forward, data will become the keystone of our decision-making processes. We are not just talking about backend metrics. It’s about integrating insights directly into policy formulation and strategic initiatives.
This approach ensures our decisions are rooted in factual, real-time intelligence, optimising outcomes for all stakeholders.
Personalised public services. One size does not fit all. With the power of analytics and AI, we can shift from generic services to those tailored to individual citizens’ needs. Imagine healthcare that considers a patient’s entire history, or urban planning that considers local residents’ feedback and behaviour patterns.
With predictive analysis, instead of being reactive, our goal is to be proactive. With advanced analytics, we can foresee potential challenges, be it an emergent healthcare crisis or urban infrastructure needs. By anticipating these issues, we can allocate resources more efficiently and take timely actions, potentially averting crises. But we are going to need inter-departmental synergy.
The future isn’t about isolated data pools but rather, an interconnected ecosystem. By breaking down data silos and fostering collaboration between departments, we harness the power of collective intelligence. This integrated approach will lead to more comprehensive insights and more holistic solutions.
All of this will help create an empowered workforce. The potential of data isn’t just about external services. It is also about enhancing our internal capabilities. We envisage a future where our workforce across all levels is equipped with cutting edge tools and the skills to use them, where continuous learning and adaptation become the norm, ensuring our teams are always ahead of the curve.
In essence, we want a future where data isn’t just an operational tool but the central driving force behind a more efficient, effective and citizen-centric UK public sector.
How do we get there? To harness data’s transformative potential, we must first ensure we have a strong and scalable foundation. We need to build a robust infrastructure. This means investing in state-of-the-art data storage, management and processing tools. It’s not just about quantity but quality. Our systems should be capable of handling diverse data types, ensuring data integrity and facilitating seamless access while ensuring security.
The best tools are only as good as the hands that wield them. Our public sector workforce must be equipped with the skills to navigate the evolving data landscape. This involves ongoing training sessions, workshops and courses tailored to various roles, from data novices to seasoned analysts.
By fostering a culture of continuous learning, we ensure our teams remain agile and adaptive. Data’s potential magnifies when viewed as a collective asset. We should aim to foster a collaborative ecosystem where departments share insights, tools and best practices.
By leveraging shared platforms and interdisciplinary teams, we can tackle complex challenges more holistically, ensuring the best minds are always at the table. The journey to data excellence is not a one-off project. It’s an ongoing process. By establishing regular feedback loops with both internal teams and the public, we ensure our approaches remain relevant and effective. Whether it is refining a data model or improving a public service, feedback becomes our compass, guiding us towards better outcomes.
As we delve deeper into data utilisation, we must ensure that our practices are transparent, ethical and aligned with privacy regulations. This means implementing robust data governance frameworks, ensuring informed consent and regularly reviewing our methodologies to align with evolving ethical standards.
The roadmap to a data-driven future isn’t just about tools and tech, it’s a comprehensive approach that interweaves infrastructure, people, collaboration, feedback and ethics. It’s a journey we undertake with clarity, commitment and a collective spirit.
Before we embark on the journey, it’s crucial to understand where we stand and where we wish to be. This involves a thorough analysis of our existing data practices, infrastructure, skills and outcomes. By identifying gaps, we can prioritise areas needing immediate attention, and allocate resources effectively.
Once we have identified the gaps, the next step is crafting a strategic roadmap. This isn’t just a timeline, but a comprehensive plan detailing initiatives, KPIs, responsibilities and milestones. The roadmap acts as our navigational chart, ensuring all departments and stakeholders have clarity on the direction and their roles in the journey.
Change, especially on a large scale, can be daunting. That’s why I would recommend starting with pilot initiatives. These are smaller, controlled projects that allow us to test new methodologies, tools or strategies. Pilots provide invaluable insights, highlight potential roadblocks and give a glimpse of the potential benefits, thus building momentum for larger scale changes.
Successful pilots though shouldn’t remain isolated successes. The next phase involves scaling these initiatives to broader departments, or the entire public sector. Alongside scaling, integration is key. This means ensuring that new methodologies or tools seamlessly integrate with existing systems, fostering synergy rather than discord.
Of course, we need continuous review and iteration. The world of data is dynamic. What is effective today might be obsolete tomorrow, therefore our journey isn’t linear but cyclical. Continuous reviews, feedback collection and iterations ensure we remain on the right path, adjusting our sails as the winds of change blow.
Transitioning to a data driven future is not a leap but a deliberate phased progression. By understanding our current state, crafting a roadmap, testing our strategies, scaling successes and continuously refining our approach, we can bridge the gap between today’s challenges and tomorrow’s potential.
Let’s revisit the first step to driving value from data – understanding what you have. This means creating a comprehensive inventory or catalogue of all data assets across the public sector.
Cataloguing isn’t merely about listing data sets, but documenting their sources, the last date they were updated, the team responsible for them, and all associated metadata. Tools like data catalogues or data management platforms can aid this process.
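As a rough sketch of what a catalogue entry could capture, here is a minimal Python example. The field names and the staleness check are purely illustrative assumptions, not a standard schema:

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical catalogue entry: field names are illustrative, not a standard.
@dataclass
class CatalogueEntry:
    name: str
    source_system: str
    owning_team: str
    last_updated: date
    metadata: dict = field(default_factory=dict)

catalogue = [
    CatalogueEntry(
        name="patient_visits",          # invented dataset name
        source_system="EHR",
        owning_team="Health Analytics",
        last_updated=date(2023, 9, 1),
        metadata={"update_frequency": "daily", "contains_pii": True},
    ),
]

def stale_entries(entries, as_of, max_age_days=90):
    """Flag datasets not updated within an agreed window."""
    return [e.name for e in entries
            if (as_of - e.last_updated).days > max_age_days]
```

Even a simple structure like this lets us answer basic questions automatically, such as which datasets have gone stale or which contain personal data.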
Once we know what data we possess, we need to assess its quality. This means identifying any gaps, errors or inconsistencies that could impair its utility. Techniques like data profiling can give us an overview of our data quality issues. Different departments or sources might have varying formats or units for similar data. For instance, dates might be formatted differently, or measurement units might vary. This can create challenges in analysis. Implementing a consistent standard ensures that data from multiple sources can be integrated seamlessly, and automated processes can assist in maintaining this consistency.
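To make the date-format example concrete, here is a small Python sketch that normalises dates from a few assumed departmental formats to a single ISO 8601 standard. The list of known formats is an assumption for illustration:

```python
from datetime import datetime

# Illustrative: three assumed departmental date formats, normalised to ISO 8601.
KNOWN_FORMATS = ["%d/%m/%Y", "%Y-%m-%d", "%d %B %Y"]

def to_iso_date(raw: str) -> str:
    """Try each known format in turn; return the date as YYYY-MM-DD."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognised date format: {raw!r}")
```

A real pipeline would extend the format list per source system, but the principle stands: agree one canonical representation and convert everything to it at the point of integration.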
Redundant or duplicated data not only consumes unnecessary storage, but it can also skew analysis. Identifying and removing such redundancies is important. De-duplication tools or algorithms can scan datasets for repeated entries, ensuring that each piece of data is unique and relevant.
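The core of de-duplication can be sketched in a few lines of Python. This keeps the first occurrence of each record, keyed on whichever fields define identity (a simplifying assumption; real de-duplication often also needs fuzzy matching):

```python
def deduplicate(records, key_fields):
    """Keep the first occurrence of each record, keyed on the given fields."""
    seen = set()
    unique = []
    for rec in records:
        key = tuple(rec[f] for f in key_fields)  # identity = chosen key fields
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```

The hard part in practice is choosing `key_fields` well: exact-match keys miss near-duplicates such as misspelt names, which is where probabilistic matching tools come in.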
Data storage and cleaning isn’t a one-off task. As new data gets added and older data gets updated, maintaining quality becomes an ongoing commitment. Implementing tools that monitor data quality in real time, flagging inconsistencies or errors as they arise, can ensure that the public sector’s assets remain reliable and actionable. Data discovery and cleaning form the bedrock of all subsequent data initiatives. Without a clear understanding of what data assets are available and then ensuring they are of the highest quality, efforts in analytics, machine learning or any data-driven decision-making might be compromised.
Investing time and resources in these foundational steps ensures that every subsequent layer built on this foundation is robust and reliable.
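Ongoing quality monitoring can start very simply: a set of agreed rules run against every new batch of records. The rules below (a present identifier, an age within a plausible range) and the field names are invented for illustration; real checks would be agreed per dataset:

```python
# Illustrative quality rules; field names and thresholds are assumptions.
def check_record(rec):
    issues = []
    if not rec.get("nhs_number"):            # hypothetical identifier field
        issues.append("missing identifier")
    if rec.get("age") is not None and not (0 <= rec["age"] <= 120):
        issues.append("age out of range")
    return issues

def flag_issues(records):
    """Map record index -> list of issues, for records that fail any check."""
    return {i: probs for i, rec in enumerate(records)
            if (probs := check_record(rec))}
```

Running checks like these on every load, and routing the flagged records to the owning team, is what turns data quality from a one-off clean-up into a standing commitment.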
In the diverse public sector landscape, it is crucial that we are all speaking the same language when it comes to data. Establishing a unified data lexicon ensures clarity and reduces potential misunderstandings. This could involve creating glossaries, documentation or even training modules, to ensure terms, metrics and methodologies are consistent across departments.
You could have cross-departmental workshops. One department’s challenge could be another’s success story. By organising workshops that bring together teams from different sectors, we foster an environment of shared learning. These sessions could be centred around problem solving, brainstorming or even demonstrations of successful data initiatives, providing valuable insights and stirring innovation.
A shared vision isn’t just about collaboration between departments but ensuring that stakeholders from all levels from the frontline workers through to the top tier management are all aligned. This involves regular communication around goals, progress and benefits. Demonstrating early wins can significantly aid in getting buy-in, creating momentum for larger scale initiatives.
Collaboration isn’t a one-off event though, but a continual dialogue. Establishing dedicated channels, be it digital platforms, regular meetings or even suggestion boxes, ensures that feedback flows seamlessly.
This dialogue fosters a sense of ownership and involvement, ensuring that potential issues are flagged early, and innovations are continually shared.
It is important to celebrate collaborative milestones. Recognising and celebrating collective achievements plays a pivotal role in fostering collaboration, whether it is a successful cross-departmental project or achieving a data-related milestone. Such celebrations highlight the value of working together. Beyond just morale, they reinforce the idea that in the journey of data transformation, collective victories far outweigh isolated successes.
Collaboration and a shared vision are not just strategic imperatives, but the very lifeblood of a successful data-driven transformation. By speaking the same language, sharing learnings, aligning stakeholders, maintaining open dialogue and celebrating collective achievements, we pave the way for a holistic, synergistic evolution.
It’s also important to have transparent data collection. In the realm of public sector, transparency isn’t just a best practice, it is a mandate. Every time data is collected, its purpose, source and application should be clear. By effectively communicating why we are gathering specific data and how it will be used, we foster an environment of trust and understanding. Public consultations and open forums can really help in this transparency.
But with the increasing volume of data, it is important to safeguard the personal and sensitive information of our citizens. Data should be anonymised wherever possible, and stringent encryption should be in place. Additionally, we must abide by legal frameworks like GDPR, ensuring that the data is handled, stored and deleted in compliance with established regulations.
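One common anonymisation building block is keyed pseudonymisation: replacing an identifier with a salted hash, so records can still be linked across datasets without exposing the raw value. This is a minimal sketch; in production the salt would be a managed secret, and pseudonymisation alone is not sufficient for full GDPR anonymisation:

```python
import hashlib

# Sketch only: in production the salt is a managed secret, never a literal.
SALT = b"example-secret-salt"

def pseudonymise(identifier: str) -> str:
    """Replace an identifier with a salted SHA-256 digest.

    The same input always yields the same token, so datasets can still be
    joined on the token without revealing the underlying identifier.
    """
    return hashlib.sha256(SALT + identifier.encode()).hexdigest()
```

Note the trade-off: deterministic tokens preserve linkability for analysis, but that same property means they are pseudonymous rather than fully anonymous, so they still need to be handled as personal data under GDPR.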
As data-driven decisions become integral, it is important to ensure that these decisions are free from biases. Whether it is machine learning models or data collection methodologies, actively seeking and eliminating biases ensures fairness. Regular audits, third-party reviews and inclusive data teams can play a crucial role in mitigating potential biases in our data processes.
Also there are governance frameworks. To maintain ethical standards, governance frameworks are essential. These frameworks establish clear rules regarding data access, usage, sharing and disposal. By establishing rules and responsibilities, setting up data committees and conducting regular reviews, we ensure accountability at every stage of the data lifecycle. Ultimately, all of these measures culminate in one primary goal: upholding the trust of our citizens. Our responsibility is to ensure that the public’s data is used to enhance their welfare and is not misused or mishandled.
Regular public reports, open data initiatives and feedback channels can play a significant role in fostering and maintaining this trust.
As we embark on this data-driven journey, our moral compass must be as strong as our technical prowess. By ensuring transparent, private, unbiased, well-governed and trust-centric practices, we stand by our commitment to serve the public ethically and responsibly.
We also have to be aware of data silos. Within large organisations, especially the public sector, different departments often store and manage their data independently. These are known as data silos. While these silos can cater to departments’ specific needs, they pose challenges when attempting to get a holistic view or when cross-departmental analysis is required. So, overcoming these silos is the first step towards successful data integration.
Instead of disparate systems across departments, leveraging unified data across all platforms ensures all data is available from a central location. This doesn’t mean all data is stored in one place, but that it can be accessed and analysed cohesively. Solutions like data warehouses or data lakes, especially those on cloud platforms, can enable such unified access, ensuring that data from various departments can be queried together seamlessly.
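To illustrate what “queried together seamlessly” means in practice, here is a toy sketch using Python’s built-in SQLite: two invented departmental extracts loaded into one store and joined in a single query. Table names, columns and figures are all hypothetical:

```python
import sqlite3

# Toy unified store: two invented departmental extracts, joinable in one query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE health_visits (citizen_id TEXT, visits INTEGER)")
conn.execute("CREATE TABLE transport_usage (citizen_id TEXT, journeys INTEGER)")
conn.executemany("INSERT INTO health_visits VALUES (?, ?)",
                 [("a1", 3), ("b2", 1)])
conn.executemany("INSERT INTO transport_usage VALUES (?, ?)",
                 [("a1", 40), ("b2", 12)])

# A cross-department question answered in one place.
rows = conn.execute("""
    SELECT h.citizen_id, h.visits, t.journeys
    FROM health_visits h
    JOIN transport_usage t ON h.citizen_id = t.citizen_id
""").fetchall()
```

A real warehouse or lakehouse replaces SQLite with a scalable platform, but the point is the same: once data lands in a commonly accessible, consistently keyed store, cross-departmental questions become single queries rather than multi-month data-sharing exercises.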
Data standardisation is crucial for integration. This means ensuring that every department adheres to agreed formats, terminologies and units. This is not merely a technical undertaking; it often requires collaboration and consensus building across departments. Establishing a central governance body can really help in laying down and enforcing these standards.
Beyond the technical aspects, successful data integration requires a culture shift. Departments need to view data not as their individual asset, but as a collective resource. Regular interdepartmental meetings, workshops and shared projects can foster this sense of collective ownership and vision. Encouraging a culture where data is freely but responsibly shared can break down barriers and improve integration.
While integration aims to make data more accessible, it is essential to strike a balance with data protection and privacy. Not all data should be accessible to all individuals or departments. Clear governance policies, role-based access controls and audit trails ensure that while data is integrated and available, it is also protected and used responsibly.
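The shape of role-based access control with an audit trail can be sketched very simply. The roles, datasets and policy below are invented examples; a real deployment would use the platform’s own access-control and audit facilities rather than application code:

```python
# Hypothetical policy: which roles may read which dataset.
ACCESS_POLICY = {
    "patient_records": {"clinician", "health_analyst"},
    "traffic_counts": {"transport_planner", "health_analyst"},
}

AUDIT_LOG = []  # every access decision is recorded, allowed or not

def can_access(role: str, dataset: str) -> bool:
    """Check the policy and record the decision in the audit trail."""
    allowed = role in ACCESS_POLICY.get(dataset, set())
    AUDIT_LOG.append((role, dataset, allowed))
    return allowed
```

Two properties matter here: access defaults to denied for any dataset or role not in the policy, and every check is logged, so misuse can be detected after the fact as well as prevented up front.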
Integrating data across departments is both a technical and a cultural journey. It requires robust platforms, standards and tools. Equally it demands collaboration, trust and a shared vision among the departments. When executed well, it can transform isolated data assets into a powerful, cohesive resource that can drive better decision-making across the entire public sector, but it is going to require commitment from leadership.
The most formative shifts in organisational culture often stem from the top. Leadership must not only support data initiatives but actively champion them. This means being vocal about the value of data, dedicating resources to data initiatives and leading by example. For the public sector this could manifest in regular data strategy reviews at executive meetings, visible participation in data workshops or even public talks emphasising the role of data in enhancing public services.
Continuous learning. The data landscape is continuously evolving. Ensuring everyone stays informed and skilled is crucial. Consider establishing regular workshops and training programmes focused on data literacy and advanced data techniques. Partnerships with universities or data-focused institutions can also be beneficial, bringing in external expertise and fresh perspectives.
It is not enough to preach the value of data. Teams need to have the right tools and resources to act on it. Invest in modern data platforms and tools, ensuring that they are both accessible and user-friendly. Also ensure that there is a support structure in place, perhaps a dedicated data team or helpdesk to assist other departments in their data endeavours.
It also requires open communication. Building a data-ready culture involves breaking down any barriers to communication. Every individual should feel comfortable raising questions, offering feedback or suggesting new data-driven initiatives. Regular forums or town hall meetings where data projects are discussed can be valuable. Such platforms encourage dialogue, helping address concerns and fostering a sense of collective ownership of the data journey.
Recognising and celebrating data successes can boost morale and further embed the value of data within the organisational psyche. Whether it is a successful project outcome, an innovative data-driven solution or an individual accomplishment in data training, taking the time to highlight and applaud these achievements, whether through awards or internal communications, reinforces the value of data and motivates continued excellence.
Building a data-ready culture in the public sector is both an endeavour and an opportunity. It is about creating an environment where data isn’t just a tool but a fundamental part of the decision-making process, leading to enhanced public services and informed strategies.
Many departments often stop at basic descriptive analytics, which merely represents what has already happened. The true power lies in predictive analytics, where we anticipate future trends and patterns. For instance, by analysing historical data on public service usage during certain times of the year, we can anticipate future demands and allocate resources more efficiently.
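As a toy illustration of that seasonal idea, the sketch below forecasts demand for a given month as the average of that month in previous years. The history figures are invented, and a real forecast would use proper time-series methods, but it shows the jump from describing the past to anticipating the future:

```python
from collections import defaultdict
from statistics import mean

# Invented figures for illustration: (year, month, service_requests).
history = [
    (2021, 12, 950), (2022, 12, 1010),
    (2021, 6, 600), (2022, 6, 640),
]

def forecast(month):
    """Naive seasonal forecast: average of the same month in past years."""
    by_month = defaultdict(list)
    for _, m, demand in history:
        by_month[m].append(demand)
    return mean(by_month[month])
```

Even this naive baseline is enough to plan staffing for a predictable December peak; more sophisticated models then improve on it by capturing trend, special events and uncertainty.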
Machine learning and artificial intelligence are tools that can help the public sector identify patterns that might not be apparent to human analysts. Consider traffic management. Using these techniques, we can predict traffic congestion based on various factors, leading to better urban planning and public transport scheduling.
There are also custom analytics solutions. You can get off-the-shelf analytical tools which are useful, but the unique challenges of the public sector may require customised solutions. Whether it is in public health, transportation or public safety, tailored tools can provide more accurate and relevant insights.
Collaborating with data scientists and engineers to develop these solutions ensures that we are not just leveraging generic tools, but tools fine tuned for our specific data sets and challenges.
In today’s fast-paced world, waiting for end-of-month reports isn’t always feasible. Real-time analytics allows for decisions to be made on the fly, based on the most current data available. An example might be in an emergency response. Real-time data on an unfolding situation can help our emergency services respond more effectively, potentially even saving lives.
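A minimal real-time pattern is a rolling-window check over a stream of measurements, raising an alert when a moving average crosses a threshold. The metric (incident response time in minutes), window size and threshold below are illustrative assumptions:

```python
from collections import deque

class RollingAlert:
    """Alert when the rolling average of a streamed metric exceeds a threshold.

    Window and threshold are illustrative; here the metric is assumed to be
    incident response time in minutes.
    """
    def __init__(self, window=5, threshold=10.0):
        self.values = deque(maxlen=window)  # keeps only the last `window` values
        self.threshold = threshold

    def observe(self, minutes: float) -> bool:
        self.values.append(minutes)
        avg = sum(self.values) / len(self.values)
        return avg > self.threshold
```

Production streaming platforms add durability, scale and exactly-once semantics, but the decision logic, a bounded window and a threshold evaluated on every new observation, is this small.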
Advanced analytics is not a one-time activity. Setting up feedback loops means that the insights derived from the data are continually used to refine processes which in turn produce new data to be analysed. This iterative process ensures that our analytical models are continuously updated, refined and improved upon. It fosters a culture of ongoing learning and optimisation.
It is important to stress that advanced analytics, when leveraged correctly, can be a game-changer. It provides the tools to not only understand our current situation but to anticipate future challenges and opportunities. It allows for proactive rather than reactive decision-making, ensuring the public sector remains agile, efficient and effective in serving the public.
Let’s take a look at potential challenges.
There are data privacy concerns. In the age of GDPR and heightened public awareness about data privacy, ensuring the confidentiality of collective data is paramount. A possible solution could be to implement robust encryption to safeguard the data. Use of anonymisation techniques to remove personally identifiable information ensures that the data analysis doesn’t compromise individual privacy.
Regularly review and update data handling policies to stay compliant with evolving regulations.
There is also a problem of data silos across departments. As already discussed, it is common for different departments within the public sector to operate in silos, leading to fragmented and isolated data pools.
A possible solution – the government’s national data strategy seeks to address this by promoting cross-departmental data sharing. It advocates fostering a culture of collaboration, encouraging departments to work together, share insights and align their data strategies.
Then there is the problem of quality and consistency of data. Inconsistent or low-quality data can lead to inaccurate insights, undermining the value of any data-driven initiative. One way to alleviate this would be via regular data audits to identify and rectify inconsistencies. Establishing standard operating procedures for data collection ensures uniformity across all departments, and implementing regular automated data cleaning processes maintains the integrity of datasets.
Then there are the skills gaps in data engineering, data science and analysis. Not every department may have the expertise to build the platforms needed or extract meaningful insights from their data. However, they could invest in continuous training programmes, equipping staff with the latest data tools and techniques. They could also collaborate with external experts or data consultants where needed, tapping into their specialist knowledge to bridge any gaps and mentor their employees.
Then of course there is the problem of resistance to change. Transitioning to a data-centric approach can be met with resistance, particularly from those accustomed to traditional methods. It is vital to engage all stakeholders early on, explaining the benefits and value of a data-driven approach.
Demonstrate quick wins or early successes to build confidence and show tangible results. Provide support and resources to ease the transition, ensuring everyone feels equipped and empowered to embrace the new approach.
While the journey to unlock the value of untapped data may present challenges, with proactive strategies and a solutions-oriented mindset, these hurdles can be effectively addressed. The potential rewards in terms of improved efficiency, public service and insights far outweigh the challenges.
Let’s think about a hypothetical case, let’s say a health department. One of the ongoing challenges faced by many health departments today is the increasing wait times for patients, impacting patient satisfaction and potentially even health outcomes.
A possible goal could be to understand the root causes, and devise strategies to address them.
When identifying the data sources, we could employ a comprehensive, data-driven approach. Electronic health records provide a wealth of data about patient visits, times and treatments, while patient feedback surveys give insights into patients’ experiences and potential areas of dissatisfaction. We could also use staff rotas and schedules to help us understand human resource allocation and its impact on wait times.
With these data sources, time series analysis could be conducted to understand trends and patterns in wait times across different hours of the day, days of the week or even seasons. Additionally, bottlenecks could be identified by mapping out the patient journey from entry to exit and measuring the time spent at each stage, allowing the department to pinpoint where delays were most significant. Was it at reception, during diagnostics, or while waiting for a particular treatment?
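As a rough sketch of that analysis, the plain-Python example below computes the average total wait by arrival hour and the average wait at each stage of the journey, then names the slowest stage. The stage names and timings are entirely made up for the example:

```python
# Illustrative sketch: average wait per hour of day and per stage,
# to surface peak periods and the slowest step in the patient journey.
# Stage names and wait times are invented for the example.
from collections import defaultdict
from statistics import mean

visits = [
    # (arrival_hour, {stage: minutes spent waiting at that stage})
    (9,  {"reception": 5, "diagnostics": 40, "treatment": 10}),
    (9,  {"reception": 8, "diagnostics": 55, "treatment": 12}),
    (14, {"reception": 3, "diagnostics": 20, "treatment": 9}),
]

waits_by_hour = defaultdict(list)
waits_by_stage = defaultdict(list)
for hour, stages in visits:
    waits_by_hour[hour].append(sum(stages.values()))
    for stage, minutes in stages.items():
        waits_by_stage[stage].append(minutes)

avg_by_hour = {h: mean(w) for h, w in waits_by_hour.items()}
avg_by_stage = {s: mean(w) for s, w in waits_by_stage.items()}
bottleneck = max(avg_by_stage, key=avg_by_stage.get)

print("average total wait by hour:", avg_by_hour)
print("slowest stage:", bottleneck)
```

With real electronic health record data the same grouping logic would run over timestamped events rather than a hand-built list, but the idea of aggregating by hour and by stage is the same.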
Then a possible outcome could be that, armed with these insights, the health department could implement several changes. For instance, if delays were found at certain times of day due to a rush of patients, additional staff could be scheduled during those periods. If a particular diagnostic machine was identified as a bottleneck, efforts could be made to optimise its usage or invest in additional equipment.
The result could be a tangible reduction in patient wait times, leading to improved patient satisfaction and crucially, better health outcomes as patients receive more timely care.
However, the success of this initiative would not mean the end of the analytical approach. The department would need to set up continuous monitoring tools to keep an eye on the wait times and other metrics, ensuring that any future challenges could be swiftly identified and addressed.
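One simple form such a monitoring tool could take is a rolling-average check that raises an alert when recent wait times drift above a threshold. The window size and threshold below are illustrative choices, not recommendations:

```python
# Minimal monitoring sketch: keep a rolling window of recent wait
# times and alert when the average breaches a threshold.
# Window size and threshold are illustrative choices.
from collections import deque
from statistics import mean

class WaitTimeMonitor:
    def __init__(self, window=5, threshold_minutes=45):
        self.recent = deque(maxlen=window)  # only the last `window` waits
        self.threshold = threshold_minutes

    def record(self, wait_minutes):
        """Record a wait; return True if the rolling average breaches the threshold."""
        self.recent.append(wait_minutes)
        return mean(self.recent) > self.threshold

monitor = WaitTimeMonitor(window=3, threshold_minutes=45)
for wait in [30, 40, 50, 60, 70]:
    if monitor.record(wait):
        print(f"alert: rolling average above threshold after a {wait} minute wait")
```

A production system would feed this from live appointment data and route alerts to the relevant team, but the core pattern of a bounded window plus a threshold is the same.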
Patient feedback must be continuously solicited and integrated, ensuring the solutions remain patient-centric.
Our hypothetical health department case shows how a methodical, data-driven approach can transform a significant challenge into an opportunity for improvement. It illustrates that with the right data, analytical tools and a commitment to continuous improvement, even complex public sector challenges can be effectively addressed.
Let’s have a look at the key takeaways.
As we’ve discussed, our data repositories are like mines, brimming with potential insights. We have only scratched the surface. The real challenge and opportunity lie in digging deeper, asking the right questions and seeking transformative answers that can reshape the ways we can serve the public.
The power of data is magnified when combined with collaborative efforts. The public sector’s strength lies in its diverse departments and teams. By fostering cross-departmental collaborations and data sharing we can harness our collective knowledge for the greater good.
Data maturity isn’t a destination but an ongoing journey. As technology evolves and our capabilities grow, we must continuously strive to elevate our data strategies, ensuring we remain at the cutting edge of public service innovation.
At the heart of our data endeavours is our commitment to the public. Every insight drawn, every decision made is directed towards enhancing public services, ensuring transparency and creating a more connected, informed and resilient society.
As for next steps, as we move forward it is essential to carry the momentum from today’s discussion. Let’s challenge ourselves to prioritise data in our strategies, invest in building our skills and capabilities and continually seek opportunities to transform our data into actionable insights. In our data lies the power to redefine, reimagine and revolutionise the public services we offer, if we embrace this data-driven future.
This webinar is just one in Made Tech’s regular series all about data, called The Pipeline. Details on how you can subscribe are on the Events section of our website. Please check it out if you get a chance. Thank you for your time today. If you would like to continue the conversation or find out more about the framework that Made Tech offers, please feel free to reach out to one of us.
On screen are the contact details but equally there is a contact section on the Made Tech website.

Got a tricky data question you’re struggling to find the answer to? Tune in as Chasey and other speakers from our data webinar series answer your top data questions.


How great is the hidden potential of your data transformation?

We know the public sector collects and stores a substantial amount of data, but there are recurring challenges to optimising its value, ranging from technical issues to privacy concerns.

Join our Principal Data Engineer Chasey Davies-Wrigley as she shares why finding opportunities in your organisation’s data is more than just an exercise in data maturity assessments. Chasey will help you to look at where you are now, including how to find your data value, all the way to where your data transformation could be, with tips on how to get there. 


There’s a lot of public-sector knowledge about data locked away inside different organisations. And much like data itself, we want to open it up. 

This webinar is part of The Pipeline, a collection of talks by our data community breaking down data concepts, exploring data challenges and sharing the lessons we’ve learned from our years in the public sector.


Date

Wednesday, 20 September 2023

Speakers

Chasey Davies-Wrigley

Principal Data Engineer at Made Tech

Chasey Davies-Wrigley is a technology leader with a highly technical background, including an MSc in Computer Science and over 20 years’ industry experience. At Made Tech, Chasey is focused on empowering the public sector to deliver and continuously improve digital services that are user-centric, data-driven and free from legacy technology.
