• 4#10 - Geir Myrind - The Revival of Data Modeling (Nor)
    Feb 3 2025

    "Vi modellerer for å forstå, organisere og strukturere dataene." / "We model to understand, organize, and structure the data."

    This episode with Geir Myrind, Chief Information Architect, offers a deep dive into the value of data modeling in organizations. We explore how unified models can enhance the value of data analysis across platforms and discuss the technological development trends that have shaped this field. Historical shifts toward more customized systems have also challenged the way we approach data modeling in public agencies such as the Norwegian Tax Administration.

    Here are my key takeaways:
    Standardization

    • Standardization is a starting point for building a foundation, but not something that lets you advance beyond best practice.
    • Use standards to agree on ground rules that can frame our work and make it interoperable.
    • Conceptual modeling is about understanding a domain, its semantics and key concepts, using standards to ensure consistency and support interoperability.

    Data Modeling

    • Modeling is an important method to bridge business and data.
    • Increasingly, these conceptual models are relevant for people outside data and IT who want to understand how things relate.
    • Models can be understood by both humans and machines.
    • If you are too application-focused, data will not reach its potential and you will not be able to use data models to their full benefit.
    • This application focus, which has been prominent in mainstream IT for many years, is probably the reason why data modeling has lost some of its popularity.
    • Tool advancement and new technology can have an impact on Data Management practices.
    • New tools need a certain data readiness, a foundation to create value, e.g. a good metadata foundation.
    • Data Modeling has often been viewed as a bureaucratic process with little flexibility.
    • Agility in Data Modeling is about modeling being an integrated part of the work - be present, involved, addressed.
    • The information architect and data modeling cannot be a secretary to the development process; they need to be involved as an active part of the cross-functional teams.
    • Information needs to be connected across domains and therefore information modeling should be connected to business architecture and process modeling.
    • Modeling tools are too often connected only to the discipline you are modeling within (e.g. different tools for Data vs. Process Modeling).
    • There is substantial value in understanding what information and data is used in which processes and in what way.
    • The greatest potential is within reusability of data, its semantics and the knowledge it represents.

    The role of Information Architect

    • Information Architects have played a central role for decades.
    • While the role itself is stable, it faces different challenges today.
    • Information is in constant flux, and its movement needs to be understood, be it through applications or processes.
    • While modeling is a vital part of the work, Information Architects need to keep a focus on the big picture and the overarching architecture.
    • Information architects are needed both in projects and within domains.
    • There is a difference between Information and Data Architects. Data Architects focus on the data layer, within the information architecture, much closer to decisions made in IT.
    • The biggest change in skills and competency needs for Information Architects is that they have to navigate a much more complex and interdisciplinary landscape.

    Metadata

    • Data Catalogs typically include components on Metadata Management.
    • We need to define Metadata more broadly - it is much more than data about data; it is rather data about things.
    41 mins
  • 4#9 - Marte Kjelvik & Jørgen Brenne - Healthcare Data Management: Towards Standardization and Integration (Nor)
    Jan 13 2025

    "Den største utfordringen, det viktigste å ta tak i, det er å standardisere på nasjonalt nivå. / The biggest challenge, the most important thing to address, is standardizing at the national level."

    The healthcare industry is undergoing a significant transformation, driven by the need to modernize health registries and create a cohesive approach to data governance. At the heart of this transformation is the ambition to harness the power of data to improve decision-making, streamline processes, and enhance patient outcomes. Jørgen Brenne, as a technical project manager, and Marte Kjelvik’s team, have been instrumental in navigating the complexities of this change. Their insights shed light on the challenges and opportunities inherent in healthcare data modernization.

    Here are my key takeaways:
    Healthcare data and registry

    • It's important to navigate different requirements from different sources of authority.
    • Maintaining comprehensive, secure, and well-managed data registries is a challenging task.
    • We need a nationally standardized language to create a common understanding of health data, what services we offer within healthcare, and how they align.
    • Authorities also need to standardize requirements for code and systems.
    • The national healthcare data registry needs to be more connected to the healthcare services, to understand data availability and data needs.

    Competency

    • Data Governance and Data Management are the foundational needs the registry has recognized.
    • Dimensional Modeling was one of the first classes they trained their data team on, to ensure this foundational competency.
    • If the technology you choose supports your methodology, your recruitment of new resources becomes easier, since you don’t need to get experts on that very methodology.
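    Dimensional modeling, mentioned above as a foundational class, can be sketched as a small star schema. This is a minimal illustration; the table and column names are invented for the example, not from the episode.

```python
# Star schema: one fact table of measurable events, joined to
# descriptive dimension tables via surrogate keys.
dim_hospital = {1: {"name": "St. Olavs", "region": "Midt"},
                2: {"name": "Ullevål", "region": "Øst"}}
dim_date = {20250101: {"year": 2025, "month": 1}}

fact_admissions = [
    {"date_key": 20250101, "hospital_key": 1, "admissions": 12},
    {"date_key": 20250101, "hospital_key": 2, "admissions": 30},
    {"date_key": 20250101, "hospital_key": 1, "admissions": 5},
]


def admissions_by_region() -> dict:
    """Aggregate the fact table along a dimension attribute."""
    totals = {}
    for row in fact_admissions:
        region = dim_hospital[row["hospital_key"]]["region"]
        totals[region] = totals.get(region, 0) + row["admissions"]
    return totals


print(admissions_by_region())  # prints {'Midt': 17, 'Øst': 30}
```

    Keeping facts (events you count or measure) separate from dimensions (descriptive context) is what makes such models straightforward to query and to explain to business users.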

    Models

    • User stories are a focus point and prioritized.
    • Data Lineage (how data has changed through different systems) is not the same as Data Provenance (where the data originated). You need both to understand business logic and the intent of collection - user stories can help establish that link.
    • Understanding basic concepts and entities accounts for 80% of the work.
    • Conceptual models are deliberately kept free of technical elements.
    • These models should be shareable, as a way to explain your services externally.
    • Cloud first provides an open basis to work from, which can be seen as an opportunity.
    • There are many possibilities to ensure security, availability, and discoverability.
    • Digitalization in Norwegian public services has brought forth a set of common components, that agencies are encouraged to use across public administration.
    • Work based on experiences and exchange with others, while ensuring good documentation of processes.
    • Find standardized ways of building logical models, based on Data Contracts.
    • By using global business keys, you can ensure that you gain structured insight into the data that is transmitted.
    • Low-code tools generate generic code based on the model, to ensure effective distribution and storage of that data in the registry.
    • The logical model needs to capture the data needs of the users.
    • Data Vault 2.0 is used as a modeling method to process new data sources while adhering to a logical structure.
    • A discipline reference group has been established to ensure business alignment and verification of the models.
    • Data should be catalogued as soon as it enters the system to capture the accompanying logic.
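    The lineage-versus-provenance distinction above can be sketched minimally; the record structure and field names are assumptions for illustration.

```python
# Provenance: where the data originated. Lineage: how it changed on the way.
record = {
    "value": "OSLO",
    "provenance": {"source_system": "FREG", "collected": "2025-01-10"},
    "lineage": [],  # appended to by every transformation step
}


def transform(rec: dict, step: str, fn) -> dict:
    """Apply a transformation and record it in the lineage trail."""
    rec["value"] = fn(rec["value"])
    rec["lineage"].append(step)
    return rec


transform(record, "trim", str.strip)
transform(record, "titlecase", str.title)
# Provenance is unchanged; lineage now documents both steps.
```

    User stories can then reference both: why the data was collected (provenance, intent) and what has happened to it since (lineage, business logic).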

    Data Vault

    • Adaptable to change and able to coordinate different sources and methods.
    • It supports change of formats without the need to change code.
    • It makes parallel data processing possible at scale.
    • Yet due to the heterogeneity of Data Vault, you need some tooling to manage it.
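    A minimal sketch of the Data Vault ideas above (hubs keyed by global business keys, satellites absorbing changing attributes); the entity and column names are invented for illustration:

```python
import hashlib


def business_key_hash(*parts: str) -> str:
    """Derive a deterministic global business key (hash key) so the same
    real-world entity gets the same key across source systems."""
    normalized = "||".join(p.strip().upper() for p in parts)
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


# Hub: one row per unique business key (the stable core of the model).
hub_patient = {}
# Satellite: descriptive attributes over time; new formats or columns
# land here without touching the hub load logic.
sat_patient_details = []


def load_record(bk: str, attrs: dict, source: str, ts: str) -> None:
    hk = business_key_hash(bk)
    hub_patient.setdefault(hk, {"bk": bk, "source": source, "load_ts": ts})
    sat_patient_details.append({"hash_key": hk, "attrs": attrs, "load_ts": ts})


# Two deliveries about the same patient converge on one hub row.
load_record("12345", {"name": "Kari"}, source="EPJ", ts="2025-01-01")
load_record("12345", {"name": "Kari", "ward": "A"}, source="NPR", ts="2025-01-02")
```

    Because descriptive attributes only ever append to satellites, a changed source format typically means a new or extended satellite rather than rewritten hub logic, which is where the adaptability comes from.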
    31 mins
  • Holiday Special: Joe Reis - A Journey around the World of Data (Eng)
    Dec 16 2024

    «Data Management is an interesting one: If it fails, what’s the feedback loop?»

    For the Holiday Special of Season 4, we’ve invited the author of «Fundamentals of Data Engineering», podcast host of the «Joe Reis Show», «Mixed Model Arts» sensei, and «recovering Data Scientist» Joe Reis.
    Joe has been a transformative voice in the field of data engineering and beyond.
    He is also the author of the upcoming book with the working title "Mixed Model Arts", which redefines data modeling for the modern era.

    This episode covers the evolution of data science, its early promise, and its current challenges. Joe reflects on how the role of the data scientist has been misunderstood and diluted, emphasizing the importance of data engineering as a foundational discipline.
    We explore why data modeling—a once-vital skill—has fallen by the wayside and why it must be revived to support today’s complex data ecosystems.
    Joe offers insights into the nuances of real-time systems, the significance of data contracts, and the role of governance in creating accountability and fostering collaboration.

    We also highlight two major book releases: Joe’s "Mixed Model Arts", a guide to modernizing data modeling practices, and our host Winfried Etzel’s book on federated Data Governance, which outlines practical approaches to governing data in fast-evolving decentralized organizations. Together, these works promise to provide actionable solutions to some of the most pressing challenges in data management today.

    Join us for a forward-thinking conversation that challenges conventional wisdom and equips you with insights to start rethinking how data is managed, modeled, and governed in your organization.

    Some key takeaways:

    Make Data Management tangible

    • Data Management is not clear enough to be understood, to have feedback loops, to ensure responsibility, or to define what good looks like.
    • Because Data Management is not always clear enough, there is a pressure to make it more tangible.
    • That pressure is also applied to Data Governance, through new roles like Data Governance Engineers, DataGovOps, etc.
    • These roles mix enforcing policies with designing policies.

    Data Contracts

    • Shift Left in Data needs to be understood more clearly, towards a closer understanding and collaboration with source systems.
    • Data Contracts are necessary, but they are no different from interface files in software. It's about understanding behavior and expectations.
    • Data Contracts are not only about controlling, but also about making issues visible.
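    The comparison to interface files can be made concrete with a small, hypothetical contract check; the field names and rules are assumptions for illustration:

```python
# A data contract as a plain schema: the producer promises structure and
# expectations; the consumer validates and surfaces violations instead of
# silently ingesting bad data.
CONTRACT = {
    "order_id": {"type": str, "required": True},
    "amount": {"type": float, "required": True, "min": 0.0},
    "currency": {"type": str, "required": False},
}


def validate(record: dict, contract: dict = CONTRACT) -> list:
    """Return a list of violations; an empty list means the record conforms."""
    issues = []
    for field, rules in contract.items():
        if field not in record:
            if rules.get("required"):
                issues.append("missing required field: " + field)
            continue
        value = record[field]
        if not isinstance(value, rules["type"]):
            issues.append(field + ": wrong type")
        elif "min" in rules and value < rules["min"]:
            issues.append(field + ": below minimum")
    return issues


print(validate({"order_id": "A-1", "amount": 99.5}))  # conforms -> []
print(validate({"amount": -1.0}))                     # two visible issues
```

    Returning the violations instead of raising immediately is one way to make issues visible, in line with the point above, rather than only blocking bad data.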

    Data Governance

    • Think of Data Governance as political parties. Some might be liberal, some more conservative.
    • We need to make Data Governance lean, integrated and collaborative, while at the same time ensuring oversight and accountability.
    • People need a reason to care about governance rules and to be held accountable.
    • If not, Data Governance «(...) ends up being that committee of waste.»
    • The current way Data Governance is done doesn’t work. It needs a new look.
    • Enforcing rules that people see no connection to, or ownership of, is doomed to fail.
    • We need to view ownership from two perspectives - a legal and a business perspective. They are different.

    Data Modeling

    • Business processes, domains and standards are some of the building blocks for data.
    • Data Modeling should be an intentional act, not something you do on the side.
    • The literature on Data Modeling is old; we are stuck in a table-centric view of the world.
    54 mins
  • 4#8 - Shuang Wu - Service Platform: From Analytics to AI-Driven Success (Eng)
    Nov 25 2024

    «We want to make data actionable.»

    Join us for an engaging conversation with Shuang Wu, Mesta's lead data engineer. We delve into the concept of platforms and explore how they empower autonomous delivery teams, making data-driven decisions a central part of their strategy.

    Shuang discusses the intricate process of evolving from a mere data platform to a comprehensive service platform, especially within organizations that aren't IT-centric. Her insights emphasize a lean, agile approach to prioritize use cases, focusing on quick iterations and prototypes that foster self-service and data democratization. We explore the potential shift towards a decentralized data structure where domain teams leverage data more effectively, driving operational changes and tangible business value in their pursuit of efficiency and impact.

    My key learnings:

    • It’s not just about gaining insights, but also about harmonizing and understanding data in context.
    • Find your SMEs and involve them closely - you need in-depth knowledge about the data, paired with engineering capabilities.
    • Over time the SMEs and the central data team share experiences and knowledge. This creates a productive ground for working together.
    • The more understanding business users gain on data, the more they want to build themselves.
    • The central team delivers core data assets in a robust and stable manner. Business teams can build on that.

    The Data

    • You can integrate and combine internal data with external sources (like weather data, or road network data) to create valuable insights.
    • Utilizing external data can save you effort, since it is often structured and API-ready.
    • Don't over-engineer solutions - find out what your user requirements are and provide data that matches the requirements, not more.
    • Use an agile approach to prioritize use cases together with your business users.
    • Ensure you have a clear picture of potential value, but also investment and cost.
    • Work in short iterations, to provide value quickly and constantly.
    • Understand your platform's constraints and limitations, also related to quality.
    • Find your WHY! Why am I doing the work and what does that mean when it comes to prioritization?
    • What is the value, impact and effort needed?


    Service Platform:

    • Is about offering self-service functionality.
    • Due to the size of Mesta it made sense to take ownership for many data products centrally, closely aligned with the platform.
    • Build it as a foundation, that can give rise to different digitalization initiatives.
    • If you want to make data actionable, it needs to be discoverable first.
    • The modular approach to the data platform allows you to scale up required functionality when needed, but also to scale to zero when it is not.
    • Verify requirements as early as you can.


    Working with business use cases

    • Visibility and discoverability of data remains a top priority.
    • Make data and AI literacy use-case-based, hands-on programs.
    • You need to understand constraints when selecting and working with a business use case.
    • Start with a time-bound requirements analysis process that also analyzes constraints within the data.
    • Once data is gathered and available on the platform, business case validity is much easier to verify.
    • Gather the most relevant data first, and then see how you can utilize it further once it is structured accordingly.
    • Quite often ideas originate in the business, and the central data team then validates whether the data can support the use case.


    41 mins
  • 4#7 - Victor Undli - From Hype to Innovation: Navigating Data Science and AI in Norway (Eng)
    Nov 4 2024

    «I think we are just seeing the beginning of what we can achieve in that field.»

    Step into the world of data science and AI as we welcome Victor Undli, a leading data scientist from Norway, who shares his insights into how this field has evolved from mere hype to a vital driver of innovation in Norwegian organizations. Discover how Victor's work with Ung.no, a Norwegian platform for teenagers, illustrates the profound social impact and value creation potential of data science, especially when it comes to directing young inquiring minds to the right experts using natural language processing. We'll discuss the challenges that organizations face in adopting data science, particularly the tendency to seek out pre-conceived solutions instead of targeting real issues with the right tools. This episode promises to illuminate how AI can enhance rather than replace human roles by balancing automation with human oversight.

    Join us as we explore the challenges of bridging the gap between academia and industry, with a spotlight on Norway's public sector as a cautious yet progressive player in tech advancement. Victor also shares his thoughts on developing a Norwegian language model that aligns with local values and culture, which could be pivotal as the AI Act comes into play. Learn about the unique role Norway can adopt in the AI landscape by becoming a model for small countries in utilizing large language models ethically and effectively. We highlight the components of successful machine learning projects: quality data, a strong use case, and effective execution, and encourage the power of imagination in idea development, calling on people from all backgrounds to engage.

    Here are my key takeaways:
    Get started as Data Scientist

    • Expectations come from working with cutting-edge tech and chasing the last percentage of precision.
    • Reality is much more messy.
    • Time management and choosing ideas carefully is important.
    • «I end up with creating a lot of benchmark models with the time given, and then try to improve them in a later iteration.»
    • Data Science studies are very much about deep-diving into models and their performance, almost unconcerned with technical limitations.
    • A lot of tasks when working with Data Science are in fact Data Engineering tasks.
    • Closing the gap between academia and industry is going to be hard.
    • Data Science is a team sport - you want someone to exchange with and work together with.

    Public vs. Private

    • There is a difference between the public and private sector in Norway.
    • Public sector in Norway is quite advanced in technological development.
    • Public sector acts more carefully.

    Stakeholder Management and Data Quality

    • It is important to communicate clearly and consistently with your stakeholders.
    • You have to compromise between stakeholder expectations and your constraints.
    • If you don't curate your data correctly, it will lose some of its potential over time.
    • Data Quality is central, especially when used for AI models.
    • Data Curation is also a lot about Data Enrichment - filling in the gaps.

    AI and the need for a Norwegian LLM

    • AI can be categorized into the brain and the imagination.
    • The brain is to understand, the imagination is to create.
    • We should invest time in creating an open-source Norwegian LLM as a competitive choice.
    • Language encapsulates culture. You need to embrace language to understand culture.
    • Norway's role is as a strong consumer of AI. That also means leading by example.
    • Norway and the Nordic countries can bring a strong ethical focus to the table.


    31 mins
  • 4#6 - Rasmus Thornberg - Decision Science and AI between Use Case and Product (Eng)
    Oct 14 2024

    «Focusing on the end-result you want, that is where the journey starts.»

    Curious about how Decision Science can revolutionize your business? Join us as our guest Rasmus Thornberg from Tetra Pak guides us through his journey of transforming complex ideas into tangible, innovative products.

    Aligning AI with business strategies can be a daunting task, especially in conservative industries, but it’s crucial for modern organizations. This episode sheds light on how strategic alignment and adaptability can be game-changers. We dissect the common build-versus-buy dilemma, emphasizing that solutions should focus on value and specific organizational needs. Rasmus's insights bring to life the role of effective communication in bridging the divide between data science and executive decision-making, a vital component in driving meaningful change from the top down.

    Learn how to overcome analysis paralysis and foster a learning culture. By focusing on the genuine value added to users, you can ensure that technological barriers don't stall progress. Rasmus shares how to ensure the products you build align perfectly with user needs, creating a winning formula for business transformation.

    Here are my key takeaways:
    Decision Science

    • You need to understand the cost of error of an ML/AI application.
    • The cost of error limits the usability of AI.
    • Decision Science is a broader take on Data Science, combining Data Science with Behavioral Science.
    • Decision Science covers cognitive choices that lead to decisions.
    • Decision Science can only work in close proximity to the end user and the product, something that has been a challenge for many.
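    The cost-of-error point can be illustrated with a simple expected-value sketch; the numbers are invented for illustration:

```python
def expected_cost(error_rate: float, cost_per_error: float, volume: int) -> float:
    """Expected cost of wrong predictions over a volume of decisions."""
    return error_rate * cost_per_error * volume


# Same model accuracy, very different usability depending on what one error costs:
spam_filter = expected_cost(error_rate=0.05, cost_per_error=0.10, volume=10_000)
medical_triage = expected_cost(error_rate=0.05, cost_per_error=50_000.0, volume=10_000)
```

    The same 95% accuracy that is fine for a cheap-to-correct decision becomes prohibitive when each error is expensive, which is what limits where AI can usefully be applied.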

    From Use Case to product

    • Lots of genAI use cases are about personal efficiency, not improving any specific organizational target.
    • Differentiating between genAI and analytical AI can help to understand what the target is.
    • genAI hype has created interest from many. You can use it as a vessel to talk about other things related to AI or even to push Data Governance.
    • When selecting use cases, think about adoption and how it will affect the organization at large.
    • When planning a use case, find where the uncertainties lie and what outcomes are achievable.
    • It's easy to jump to the HOW when solving business use cases, but you really need to identify the WHY and WHAT first.
    • Analysis-paralysis is a real problem when it comes to moving from ideation to action, or from PoC to operations.
    • «Assess your impact all the time.»
    • You need to have a feedback loop and concentrate on the decision making, not the outcome.
    • A good decision is based on the information you had available before you made a decision, not the outcome of the decision.
    • A learning culture is a precondition for better decision making.
    • If you correct your actions just one or two steps at a time, you can still go in the wrong direction. Sometimes you need to go back to start and see your entire progress.
    • The need for speed can lead to directional constraints in your development of solutions.
    • Be aware of measurements and metrics becoming the target.
    • When you build a product, you need to set a threshold for when to decommission it.

    Strategic connection

    • The more abstract you get, the higher the value you can create, but the risk also gets bigger.
    • The biggest value we can gain as companies is to adapt our business model to new opportunities.
    • The more organizations go into a plug-and-play mode, the lower the risk, but also the fewer the value opportunities.
    • Industrial organizations live with outdated constraints, especially when it comes to the cost of decision making.
    • Don't view strategy as a constraint, but rather as a direction that can provide flexibility.
    39 mins
  • 4#5 - Olga Sergeeva - Data and AI in Modern FMCG Supply Chains (Eng)
    Sep 30 2024

    «We made a transition from being a company that produces a lot of data, to a company which has control over the data we are producing.»

    Unlock the secrets of optimizing supply chains with data and AI through the lens of TINE, Norway's largest milk producer. Our guest, Olga Sergeeva, head of the analytics department at Tine, takes us on her journey from a passion for mathematics to spearheading digital transformation in the fast-moving consumer goods industry.

    Ever wondered how organizations can successfully integrate AI tools into their business processes? This episode dives into the uneven digital maturity across departments and the strategies used to overcome these challenges. We discuss how data visualization tools act as a gateway to AI, making advanced algorithms accessible without needing to grasp the technical nitty-gritty. Olga shares how TINE’s data department empowers users by providing crucial expertise while ensuring they understand the probabilistic nature of AI-generated data.

    Finally, discover how teamwork and a systematic approach can drive data adoption to new heights. From improving milk quality with predictive algorithms to optimizing logistics and production planning, we explore practical AI use cases within Tine's supply chain.

    Here are my takeaways:

    • Mathematics is a combination of beauty, art and structure.
    • Find your way in data and digitalization before jumping on the AI-train.
    • Ensure that people can excel at what they are best at - this is what Tine tries to do for the farmers.
    • Data only has value when it can be used - find ways to use data, from analytics to prediction to more advanced algorithms.
    • Create a baseline through a maturity assessment to see how you can tailor your work to the different business units.
    • Follow up and monitor the usage of your data tools in the different areas of your business.
    • Create a gateway into data for your business users: Once that gateway is established it is also easier to introduce new tools.
    • Data Literacy has a limit - not everyone in the business needs to be a data expert.
    • Yet you need someone you can trust to enable and provide guidance - the Data team.
    • Business users need to understand the difference between concrete answers and probability.


    • How do you transform a complex organization without breaking the culture?
    • Your data/digital/AI transformation team is key in ensuring good transformative action without breaking culture.
    • Ensure you have good ambassadors for your data work in the business units, who want to transfer their knowledge within their respective units.
    • Create a network of data-interested people, that help to drive adoption.
    • Engage people by showing an initial value.
    • Offer courses and classes for people to learn and understand more, but also to spread the word about your focus points.
    • Inhouse courses provided by your own staff can increase the confidence in your data team.


    • AI can mean different things to different people. It is important to define AI in your setting.
    • Don’t replace existing work processes with AI-driven solutions just for the sake of it. Find ways to focus on where improvement actually provides business value.
    • When you think of a new AI project, you have several options:
    1. Develop in house
    2. Buy off the shelf
    3. Do nothing
    • Option two should be your preferred solution.
    • AI strategy is part of a larger ecosystem, with conditions to adhere to.
    • Data and algorithms should become interconnected, also visually represented.
    • «Always remember your core business.»
    39 mins
  • 4#4 - May Lisbeth Øversveen - Data Strategy in Medium-sized organizations (Nor)
    Sep 16 2024

    "Det er vanskelig å komme seg ut av det jeg kaller: et excel-helvete. / It is hard to escape, what I call: Excel-hell."

    Are you wondering how medium-sized companies can handle data strategy and data governance effectively? Join us as we talk to May Lisbeth Øversveen, who has over 23 years of experience in the industry, and shares her expertise from Eidsiva Bredbånd. She provides us with insight into how to work with data maturity and the implementation of data strategy.

    How can mid-sized companies balance resources and create effective data governance strategies? May Lisbeth and I explore this topic in depth. We talk about the importance of involving the business units early in the process in order to create ownership and commitment around the improvement measures.

    Here are my key takeaways:

    • The way we talk about data as a profession has changed, the lingo has changed, and we adapt to trends.
    • To display and evaluate data from different sources that are not connected, Excel becomes the tool of choice.
    • Resources are tightly calculated, which limits your ability to set up substantial teams to work exclusively on e.g. Data Governance.
    • Data Governance in SMEs (small and medium-sized enterprises) can be modeled as a repeatable process that incrementally enhances your data governance maturity.
    • Identify sizable initiatives, ensure that they can be handled with a set amount of resources, and create metrics that enable you to track your progress.
    • You need to find ways to ensure observability and monitoring over time.
    • Don’t create something that you have no resources to maintain and improve going forward.
    • To identify the right initiatives at the right time, you need to ensure close collaboration with your business users.
    • Ensure transparent and traceable ownership of the initiatives from the business side.
    • To create a movement and engagement in data requires continuous and structured communication.

    Data Maturity Assessment

    • There is a need for speed and agility in SMEs to stay competitive.
    • Data Maturity Assessments are a welcome introduction to ensure that you create a baseline when working with data.
    • There are advantages to both an internal view and to get some external perspective on your data maturity.
    • Results from a maturity assessment can be a reality check that is not always easy to convey, yet you need to be realistic.
    • Maturity assessments should ideally be both:
      • Modeled/tailored to the needs of the organization in question.
      • Repeatable and comparable over time and across organizations.
    • Good assessments cover both.
    • To initially increase your maturity you can pick different tasks:
      • Low hanging fruits
      • «Duct-taped» operations that you can finally rectify
      • Find known problems that are visible
      • Find pain points for your business users
    • It is good to start with cases that are understandable for business users, create interest, and can easily show value to leadership - this is what creates buy-in.
    • You need to keep communicating clearly about bigger, more substantial tasks, so your resources are not limited to quick-win actions.

    Data Strategy

    • Data Strategy needs to be closely aligned with business strategy.
    • Have a clear vision of where you want to go.
    • A structured approach to your data strategy, covering people, process, and technology, is important for any work with data.
    • Technology is not the starting point, but rather a consequence of your strategic choices, your organizational setup, and your available resources.
    • You need to include well-defined metrics to track progress.
    • Find metrics that are closely connected to business outcome and value creation.
    33 mins