Category: data

  • Executive Summary: Application Lifecycle in EAM

    #architecture #clarity #velocity #direction

    Application Lifecycle Management (ALM) in LeanIX is a central component of Enterprise Architecture Management (EAM). It enables companies to manage and optimize the entire lifecycle of their applications effectively. This process covers all phases, from planning and development through operation to the retirement of applications.

    As an EAM tool, LeanIX offers extensive functionality to support Application Owners in managing their applications. It provides a holistic view of the IT landscape and helps identify dependencies, risks, and optimization potential.

    In this blog post, we will first explain the importance of ALM for Application Owners and then present concrete improvement suggestions for the implementation in LeanIX. The goal is to increase the efficiency and effectiveness of Application Lifecycle Management and thus create greater value for the company.

    Raising Awareness among Application Owners

    To convince Application Owners who are responsible for several applications and believe that they do not need LeanIX of the importance of EAM in general and of the tool in particular, the following test questions focusing on architecture, processes, and data can be asked:

    a) Architecture-related questions:

    • How quickly can you find out which of your applications would be affected by a planned infrastructure change?
    • Which of your applications use outdated technologies and need to be modernized in the near future?

    b) Process-related questions:

    • How would you describe the influence of one of your applications on the company's entire value chain?
    • If one of your applications fails: how quickly can you identify all affected business processes?

    c) Data-related questions:

    • Can you sketch the processed data entities and their data flows for each of your applications?
    • Can you state ad hoc which of your applications process personal data and how this data is protected?

    d) Overarching questions:

    • How quickly can you compile all relevant information about your applications in response to an audit request?
    • How do you ensure that all stakeholders are always informed about the current state of and planned changes to your applications?

    Improvement Suggestions for Application Lifecycle Management in LeanIX

    To support Application Owners in maintaining their applications in LeanIX, to align the enterprise architecture more closely with the business, and to leverage the connection to data management, I propose the following concrete activities as a basis for discussion:

    1. Trainings and workshops for Application Owners:
      • Organize regular trainings on LeanIX and best practices
      • Run workshops that illustrate the connection between applications, business processes, and data
      • Create practical guides and checklists for maintaining applications in LeanIX in an easily accessible tool such as Confluence
      • Create LeanIX surveys through which Application Owners can easily provide relevant information by answering tailored questionnaires
    2. Process-oriented modeling in LeanIX:
      • Implement a process-oriented view in LeanIX
      • Link applications to the business processes they support
      • Visualize the contribution of each application to the value chain
    3. Integration of data management aspects:
      • Extend the LeanIX metamodel with relevant data management attributes
      • Link applications to the data entities they process
      • Implement data flow diagrams that show the relationship between applications and data
    4. Automation and integration (see the sketch after this list):
      • Implement interfaces between LeanIX and other relevant tools (e.g. BPM, data management platform)
      • Automate the updating of basic information in LeanIX
      • Create dashboards that visualize the maintenance status and data quality
    5. Governance and incentives:
      • Establish clear responsibilities and SLAs for maintaining application information
      • Implement a reward system for Application Owners who keep their data up to date
      • Conduct regular reviews of the application landscape
    6. Data governance integration:
      • Link data governance roles (e.g. Data Owner, Data Steward) to the corresponding applications in LeanIX
      • Implement attributes for data classification and data protection requirements on applications
      • Create reports that show data governance aspects across the entire application landscape
    7. Continuous improvement:
      • Establish a regular feedback process with Application Owners
      • Analyze usage patterns in LeanIX to identify improvement potential
      • Continuously adapt the metamodel and the processes based on the feedback
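
    To make item 4 more tangible, here is a minimal Python sketch of how basic information (here: a lifecycle phase) on an Application fact sheet could be updated automatically via the LeanIX GraphQL API. Treat it as an assumption-laden illustration: the endpoint paths, the updateFactSheet mutation, and the patch format follow commonly documented patterns but should be verified against your workspace's API documentation, and the workspace URL, API token, and fact sheet ID are placeholders.

    ```python
    import json
    import requests

    # Placeholder workspace URL and API token; replace with your LeanIX instance.
    BASE_URL = "https://your-workspace.leanix.net/services"
    API_TOKEN = "replace-me"  # personal API token from the LeanIX administration area


    def get_access_token() -> str:
        """Exchange the API token for an OAuth2 access token (assumed MTM endpoint)."""
        resp = requests.post(
            f"{BASE_URL}/mtm/v1/oauth2/token",
            auth=("apitoken", API_TOKEN),
            data={"grant_type": "client_credentials"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["access_token"]


    def update_application_lifecycle(factsheet_id: str, phase: str, start_date: str) -> dict:
        """Patch a lifecycle phase on an Application fact sheet via GraphQL.

        The mutation name and patch format follow the commonly documented
        'updateFactSheet' pattern; verify both against your API docs.
        """
        query = """
        mutation ($id: ID!, $patches: [Patch]!) {
          updateFactSheet(id: $id, patches: $patches) {
            factSheet { id name }
          }
        }
        """
        patches = [{
            "op": "add",
            "path": "/lifecycle",
            # The lifecycle value is assumed to be passed as a JSON string.
            "value": json.dumps({"phases": [{"phase": phase, "startDate": start_date}]}),
        }]
        resp = requests.post(
            f"{BASE_URL}/pathfinder/v1/graphql",
            headers={"Authorization": f"Bearer {get_access_token()}"},
            json={"query": query, "variables": {"id": factsheet_id, "patches": patches}},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()


    # Example: mark an application as entering the phase-out stage.
    # update_application_lifecycle("<factsheet-id>", "phaseOut", "2026-01-01")
    ```
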
  • The Power of Feature Models

    #feature #modeling #tracing #portfolio

    In the following, we will look at where feature models can help improve the management of software development.

    Example for mapping features to apps

    Definition

    In software development, a feature model is a compact representation of all the products of the Software Product Line (SPL) in terms of “features”.

    source: Wikipedia

    There are many more sources on the subject, but this simple one will suffice here.

    Manage Development using Feature Models

    In software development, features are mostly implemented in code. A development process such as Scrum typically focuses on people, communication, self-organizing teams, and a running system, among other things. A Scrum team sprints its own way from stories to running code. A typical question that pops up when looking at the big picture is:

    “How can development be managed across teams and products?”

    Cascaded agile working models like SAFe and LeSS (Large-Scale Scrum) argue that architecture plays an important role and at the same time needs to be aligned with the code. How can you scale architecture from product code to product portfolio?

    Scaling Architecture from Products to Portfolio

    Imagine you need to report a KPI for sales YTD based on weekly and daily sales data from various sales apps with different sales models. Three different app teams might be involved, probably using different technologies and documentation. How do we get to a common denominator that helps organize development?

    First, let’s understand the business logic regardless of technologies. The feature “KPI sales YTD” itself is agnostic of the sources delivering the raw data. It provides a unified concept into which the feature “Timeline of sales numbers” from the B2C and B2B sources is transformed.

    Having identified those features, we can now organize development. The app “Sales Information System” is responsible for calculating the KPI sales YTD, while the apps “Sales System B2C” and “Sales System B2B” each manage timelines of sales numbers. Development effort can now be partitioned across the teams, and the dependencies are known too.
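
    As an illustration of what the unified portfolio-level concept boils down to, here is a minimal Python sketch of computing “KPI sales YTD” from two “Timeline of sales numbers” features; the data frames and column names are made up for illustration.

    ```python
    import pandas as pd

    # Hypothetical timelines delivered by "Sales System B2C" (daily) and
    # "Sales System B2B" (weekly); column names are illustrative.
    b2c = pd.DataFrame({"date": pd.to_datetime(["2024-01-02", "2024-02-03"]),
                        "sales": [1200.0, 980.0]})
    b2b = pd.DataFrame({"date": pd.to_datetime(["2024-01-07", "2024-03-10"]),
                        "sales": [15000.0, 22000.0]})


    def kpi_sales_ytd(timelines: list[pd.DataFrame], as_of: str) -> float:
        """Sum all sales from the start of the year up to the reporting date."""
        as_of_ts = pd.Timestamp(as_of)
        start_of_year = pd.Timestamp(year=as_of_ts.year, month=1, day=1)
        combined = pd.concat(timelines, ignore_index=True)
        mask = (combined["date"] >= start_of_year) & (combined["date"] <= as_of_ts)
        return float(combined.loc[mask, "sales"].sum())


    print(kpi_sales_ytd([b2c, b2b], as_of="2024-03-31"))  # 39180.0
    ```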

    The central idea here is to find a common concept for both products and portfolio. Those managing across products and teams use features as basic units, while app teams refine them to their specific needs.

    If you need more precision, e.g. when the aggregation of the timelines has to be done in several steps such as export timeline, enrich timeline, and sum up timeline, you can cascade features, as sketched below. There are good approaches to refining a feature model using e.g. business functions, business data, or a combination of both.
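
    Here is a small sketch of what such a cascaded feature model could look like as a plain data structure; the representation itself is illustrative, only the feature and app names are taken from the example above.

    ```python
    from dataclasses import dataclass, field


    @dataclass
    class Feature:
        """A node in a feature model: a named feature, the app that owns it,
        and optional sub-features it is refined into."""
        name: str
        app: str
        sub_features: list["Feature"] = field(default_factory=list)


    # Portfolio-level feature refined into the features owned by the app teams.
    kpi_sales_ytd = Feature(
        name="KPI sales YTD",
        app="Sales Information System",
        sub_features=[
            Feature("Timeline of sales numbers (B2C)", app="Sales System B2C",
                    sub_features=[Feature("Export timeline", "Sales System B2C"),
                                  Feature("Enrich timeline", "Sales System B2C")]),
            Feature("Timeline of sales numbers (B2B)", app="Sales System B2B"),
            Feature("Sum up timelines", app="Sales Information System"),
        ],
    )


    def apps_involved(feature: Feature) -> set[str]:
        """Collect all apps (and thus teams) a feature depends on."""
        apps = {feature.app}
        for sub in feature.sub_features:
            apps |= apps_involved(sub)
        return apps


    print(sorted(apps_involved(kpi_sales_ytd)))
    # ['Sales Information System', 'Sales System B2B', 'Sales System B2C']
    ```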

    Conclusion

    Breaking products down into features has a lot of benefits:

    • Speak a common language (portfolio and products)
    • Avoid double work (deduplicate feature implementation)
    • Avoid overall waste (streamlined feature catalogue)
    • Make progress transparent (plan per feature)
    • Ease analysis (impact per feature)
    • Clearly specify changes (story per feature change)

    Feature models support the improvement of software development, especially in the case of self-organizing product teams.

  • Executive Summary: Data Strategy 2.0

    #architecture #clarity #velocity #direction 

    In my last post, Executive Summary: Strategic Data Science, I summarized what Data Science is and what it consists of. Moreover, you need to deploy a strategy that helps you manage the transformation to a data-driven business.

    Today, you will see that a strategy for data science can be handled just like any data strategy. And if you already have a data strategy deployed, e.g. as part of your governance or architecture initiative, then you will see why and where it is affected.

    As written in Executive Summary on EA Maturity, having a map and knowing where you are and where you want to go helps a lot in finding a way.

    Maturity

    If you are working with maturity models, you typically do this on a yearly basis. For chosen capabilities, you identify the current vs. target maturity, e.g. ranked from level 1 to 5.
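
    As a minimal sketch of such an assessment in Python (the capabilities and levels below are purely illustrative):

    ```python
    # Current vs. target maturity per capability, ranked from level 1 to 5.
    # The capabilities and levels are purely illustrative.
    maturity = {
        "vision":              {"current": 3, "target": 4},
        "people":              {"current": 2, "target": 4},
        "technology":          {"current": 3, "target": 5},
        "data quality":        {"current": 2, "target": 3},
        "metadata management": {"current": 1, "target": 3},
    }

    # Largest gaps first: these capabilities need the most attention this year.
    gaps = sorted(maturity.items(),
                  key=lambda item: item[1]["target"] - item[1]["current"],
                  reverse=True)
    for capability, levels in gaps:
        print(f'{capability}: gap {levels["target"] - levels["current"]} '
              f'(level {levels["current"]} -> {levels["target"]})')
    ```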

    The first thing you need to understand is that introducing data science for the first time reduces your overall maturity at once. Why is that?

    Maturity is measured in terms of capabilities. And if you take a look at those capabilities, you will find that you need to adapt them. There are typically a dozen or so, such as vision, objectives, people, processes, policies, master data management, business intelligence, big data analytics, data quality, data modeling, data asset planning, data integration, and metadata management.

    I will pick only a few as examples to make things clear. Let’s pick vision, people, and technology.

    Selected Capabilities for Explaining Maturity of Data Strategy

    Vision

    Say you have a vision like: “Providing customer care that is so satisfying that every customer comes back to us with a smile”. That’s a very strong statement, but how about: “Keeping every customer satisfied by solving all problems before they even complain”. Wow, even stronger. It is possible because Data Science allows you to predict what others can’t.

    People

    Probably, you already have a data architect. But the classic data architect focuses on architecture, technology, and governance issues. This is fine, but you also need a data advisor focusing on unseen solutions for the business: someone telling you to combine customer data with product usage data to increase your sales, and perhaps even telling you which of your precious data you can turn into completely new data-driven products you can sell.

    Technology

    Probably, you also have an inventory telling you which data sources are used in your applications. Adding Data Science as a rapidly growing discipline to the equation, you may find that you have to revise your technology portfolio. It is growing and changing rapidly and, therefore, needs to be governed to a certain degree (freedom vs. standardization).

    The following list shows selected technologies that are most often used in Data Science (ranked from left to right).

    • Programming Languages: SQL, Python, R
    • Relational Databases: MySQL, MS SQL Server, PostgreSQL
    • Big data platforms: Spark, Hive, MongoDB
    • Spreadsheets, BI, Reporting: Excel, Power BI, QlikView

    Moreover, there is a shift in who is actually using these technologies, for example towards Leadership, Finance, Sales, and Marketing. And more often than not without dedicated enterprise applications, because data analysis is very dynamic and involves a lot of trial and error.

    Conclusion

    From these few capabilities out of a dozen or more, it has become clear that a Data Science strategy easily fits into an overall Data Strategy. There is no need to reinvent the wheel. Instead, adapt your existing or favorite Data Strategy to incorporate Data Science.

  • Executive Summary: Strategic Data Science

    #architecture #clarity #velocity #direction #data

    If you as a C-level executive are already using or plan to use data science, you probably pursue the goal of increasing your market share by making predictions that others can’t. You might think that there is no need for strategic management of data science. Actually, that’s as far from the truth as it can get. But why is that? It is because there can be a lot of complexity, as indicated by the figure below and discussed in the following.

    The Flower of Complexity

    Definition

    First, let’s take a look at the definition:

    Data science is an inter-disciplinary field that uses scientific methods, processes, algorithms and systems to extract knowledge and insights from many structured and unstructured data.

    source: Wikipedia

    There are a lot of keywords in this rather short definition that should raise your eyebrows: inter-disciplinary, methods, processes, algorithms, systems, many.

    Basic Method

    Now, let’s pick a keyword from above and dig deeper e.g. recall the basic scientific method:

    1. Find a question
    2. Collect data
    3. Prepare data for analysis
    4. Create model
    5. Evaluate model
    6. Deploy model

    Doesn’t sound overly complex, but let’s finally take a deep dive. Which of those phases do you think is responsible for most of the effort spent? It is the step that roughly amounts to 80% of the overall process! There are even several synonyms for it, such as data munging, data wrangling, and data cleaning or cleansing. You guessed right: it is phase three. Its complexity is mainly driven by the number of different data sources, the number and complexity of the data structures involved, and sometimes also by a mix with unstructured data.
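
    To give a feel for why this phase dominates, here is a minimal pandas sketch of typical preparation steps; the file names, column names, and cleaning rules are made up for illustration.

    ```python
    import pandas as pd

    # Two hypothetical raw sources with inconsistent schemas, as is typical
    # before any modeling can start.
    crm = pd.read_csv("crm_export.csv")    # e.g. columns: CustomerID, signup, Revenue
    shop = pd.read_csv("shop_orders.csv")  # e.g. columns: customer_id, order_date, amount

    # Harmonize column names and types across sources.
    crm = crm.rename(columns={"CustomerID": "customer_id", "signup": "signup_date",
                              "Revenue": "revenue"})
    crm["signup_date"] = pd.to_datetime(crm["signup_date"], errors="coerce")
    shop["order_date"] = pd.to_datetime(shop["order_date"], errors="coerce")

    # Clean: drop duplicates, remove rows without a key, discard impossible values.
    crm = crm.drop_duplicates(subset="customer_id").dropna(subset=["customer_id"])
    shop = shop[shop["amount"] > 0]

    # Integrate: one row per customer with total order amount, ready for modeling.
    orders_per_customer = shop.groupby("customer_id", as_index=False)["amount"].sum()
    dataset = crm.merge(orders_per_customer, on="customer_id", how="left")
    dataset["amount"] = dataset["amount"].fillna(0.0)
    ```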

    Conclusion

    We can go on like this for a while, but I do not want to bore you with the details. So let’s summarize first, and I will deliver a compressed list of further aspects afterward, which you may take note of or skip altogether.

    Forecast:
    If you do not strategically manage data science in your enterprise, you can expect yet another area of proliferation, which you should urgently avoid!

    Solution:
    I can help you with that. My approach is to combine data science with an architecture development cycle. Proven methods and tools will help you to master the inherent complexity and get the most out of data science for your business. You can leave the details to me.

    The Details

    Data science as a discipline delivers methods like the one we have discussed above. Yet, it also

    • combines subjects like
      • computer science
      • math & statistics
      • business domain knowledge
    • involves interdisciplinary roles like
      • Data Engineer
      • Data Scientist
      • Business Analyst
      • Product Owner / Project Manager
      • Developer
      • User Interface Specialist
    • implies many skills like
      • programming
      • working with data
      • descriptive statistics
      • data visualization
      • statistical modeling
      • handling Big Data
      • machine learning
      • deploying to production
    • is done with many tools like
      (only top 3-4 in each category named here)
      • programming languages
        • SQL
        • Python
        • R
      • databases
        • MySQL
        • MS SQL Server
        • PostgreSQL
        • Oracle
      • Big data platforms
        • Spark
        • Hive
        • MongoDB
        • Amazon Redshift
      • Spreadsheets, BI, Reporting
        • Excel
        • Power BI
        • QlikView

    And the list is growing steadily. A little exhausting, isn’t it? At this point at the latest, you should be convinced that data science needs strategic attention.