diff --git a/.gitignore b/.gitignore
index e85e8bf..a81f487 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,2 +1,3 @@
 **/*.drawio.bkp
 .talismanrc
+**/.DS_Store
diff --git a/.order b/.order
index f724cc2..41c0361 100644
--- a/.order
+++ b/.order
@@ -2,10 +2,16 @@ README
 welcome
 getting-started
 agile-working
+traceability-concept
 documentation-guidelines
+stages
 versioning
-code-review-process
-network
+branching-code-review-process
+requirements-gathering-interview
 service-catalogue
+network
+vms-and-lxcs
+know-how/git-commands
+know-how/sap-tricks
 faq
 CHANGELOG
diff --git a/README.md b/README.md
index 2eaa68e..d24c9c8 100644
--- a/README.md
+++ b/README.md
@@ -2,6 +2,19 @@
 This space is for the engineering team to share knowledge, resources, and best practices.
 
+## Conventions
+
+### Naming conventions
+
+As we make heavy use of Microsoft Azure and Azure DevOps, we follow the naming conventions provided by Microsoft:
+
+- [Abbreviation recommendations for Azure resources](https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/azure-best-practices/resource-abbreviations)
+
+Furthermore, we use the [AzureNamingTool](https://github.com/mspnp/AzureNamingTool):
+
+- [AzureNamingTool Deployment Instructions](https://xwr.visualstudio.com/jambor.pro/_git/app-azure-naming)
+- [AzureNamingTool Development Instance](https://app-azurenamingtool-dev-bnhfgbdgafeqh2gf.switzerlandnorth-01.azurewebsites.net)
+
 ## Sructure of the repository
 
 We are trying to keep the repositories small and structured. The following overview shows the general structure and the most important repositories.
diff --git a/agile-working.md b/agile-working.md
index 1045b8c..38e2818 100644
--- a/agile-working.md
+++ b/agile-working.md
@@ -32,6 +32,7 @@ We based our setup on the CMMI process template. The following work items and st
 - Each team member works on tasks, selected independently from the "Ready" area.
 - Tasks in the "Done" column of each state should first undergo a peer review within the team.
 - Only tasks in the "Closed" column are considered fully completed.
+- It is best practice to have no more than two work items in progress at the same time.
 
 Tools: Azure Boards
diff --git a/branching-code-review-process.md b/branching-code-review-process.md
new file mode 100644
index 0000000..290e937
--- /dev/null
+++ b/branching-code-review-process.md
@@ -0,0 +1,57 @@
+# Branching and Code Review Process
+
+This document outlines the guidelines and expectations for effective code branching, review, and merging within a regulated Azure DevOps environment. Emphasis is placed on the four-eyes principle and traceability of changes.
+
+![Branching and Code Review Process.](resources/diagrams/branching-code-review-process.png "Branching and Code Review Process.")
+
+## Traceability
+
+- Utilize Azure Boards for work item tracking, linking commits and PRs to their respective items.
+- Enforce policies in Azure Repos to require reviews, approvals, and optionally linked work items for merges.
+- Use annotations in your code for significant changes, providing explanations and linking back to documentation or specs.
+
+## Create a Branch
+
+- Create branches under subfolders aligned with their purpose (`new`, `update`, `fix`, `delete`), in line with the semantic versioning used in the release notes documentation.
+- Use descriptive branch names that reflect the feature or fix.
+
+## Apply Changes
+
+- Commit changes regularly with clear, concise messages that follow semantic versioning (see the example below).
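+
+  A minimal sketch of this convention (branch name, message, and work item number are illustrative; the `new:`/`update:`/`fix:`/`delete:` prefixes match the categories grepped by `resources/scripts/release-notes.bash`):
+
+  ```bash
+  # Branch under a purpose subfolder, then commit with a categorized message.
+  git checkout -b fix/network-diagram-vlan-id
+  git commit -m "fix: correct VLAN ID in network diagram AB#1234"
+  git push origin HEAD
+  ```
+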
+- Include work item or issue numbers in commit messages for traceability.
+
+## Manual Testing
+
+- Perform local linting to ensure code quality.
+- Preview changes locally, especially for UI-related updates.
+- Run unit tests and ensure they pass before pushing code.
+
+## Create a Pull Request
+
+- Rebase your branch on the latest main branch to simplify the review process and prevent conflicts.
+- Ensure your PR description is detailed, linking back to any related work items or issues for context.
+- Tag relevant team members for review and set a deadline if necessary.
+
+## Automated Testing
+
+- This part is optional: automated tests can be defined within a pipeline that is triggered automatically when a Pull Request is created.
+- Ensure all automated tests pass, including unit, integration, and end-to-end tests.
+- Check for code coverage and ensure it meets the project's standards.
+
+## Review a Pull Request
+
+- Focus on maintainability, readability, and scalability of the code.
+- Ensure new code adheres to project standards and integrates well with the existing codebase.
+- Provide constructive feedback, highlighting both strengths and areas for improvement.
+
+## Merge a Pull Request
+
+- Ensure all PR checks pass (build, test pipelines, and any additional checks your team requires).
+- Only merge after at least one other team member has approved the changes.
+- Upon merging, the main branch triggers a CI/CD pipeline to deploy changes to a staging environment for further testing.
+
+## Post-Merge
+
+- Delete the feature branch to keep the repository clean and manageable.
+- Update any project tracking tools to reflect the completion of tasks.
+- Review deployment results and ensure no issues in the staging environment before promoting to production.
diff --git a/code-review-process.md b/code-review-process.md
deleted file mode 100644
index a113296..0000000
--- a/code-review-process.md
+++ /dev/null
@@ -1,3 +0,0 @@
-# Code Review Process
-
-Guidelines and expectations for code review.
diff --git a/documentation-guidelines.md b/documentation-guidelines.md
index f87a749..07d26c0 100644
--- a/documentation-guidelines.md
+++ b/documentation-guidelines.md
@@ -6,6 +6,22 @@ Best practices and guidelines for writing code documentation.
 All documentation should be easy maintain and accessible. Easy formats should be preffered over more complex ones.
 
+### Markdown
+
+Markdown is the easiest format to write and read. It is recommended to use Markdown for all documentation as long as more complex formatting is not needed.
+
+### AsciiDoc
+
+AsciiDoc allows more formatting and can produce more official-looking documents.
+
+It is recommended to write a pipeline template to easily convert AsciiDoc files to PDFs. There we can also have an AsciiDoc template specifying the look of the PDF. The pipeline template then makes it easy to consume the PDF creation functionality.
+
+To manually create a PDF from an AsciiDoc file, you need Asciidoctor; use the following command:
+
+```bash
+asciidoctor-pdf -a pdf-theme=my-theme.yml example.adoc
+```
+
 ## PlantUML
 
 Create png images from PlantUML files using the following command:
@@ -31,3 +47,29 @@ Create png images from Draw.io diagrams using the following command.
 
 ```bash
 drawio -x -f png -b 10 -o .png .drawio
 ```
+
+## Folder Structure
+
+If you want to add an optional folder structure to your documentation, you can use the following command to generate a tree-like structure.
+
+```bash
+git ls-tree -r --name-only HEAD | sed 's|[^/]*| &|g'
+```
+
+Example output:
+
+```text
+ Dockerfile
+ maus_helper.sh
+ maus_loader.yml
+ requirements.txt
+ setup.py
+ src/ __init__.py
+ src/ logging_config/ __init__.py
+ src/ logging_config/ config.py
+ src/ maus_loader.py
+ src/ scraper/ __init__.py
+ src/ scraper/ scraper.py
+ tests/ __init__.py
+ tests/ test_scraper.py
+```
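+
+## AsciiDoc to PDF Pipeline Template Sketch
+
+A minimal sketch of the pipeline template recommended in the AsciiDoc section above (the template path, parameter names, and theme file are assumptions, not an existing template):
+
+```yaml
+# templates/asciidoc-pdf.yml (hypothetical) - converts one AsciiDoc file to a PDF
+parameters:
+  - name: adocFile
+    type: string
+  - name: pdfTheme
+    type: string
+    default: my-theme.yml
+
+steps:
+  - script: |
+      sudo gem install asciidoctor-pdf
+      asciidoctor-pdf -a pdf-theme=${{ parameters.pdfTheme }} ${{ parameters.adocFile }}
+    displayName: Convert AsciiDoc to PDF
+  - task: PublishBuildArtifacts@1
+    inputs:
+      PathtoPublish: '$(Build.SourcesDirectory)'
+      ArtifactName: 'pdf'
+```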
diff --git a/getting-started.md b/getting-started.md
index 23c597a..44921da 100644
--- a/getting-started.md
+++ b/getting-started.md
@@ -26,3 +26,28 @@ And then in each repository set up individual configuration:
 ```bash
 git config user.name "FIRST_NAME LAST_NAME"
 git config user.email "ABBREVIATION@ypsomed.com"
 ```
+
+## Set up SSH keys
+
+First, create a dedicated SSH key. Make sure to create separate keys for different Azure DevOps organizations, for example for development, test, or production environments, and certainly for different customers.
+
+```bash
+export KEY_NAME="key-azdoxwr-ssh" # choose your name here
+ssh-keygen -t rsa-sha2-512 -f ~/.ssh/$KEY_NAME -N ""
+```
+
+Update your local `~/.ssh/config` file. The example below includes a customer example `yps` and the XWare example. The XWare example is special, as we have a very old Azure DevOps instance that still uses the old naming scheme. That is why the host names differ.
+
+```text
+Host ssh.dev.azure.com
+    HostName ssh.dev.azure.com
+    User git
+    IdentityFile ~/.ssh/key-azdoyps-ssh
+    IdentitiesOnly yes
+
+Host vs-ssh.visualstudio.com
+    HostName vs-ssh.visualstudio.com
+    User git
+    IdentityFile ~/.ssh/key-azdoxwr-ssh
+    IdentitiesOnly yes
+```
diff --git a/know-how/git-commands.md b/know-how/git-commands.md
new file mode 100644
index 0000000..c3d6a89
--- /dev/null
+++ b/know-how/git-commands.md
@@ -0,0 +1,144 @@
+# Git Commands
+
+You can also review [Oh Shit, Git!?!](https://ohshitgit.com/) for good solutions to common git problems.
+
+## Random Ideas
+
+[Kart: DVC for geospatial and tabular data. 
Git for GIS](https://kartproject.org/), [Discussion](https://news.ycombinator.com/item?id=38073512#git), [Go to Post from 2023-10-30T20:40:06](https://social.lansky.name/@hn50/111325898767760054) + +[Use KeePassXC to sign your Git commits](https://code.mendhak.com/keepassxc-sign-git-commit-with-ssh/) + +## Git Commands and Examples + +Create a new branch locally within an already cloned repository: + +```bash +git checkout -b +``` + +Delete a local branch: + +```bash +git branch -d +``` + +Rebase onto the main branch: + +```bash +git fetch +git rebase origin/main +git push origin HEAD -f +``` + +Abort a rebase: + +```bash +git rebase --abort +``` + +Stash changes: + +```bash +git stash +git stash pop +``` + +Revert a commit: + +```bash +# Should work if only 1 commit was made +git reset --soft HEAD~1 + +# More forceful approach: +git revert cb7cf15b54ff09495201244b070d18d96d4703ce +git reset --hard HEAD~2 +``` + +Show changes between two tags: + +```bash +# Tag from previous version +git tag -a v0.1.0 -m "Release version 0.1.0" + +# Add changes +git commit -am "add hint for change log" +# and more + +# Add final tag for version +git tag -a v0.2.0 -m "Release version 0.2.0" + +# Show diff between tags +git log v0.1.0..v0.2.0 --no-merges --format="%h - %s" --date=short +``` + +Git diff log between commits: + +```bash +git log 79e28d9cef4cc777afc9e5b2569a5d34d9331867..6888fd61ae9d5744effcf27620a645e1750cbafc --no-merges --format="%h - %s (%an, %ad)" --date=short +``` + +Debug SSH connection via Git: + +```bash +GIT_SSH_COMMAND="ssh -v" +git pull +unset GIT_SSH_COMMAND +``` + +Add executable flag on Windows: + +```bash +git update-index --chmod=+x git_mirror.sh +``` + +Backup a repository from a source with all branches and tags. + +```bash +# fresh clone +git clone --mirror + +# existing local clone update +git fetch --all --tags --prune + +#Optional: zip it +tar czf repo-backup.git.tar.gz repo.git +``` + +Restore a repository to a new destination with all branches and tags. + +```bash +cd repo.git +git remote -v +git remote set-url origin +git remote -v + +git push --mirror +``` + +Renaming branches: + +```bash +# Delete remote branch +git push origin --delete wikiMaster + +# Delete local branch +git branch -d wikiMaster + +# Move branch +git branch -m main wikiMaster + +# Push +git push origin HEAD +``` + +Wipe a whole git repository, removing all branches, tags, and history, leaving behind only a single, empty main branch. + +```bash +git checkout --orphan main && \ +git rm -rf . 
&& \
git commit --allow-empty -m "Initialize empty main branch" --no-verify && \
git push origin main --force && \
git for-each-ref --format='%(refname:short)' refs/heads/ | grep -v '^main$' | xargs -I{} git push origin --delete {} && \
git tag -l | xargs -n 1 git push --delete origin
```

diff --git a/know-how/sap-tricks.md b/know-how/sap-tricks.md
new file mode 100644
index 0000000..ca36a63
--- /dev/null
+++ b/know-how/sap-tricks.md
@@ -0,0 +1,138 @@
+# SAP Tricks
+
+Navigating between windows via the transaction field:
+
+`/N` = switch within the current session
+`/O` = open a new session
+
+## Tables
+
+View via SE16N:
+
+- T001W - Plants
+- T024E - Purchasing organizations (EKORG)
+- TVKO - Sales organizations (VKORG)
+- PROJ - Projects
+- T179 - Product hierarchy
+
+## Transactions
+
+- Projects: CJ20N
+- SQ00: Run queries (some people have this one even though they lack all the other SQ** transactions)
+- SQ01: Maintain queries
+- SQ02: Maintain InfoSets
+- SQ03: Maintain user groups
+- SE38: ABAP editor
+- SA38: Run ABAP
+- SU01: Maintain users
+- SU01D: Display users
+- /UI2/FLP: Fiori Launchpad
+- /SCWM/LAGP – Storage bins: which storage bins exist at a storage location
+- /SCWM/BINMAT – Storage bin articles: which articles are on a storage bin
+
+## SAP Themes
+
+- Dark theme = Quartz Dark theme
+- Old school = Blue Crystal theme
+
+![SAP Dark Theme](../resources/images/sap-tricks-theme-1.png)
+
+![SAP Dark Theme](../resources/images/sap-tricks-theme-2.png)
+
+## Selecting Text
+
+This way you can select multiple cells, for example; it only works within one screen without scrolling.
+
+```text
+CTRL+Y
+```
+
+## SE16N
+
+If you lack the authorization, try starting "ZSE16N" instead. According to Alex Friess it is the same transaction, just without change authorization.
+
+Show field lengths via the technical view ("Technische Sicht"):
+
+![Show field lengths](../resources/images/sap-tricks-se16n-1.jpg)
+
+Save or load variants:
+
+![Save or load variants](../resources/images/sap-tricks-se16n-2.png)
+
+Technical fields in the list output:
+
+![Technical fields in the list output](../resources/images/sap-tricks-se16n-3.png)
+![Technical fields in the list output](../resources/images/sap-tricks-se16n-4.png)
+
+## SQ01/SQ02 Export to Excel
+
+Exporting to Excel via this button does not work in every case. This way, however, always works:
+
+![SQ01/SQ02 export to Excel](../resources/images/sap-tricks-sq0102-1.png)
+
+Or use this button, if available:
+
+![SQ01/SQ02 export to Excel](../resources/images/sap-tricks-sq0102-2.png)
+
+## SQ02 InfoSet: Calculated Column
+
+This can be achieved with ABAP.
+
+![SQ02 InfoSet calculated column](../resources/images/sap-tricks-sq02-1.png)
+
+For the type and length of the field, it is best to look at a similar existing field in SE16N.
+
+CHAR = C
+
+![SQ02 InfoSet calculated column](../resources/images/sap-tricks-sq02-2.png)
+
+Then select the new entry and create the coding for the field.
+
+![SQ02 InfoSet calculated column](../resources/images/sap-tricks-sq02-3.png)
+
+Example code:
+
+```ABAP
+DATA: lv_vbund TYPE vbund.
+
+CLEAR: lv_vbund.
+
+CHECK: kna1-kunnr IS NOT INITIAL OR lfa1-lifnr IS NOT INITIAL.
+
+IF kna1-kunnr IS NOT INITIAL.
+  SELECT SINGLE vbund INTO lv_vbund FROM kna1 WHERE kunnr = kna1-kunnr.
+ELSEIF lfa1-lifnr IS NOT INITIAL.
+  SELECT SINGLE vbund INTO lv_vbund FROM lfa1 WHERE lifnr = lfa1-lifnr.
+ENDIF.
+
+vbund = lv_vbund. " Assign the value to the additional field
+```
+
+When it comes to queries, SAP is a tedious affair if you lack authorization or, as a developer, experience.
+
+1. Create an InfoSet.
This is where you model the relationships between tables using JOINs. But it is very limited. In this specific case I built the following:
+
+![SQ02 InfoSet calculated column](../resources/images/sap-tricks-sq02-4.jpg)
+
+1. Build the query. There is no CASE-WHEN function here. In this specific case I would have ended up with two VBUND columns, one from LFA1 and one from KNA1. But I want the result in a single column, depending on what I am currently selecting.
+
+The small ABAP snippet above is the content of a so-called additional field ("Zusatzfeld"), which is SAP slang for a calculated column. The column checks whether we have a KNA1 or an LFA1 record and takes the VBUND from whichever one it is.
+
+## Change History for Business Partner (BP) Fields
+
+Transaction BP:
+
+- Select the business partner
+- Find the field you want to inspect and select its text
+- Zusätze > Änderungshistorie > Für dieses Feld (Extras > Change history > For this field)
+
+![Change history](../resources/images/sap-tricks-aenderungshistorie-1.png)
+
+![Change history](../resources/images/sap-tricks-aenderungshistorie-2.png)
+
+## SAP Data Services
+
+Together with their partner Scheer Group, Fenaco relies on SAP Data Services. It is supposed to handle ETL for migration processes and to make sense despite the high costs.
+
+[Technology Blogs by Members - SAP MDG data migration – Part 3](https://community.sap.com/t5/technology-blogs-by-members/sap-mdg-data-migration-part-3/ba-p/13446157)
+[Technology Blogs by Members - SAP MDG Consolidation data import: The ETL way](https://community.sap.com/t5/technology-blogs-by-members/sap-mdg-consolidation-data-import-the-etl-way/ba-p/13445621)
diff --git a/network.md b/network.md
index 4fb9bae..17f82de 100644
--- a/network.md
+++ b/network.md
@@ -1,15 +1,140 @@
 # Network
 
+## vnet List
+
+List of vnets (for the latest version, see the UniFi console):
+
+| Name | VLAN ID | Router | Subnet | Azure vnet |
+| --- | --- | --- | --- | --- |
+| Default | 1 | prd-unifi-1 | 192.168.1.0/24 | N/A |
+| Management | 2 | prd-unifi-1 | 192.168.2.0/24 | N/A |
+| Clients | 3 | prd-unifi-1 | 192.168.3.0/24 | N/A |
+| Server | 4 | prd-unifi-1 | 192.168.4.0/24 | N/A |
+| IoT | 5 | prd-unifi-1 | 192.168.5.0/24 | 10.5.0.0/16 |
+| Guests | 6 | prd-unifi-1 | 192.168.6.0/24 | N/A |
+| Volt - Development | 7 | prd-unifi-1 | 192.168.7.0/24 | N/A |
+| Var - Testing | 8 | prd-unifi-1 | 192.168.8.0/24 | N/A |
+| Watt - Production | 9 | prd-unifi-1 | 192.168.9.0/24 | N/A |
+
 Tasks:
 
 - Define Networks
   - OK Ranges definieren
   - OK Verteilen, was wohin kommt
-  - VLAN IDs statisch besser als dynamisch
-  - DNS definieren (fix vs. dynamisch)
+  - OK Keep VLAN IDs static rather than dynamic
+  - OK Define DNS (fixed vs. dynamic)
 - Gateway Settings
   - Auto Update
   - Block outgoing DNS
   - Plugins wie OPNSense CrowdSec
 
-![Basic network structure](resources/diagrams/network.png)
+## Traefik load balancing
+
+Apparently, due to these issues:
+
+- [Traefik intercepts TLS challenge in nested architecture (with TLS passthrough)](https://community.traefik.io/t/traefik-intercepts-tls-challenge-in-nested-architecture-with-tls-passthrough/23155/4)
+- [Traefik GitHub Issue #10684](https://github.com/traefik/traefik/issues/10684)
+
+we might need to update our approach. As far as I understand it, we will need a primary Traefik instance that does no ACME challenging at all, and thus possibly an additional instance to handle the separate connections to Proxmox and whatever else sits above the environments.
+
+::: mermaid
+graph LR
+    A[Internet] -->|ISP Connection| TRA[Traefik
*.amp.jambor.pro
Old version 2.11.0] + + TRA --> TRB[Traefik Dashboard] + TRA --> PRX[Proxmox Servers] + TRA --> LX1[LXC CouchDB] + TRA --> LX2[LXC Flightradar] + + subgraph "direct connections" + TRB + PRX + LX1 + LX2 + end + + TRA --> TRVO[Traefik] + + subgraph "*.volt.jambor.pro Development" + TRVO --> DCD[Docker host] + TRVO --> LXD[LXC Container] + end + + TRA --> TRVA[Traefik] + + subgraph "*.var.jambor.pro Testing" + TRVA --> DCT[Docker host] + TRVA --> LXT[LXC Container] + end + + TRA --> TRW[Traefik ] + + subgraph "*.watt.jambor.pro Production" + TRW --> DCP[Docker host] + TRW --> LXW[LXC Container] + end + +::: + +## Network diagram + + +::: mermaid +graph LR + A[Internet] -->|ISP Connection| ND1[Gateway
gw-jj-nar-prd-opr-1] + + subgraph "On-Prem Hub (VLAN ID 1)" + ND1 -->|VPN Tunnel to Azure| C[VPN Gateway] + ND1 --> D[Firewall & Security Policies] + ND2[Switch
sw-jj-nar-prd-opr-1] + ND3[Access Point
ap-jj-nar-prd-opr-0] + ND4[Access Point
ap-jj-nar-prd-opr-1] + ND5[Access Point
ap-jj-nar-prd-opr-2] + ND6[Access Point
ap-jj-nar-prd-opr-3] + end + + subgraph "On-Premises Spoke Networks" + D --> V2[Management VLAN ID 2] + V2 --> V201[Supermicro] + V2 --> V202[prd-proxmox-1] + V2 --> V203[prd-proxmox-2] + D --> V3[Clients VLAN 3] + V3 --> V301[Mobiles] + V3 --> V302[Laptops] + V3 --> V303[Apple TV] + V3 --> V304[HomePods] + D --> V4[Servers VLAN 4] + V4 --> V401[Legacy unneeded in future
will be in VLAN 7/8/9]
+        D --> V5[IoT VLAN 5 - Isolated 🔒]
+        V5 --> V501[Home infrastructure]
+        V5 --> V502[Loxone]
+        V5 --> V503[Home Assistant]
+        D --> V6[Guests VLAN 6]
+        V6 --> V601[Friends visiting]
+        D --> V10[Guests VLAN 10]
+        V10 --> V1001[Customers of rented out flat]
+
+    end
+
+    subgraph "On-Premises Workload Spoke Networks"
+        D --> O[*.volt.* VLAN 7]
+        D --> P[*.var.* VLAN 8]
+        D --> Q[*.watt.* VLAN 9]
+    end
+
+    C -->|VPN Tunnel| J[Azure VPN Gateway]
+
+    subgraph "Azure Hub"
+        J --> K[Azure Firewall]
+    end
+
+    subgraph "Azure Workload Spoke Networks"
+        K --> L[Spoke 1: *.volt.*]
+        K --> M[Spoke 2: *.var.*]
+        K --> N[Spoke 3: *.watt.*]
+        K --> R[Spoke 4: IoT]
+    end
+:::
diff --git a/programming-language-recommendations.md b/programming-language-recommendations.md
new file mode 100644
index 0000000..d6b2e32
--- /dev/null
+++ b/programming-language-recommendations.md
@@ -0,0 +1,106 @@
+# Programming language recommendations
+
+## Azure DevOps Pipeline
+
+The following programming languages are included in the comparison:
+
+- [C# (.NET)](https://learn.microsoft.com/dotnet/csharp/)
+- Shell Scripting [Bash](https://www.gnu.org/software/bash/) / [Zsh](https://www.zsh.org)
+- [Python](https://www.python.org)
+- [JavaScript](https://ecma-international.org/publications-and-standards/standards/ecma-262/) / [TypeScript](https://www.typescriptlang.org)
+- [Ruby](https://www.ruby-lang.org)
+- [Go](https://go.dev)
+- [PowerShell](https://learn.microsoft.com/powershell/)
+- [Java](https://www.java.com)
+
+### Comparison of the most important points
+
+Legend for the simple scales used below:
+
+- Extensions: Limited, Moderate, Extensive
+- Simplicity in YAML: Easy, Medium, Complex
+- Installation of Dependencies: Easy, Moderate, Complex
+- Execution Speed: High, Moderate, Low
+- Readability: Easy, Moderate, Hard
+
+| Feature | C# | Shell (Bash/Zsh) | Python | JavaScript / TypeScript | Ruby | Go | PowerShell | Java |
+| ---- | ---- | ---- | ---- | ---- | ---- | ---- | ---- | ---- |
+| 1. Data processing (PDF) | Extension needed (e. g., [iTextSharp](https://github.com/itext/itextsharp), [PDFSharp](https://github.com/empira/PDFsharp)) | External tool (e. g., [Ghostscript](https://www.ghostscript.com/)) | Extension needed (e. g., [PyPDF2](https://pypi.org/project/PyPDF2/), [ReportLab](https://pypi.org/project/reportlab/)) | Extension needed (e. g., [pdf-lib](https://github.com/Hopding/pdf-lib), [PDFKit](https://github.com/foliojs/pdfkit)) | Extension needed (e. g., [Prawn](https://github.com/prawnpdf/prawn)) | Extension needed (e. g., [go-pdf](https://github.com/signintech/gopdf)) | Extension needed (e. g., same .NET PDF libs invoked from PowerShell) | Extension needed (e. g., [iText7](https://github.com/itext/itext7), [Apache PDFBox](https://pdfbox.apache.org/)) |
+| 2. Data processing (AsciiDoc) | Extension needed (e. g., Asciidoctor .NET wrappers) | External tool (e. g., [Asciidoctor](https://asciidoctor.org/)) | Extension needed (e. g., [asciidoc-py3](https://github.com/asciidoc/asciidoc-py3)) | Extension needed (e. g., [@asciidoctor/core](https://www.npmjs.com/package/@asciidoctor/core)) | Extension needed (e. g., [Asciidoctor](https://asciidoctor.org/)) | Extension needed (e. g., use CLI or wrappers for [Asciidoctor](https://asciidoctor.org/)) | External tool ([Asciidoctor](https://asciidoctor.org/)) or .NET-based wrappers | Extension needed (e. g., [AsciidoctorJ](https://github.com/asciidoctor/asciidoctorj)) |
+| 3. Data processing (JSON) | Built-in ([System.Text.Json](https://learn.microsoft.com/dotnet/api/system.text.json)) | External tool (e. 
g., [jq](https://jqlang.github.io/jq/)) | Built-in ([json](https://docs.python.org/3/library/json.html)) | Built-in (`JSON.parse`, `JSON.stringify`; in Node, no extra install needed) | Built-in ([json](https://docs.ruby-lang.org/en//master/JSON.html)) | Built-in (encoding/json) | Built-in (`ConvertFrom-Json`, `ConvertTo-Json` in modern PowerShell) | Extension/library commonly used (e. g., [Jackson](https://github.com/FasterXML/jackson), `org.json`) | +| 4. Data processing (YAML) | Extension needed (e. g., [YamlDotNet](https://github.com/aaubry/YamlDotNet)) | External tool (e. g., [yq](https://github.com/mikefarah/yq)) | Extension needed (e. g., [PyYAML](https://pypi.org/project/PyYAML/)) | Extension needed (e. g., [js-yaml](https://www.npmjs.com/package/js-yaml)) | Built-in ([Psych](https://docs.ruby-lang.org/en//master/Psych.html)) | Extension needed (e. g., gopkg.in/yaml.v3) | Built-in in newer PowerShell versions (`ConvertFrom-Yaml`, `ConvertTo-Yaml`) | Extension needed (e. g., SnakeYAML) | +| 5. Data processing (XML) | Built-in ([System.Xml](https://learn.microsoft.com/dotnet/api/system.xml)) | External tool (e. g., xmlstarlet) | Built-in ([xml](https://docs.python.org/3/library/xml.html)) | Mostly extension (e. g., [xml2js](https://www.npmjs.com/package/xml2js), [fast-xml-parser](https://www.npmjs.com/package/fast-xml-parser)) | Built-in (REXML, Nokogiri) | Built-in (encoding/xml) | Built-in (native `[xml]` type accelerator) | Built-in (javax.xml, org.w3c.dom, plus standard libraries) | +| 6. Extensions (libraries / packages) | Extensive (NuGet) | Extensive (rich set of CLI tools, though not “extensions” in the same sense) | Extensive (PyPI) | Extensive (npm is one of the largest ecosystems) | Extensive (RubyGems) | Extensive (Go Modules) | Moderate (PowerShell Gallery) | Extensive (Maven Central, Gradle plugins) | +| 7. Simplicity in YAML usage | Medium (third-party library but straightforward) | Complex (usually rely on yq or custom scripts) | Easy (with PyYAML) | Medium (need js-yaml, usage is direct in Node/TS) | Easy (built-in Psych) | Medium (import 3rd-party package, usage is simple) | Easy (native cmdlets in newer versions) | Medium (SnakeYAML is straightforward, but an extra lib) | +| 8. Must be compiled? | Yes (C# -> .NET IL) | No (interpreted scripts) | No (interpreted) | JS: No (interpreted), TS: Yes (transpiles to JS) | No (interpreted) | Yes (compiled to native binaries) | No (interpreted on .NET runtime) | Yes (compiled to JVM bytecode) | +| 9. Cross-Platform | Yes (with .NET Core/.NET 5+) | Yes (native to Unix-like, plus Windows via WSL or separate install) | Yes (Windows, macOS, Linux) | Yes (Node.js or browser; TS runs where JS runs) | Yes (Windows, macOS, Linux) | Yes (Windows, macOS, Linux, others) | Yes (PowerShell Core 6+ is cross-platform) | Yes (JVM on Windows, macOS, Linux, etc.) | +| 10. Simple installation of dependencies | Moderate (NuGet + .NET CLI or Visual Studio) | Moderate (install packages/tools via apt, yum, brew, etc.) | Easy (pip, Conda, etc.) | Easy (npm, yarn, etc.) | Easy (RubyGems, Bundler) | Easy (Go modules, go get) | Moderate (PowerShell Gallery + extra config) | Moderate (Maven, Gradle; straightforward but verbose) | +| 11. 
Licensing | Open-source .NET (MIT for .NET Core); older .NET frameworks under MS licenses | GPL (GNU Bash) | PSF License (Python Software Foundation) | JavaScript is an ECMA standard; TypeScript is Apache 2.0 by Microsoft | Dual License (Ruby License/BSD) | BSD-style (Go is open source under a permissive license) | MIT License (for PowerShell Core; Windows PS is proprietary) | GPL v2 + Classpath (OpenJDK); Oracle JDK has different commercial terms |
+| 12. Provider / Owner | Microsoft (language + runtime) | GNU Project (part of GNU utilities) | Python Software Foundation | ECMA standard for JS; Microsoft for TS | Yukihiro “Matz” Matsumoto / community | Google (initially) + open source community | Microsoft (PowerShell) | Oracle + open source community |
+| 13. Execution speed | High (JIT on .NET, typically quite fast) | Low (relies on external tools; not optimized for heavy computation) | Moderate (interpreted, can be fast but slower than C#/Go/Java) | Moderate (Node’s V8 engine is JIT-compiled; usually slower than fully compiled languages) | Moderate (CRuby slower; newer versions have partial JIT) | High (compiled to native) | Moderate (.NET-based, typically good performance but overhead in interactive scenarios) | High (JIT-compiled by the JVM; often on par with C#) |
+| 14. Code comprehension & readability | Moderate (C-style syntax, can be verbose) | Hard (complex quoting, expansions, and nuances in Bash) | Easy (clean, minimal boilerplate) | Moderate (JS can be flexible/loose; TS adds structure but extra overhead) | Easy (expressive, some “magic” features) | Easy (simple, explicit, fewer features) | Moderate (familiar C#-like syntax + cmdlet conventions) | Moderate (verbose, strongly typed, boilerplate-heavy) |
+| 15. Certification available (employee) | Yes (Microsoft .NET/C# certs) | Indirect (part of broader Linux certifications like LPIC, RHCSA) | Yes (e. g., PCAP) | No official (some vendor-specific or full-stack certs may include JS/TS) | No official (third-party training only) | No official (no widely recognized Go cert; some third-party) | Yes (covered in broader MS certs, though not strictly “PowerShell-only”) | Yes (Oracle Certified Professional Java Programmer, etc.) |
+| 16. Debugging capabilities | Strong (Visual Studio, VS Code with C# extension) | Limited (VS Code has bash-debug, but fewer features) | Strong (VS Code, PyCharm, pdb, etc.) | Strong (VS Code debugger for JS/TS, Chrome DevTools, Node Debugger) | Moderate (VS Code Ruby extensions, RubyMine) | Strong (VS Code Go extension + Delve) | Strong (VS Code PowerShell extension with integrated debugger) | Strong (VS Code Java extension, IntelliJ, Eclipse) |
+| 17. Testing framework | Yes (NUnit, xUnit, MSTest) | Yes (e. g., shUnit2, Bats) | Yes (unittest, pytest, nose, etc.) | Yes (Jest, Mocha, Jasmine, etc. for JS; Mocha/Jest + ts-node for TS) | Yes (RSpec, Minitest) | Yes (testing in stdlib) | Yes (Pester for PowerShell) | Yes (JUnit, TestNG, etc.) |
+
+### Recommended Language: [Python](https://www.python.org)
+
+Python has the most features or the best value on most of the scales above. Its main benefits compared to the other languages:
+
+- **Data Handling**: pull data, parse it, and then format it.
+- **Document Generation**: The libraries for data presentation are fast and simple.
+- **Dependencies**: are easily handled with `requirements.txt`.
+- **Virtual Environments**: allow different Python versions to be used in the same pipeline.
+- **REST APIs**: can be consumed simply with Python's `requests` library.
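+
+A minimal sketch of the dependency workflow from the list above (file and package names are illustrative):
+
+```bash
+# Create and activate an isolated environment, then install pinned dependencies.
+python -m venv .venv
+source .venv/bin/activate
+pip install -r requirements.txt
+```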
+ +### Example Workflows for Python + +#### Install Dependencies + +Use a pipeline task to install Python (if not already on the agent) and the required libraries. + +```yaml +- task: UsePythonVersion@0 + inputs: + versionSpec: '3.x' +- script: | + pip install requests python-docx reportlab jinja2 +``` + +#### Fetch Work Items + +Write a Python script to call the Azure DevOps REST API to retrieve Work Items. + +```python +import requests + +# Example: Get Work Items from Azure DevOps +devops_organization_url = "https://dev.azure.com/YOUR_ORG" +project = "YOUR_PROJECT" +api_version = "6.0" +query_id = "YOUR_QUERY_ID" + +response = requests.get( + f"{devops_organization_url}/{project}/_apis/wit/wiql/{query_id}?api-version={api_version}", + auth=('PAT_USERNAME', 'PAT_TOKEN') # or use other Auth methods +) +work_items_data = response.json() +``` + +#### Generate Compliance Documents + +Convert the retrieved data into the document format of your choice. + +```python +from docx import Document +from docx.shared import Inches + +document = Document() +document.add_heading('Compliance Report', level=1) + +for item in work_items_data["workItems"]: + document.add_heading(f'Work Item ID: {item["id"]}', level=2) + # Additional data insertion here... + +document.save('ComplianceReport.docx') +``` diff --git a/requirements-engineering-process.md b/requirements-engineering-process.md new file mode 100644 index 0000000..65ef331 --- /dev/null +++ b/requirements-engineering-process.md @@ -0,0 +1,201 @@ +# Requirements Engineering Process + +This document provides a comprehensive overview of our requirements engineering process. It is designed to help experienced professionals understand how we collect, document, and manage requirements using Azure DevOps. By adhering to this guide, we ensure consistency, traceability, and compliance throughout our projects. + +The requirements engineering proicess is tightly bound to the [traceability concept](traceability-concept.md) and the [requirements gathering interview](requirements-gathering-interview.md). + +Requirements are work items in Azure DevOps and the summarizing AsciiDoc document is located within the [docs-requirements](https://dev.azure.com/ypsag/ITSandbox/_git/docs-requirements) repository. + +## Overview + +Our requirements engineering process is a structured approach that takes us from the initial stakeholder conversations to a finalized set of requirements. We emphasize: + +- **Clarity and Testability:** Requirements should be simple statements that are testable. We are writing test cases along with the stakeholder requirements. +- **Traceability:** Maintaining clear links from requirements through to implementation and testing and back. +- **Compliance:** Ensuring all regulatory requirements are identified and addressed. +- **Stakeholder Engagement:** Actively involving stakeholders throughout the process. +- **Risk Management:** Identifying and mitigating risks associated with requirements. Recording them linked to the requirements. + +We use **Azure DevOps** with the **CMMI process template** to manage our Work Items, facilitating collaboration and traceability. + +## Process Steps + +### 1. Initial Engagement + +**Objective:** Establish a foundational understanding of the project and build relationships with stakeholders. + +- **Activities:** + - Conduct initial meetings to understand project vision and objectives with sponsors or, if already appointed, project managers. + - Establish communication channels and protocols. E.g. 
use Teams channels, Slack, or similar for quick communication.
+  - Set expectations for the requirements gathering process.
+
+### 2. Stakeholder Identification
+
+**Objective:** Identify all individuals and groups who have an interest in or influence over the project.
+
+- **Activities:**
+  - Create a stakeholder register. This should be part of the summarizing AsciiDoc document.
+  - Analyze stakeholder roles, interests, and influence.
+  - Prioritize stakeholders based on their impact on requirements.
+
+### 3. Requirements Gathering
+
+**Objective:** Collect detailed requirements from stakeholders.
+
+- **Activities:**
+  - Conduct interviews using our [Interview Questionnaire](requirements-gathering-interview.md) (refer to the separate file).
+  - Hold workshops and brainstorming sessions.
+  - Observe existing systems and workflows.
+  - Capture raw requirements and create initial Requirement Work Items in Azure DevOps.
+  - Demonstrate and teach how inexperienced stakeholders can review requirements and their current state in Azure DevOps.
+
+### 4. Requirements Documentation
+
+**Objective:** Document requirements clearly and comprehensively.
+
+- **Activities:**
+  - Use the "Requirement" Work Item in Azure DevOps to record each requirement.
+  - Include detailed descriptions, affected regulations, and stakeholder information.
+  - Ensure each requirement is a simple, testable statement. Write test cases along with the requirements.
+  - Add acceptance criteria.
+
+### 5. Requirements Analysis
+
+**Objective:** Refine and validate the collected requirements.
+
+- **Activities:**
+  - Analyze requirements for clarity, completeness, and feasibility.
+  - Identify and resolve conflicts or duplicates.
+  - Validate requirements with stakeholders.
+
+### 6. Traceability
+
+**Objective:** Maintain end-to-end traceability of requirements through the project lifecycle.
+
+- **Activities:**
+  - Link Requirement Work Items to related Specifications, Design Documents, Implementation Tasks, Test Cases, and other artifacts in Azure DevOps.
+  - Use a Traceability Matrix to map relationships (see [traceability concept](traceability-concept.md)).
+
+### 7. Requirements Management
+
+**Objective:** Manage changes to requirements systematically.
+
+- **Principles:**
+  - **Immutability of Requirements:** Once a requirement is baselined, it should not be altered. If changes are needed, deprecate the old requirement and create a new version.
+  - **Version Control:** Use Azure DevOps Work Items to link older versions with their successors and maintain history.
+- **Activities:**
+  - Implement a change control process for requirements updates.
+  - Communicate changes to all stakeholders.
+
+### 8. Risk Management
+
+**Objective:** Identify and mitigate risks associated with requirements early in the project.
+
+- **Activities:**
+  - Document risks as separate work items of type Risk and link them to the Requirement Work Items.
+  - Assess the impact and likelihood of each risk.
+  - Develop mitigation strategies and document them along with the risk.
+
+## Azure DevOps Guidelines
+
+### Requirement Work Item Structure
+
+When creating a Requirement Work Item in Azure DevOps, ensure the following fields are populated:
+
+- **Title:** Clear and concise summary of the requirement.
+- **Stakeholders:** Names or roles of stakeholders associated with the requirement.
+- **Description:** Detailed explanation, including purpose and background.
+- **Affected Regulations:** List any laws, standards, or regulations impacting the requirement. +- **Acceptance Criteria:** Conditions that must be met for the requirement to be considered fulfilled. +- **Test Cases:** Linked test cases for validating the requirement. +- **Attachments:** Include supporting documents if necessary. + +### Maintaining Traceability + +Traceability is crucial for tracking requirements through all stages of development. We use a Traceability Matrix to map requirements to other project artifacts. As of now the document is created manually in AsciiDoc format. We aim to automate this process in the future. + +**Template:** + +| Requirement ID | Requirement Title | Affected Regulations | Stakeholders | Test Case ID(s) | +|----------------|-------------------|----------------------|--------------|-----------------| +| RQ-001 | [Title] | [Regulations] | [Roles] | TC-001 | + +- **Requirement ID:** Unique identifier, we are using the Azure DevOps work item id. +- **Requirement Title:** Summary of the requirement. +- **Affected Regulations:** Relevant laws, standards, or regulations. We are using a link to the requirement work item representing the regulation in Azure DevOps. +- **Test Case ID(s):** Linked test cases for validation, we are using the Azure DevOps Test Case id. + +## Best Practices and additional Notes + +- **Simple and Testable Statements:** Write requirements that are clear and can be verified through testing. +- **Regulatory Compliance:** When regulations affect a requirement, include specific clauses or references. +- **Versioning:** Do not alter baselined requirements. Deprecate and replace with new versions as needed. +- **Stakeholder Engagement:** Keep stakeholders involved throughout the process for validation and feedback. +- **Risk Identification:** Address risks early by documenting them alongside requirements. +- **Consistent Terminology:** Use standard terms and definitions to avoid confusion. +- **Communication:** Utilize Azure DevOps **Discussion** section in Work Items for conversations and decisions. Set up notifications for stakeholders to keep them informed of updates. +- **Training:** Familiarize yourself with the Azure DevOps CMMI process template and our customized fields. Stay updated on best practices in requirements engineering. +- **Regulatory Awareness:** Stay informed about regulations relevant to our projects (e.g., GDPR, HIPAA). Consult with compliance officers when in doubt. + +## LLM Prompt example + +```text +Our situation is as follows: +- We are a med tech company producing physical devices incl. embedded hardware and software, mobile apps connected via bluetooth and cloud components like user management device management for those mobile apps +- We are the IT department offering IT infrastructure in the Azure cloud for those products. We are not creating the products and we are not responsible for the processes involved to create the products. We are only providing infrastructure for those products and are responsible for IT infrastructure changes. 
+
+We have identified the following list of regulations as potentially relevant:
+- ISO/IEC 62304 Medical device software - Software life cycle processes
+- ISO/IEC 27001 Information technology - Security techniques - Information security management systems - Requirements
+- ISO/IEC 27017 Information technology - Security techniques - Code of practice for information security controls based on ISO/IEC 27002 for cloud services
+- ISO/IEC 27002 Information security, cybersecurity and privacy protection - Information security controls
+- ISO/IEC 27018 Information technology - Security techniques - Code of practice for protection of personally identifiable information (PII) in public clouds acting as PII processors
+- ISO 14971 Medical devices - Application of risk management to medical devices
+- ISO 13485 Medical devices - Quality management systems - Requirements for regulatory purposes
+- FDA 21 CFR Part 820: Title 21 Code of Federal Regulations Part 820 - Quality System Regulation
+- FDA 21 CFR Part 11: Title 21 Code of Federal Regulations (CFR) Part 11 - Electronic Records; Electronic Signatures
+- ALCOA+ Principles Compliance
+
+You are an expert in the field of requirements engineering in the regulated environment of a med tech company. I need your support in writing requirements.
+
+When writing requirements, use the following format to clearly articulate the need, the stakeholder’s perspective, the desired outcomes, and the rationale. Adhere strictly to this structure:
+
+When [condition or situation triggering the requirement],
+As [stakeholder role],
+I want [specific actions or outcomes to achieve].
+This ensures [reason or benefit for implementing the requirement].
+
+Key Guidelines:
+1. Condition or Situation: Clearly state when or under what circumstances the requirement applies. Use "When..." to frame this.
+2. Stakeholder Role: Explicitly identify the stakeholder requesting the requirement. Use "As [stakeholder role]..." to reflect the stakeholder's voice.
+3. Desired Outcomes: Use "I want..." to specify what the stakeholder expects or desires to be achieved. List actions or outcomes in a concise, actionable manner.
+4. Rationale: Use "This ensures..." to explain why the requirement is important or what benefit it provides.
+5. Add a list of relevant acceptance criteria.
+6. Add a list of test cases.
+7. Add a list of regulations that are relevant to the requirement, including the requirement of the regulation and the implication of the requirement.
+
+Example:
+Title: Document Authorship and Timestamps
+When creating or modifying documents,
+As IT QA CSV representative,
+I want every document to:
+- Record the author.
+- Include timestamps for creation and all modifications.
+This ensures compliance with the "Attributable" principle of ALCOA+.
+Acceptance criteria:
+- The author's name is recorded on every document.
+- Timestamps are added for document creation and all modifications.
+Test cases:
+- Verify author name is captured on document creation.
+- Verify timestamps are added for all document modifications.
+List of regulations:
+- ALCOA+ Principles Compliance
+  - Requirement: Attributable
+  - Implication: Every document must record the author and timestamps for creation and modifications.
+
+The input data is:
+
+Requirement title: >>fill in the title<<
+Stakeholder: >>fill in the stakeholder role<<
+Requirement description: >>fill in the requirement description in your words<<
+```
diff --git a/requirements-gathering-interview.md b/requirements-gathering-interview.md
index 7e1e582..adaaa20 100644
--- a/requirements-gathering-interview.md
+++ b/requirements-gathering-interview.md
@@ -1,63 +1,137 @@
-# Requirements Gathering Interview
+# Requirements Gathering Interview Template
 
-This guide should help anyone in our team to conduct a requirements gathering interview and collect information about the needs, desires, and constraints of stakeholders.
+- The goal is to collect comprehensive information about stakeholders' needs, expectations, and constraints to inform project requirements accurately. These items can be initially basic and refined over time, but they should capture all identified requirements from the interview.
 
-As an outcome we want to have Requirement work items in Azure DevOps. They can rather raw and empty but should be created for each requirement we gather. They can be refined later on.
-
-## How to Conduct a Requirement Gathering Interview?
+## How to Conduct a Requirements Gathering Interview
 
 ### Preparation
 
-- **Identify Stakeholders:** Determine who needs to be interviewed. This could include business-users, project sponsors and managers, and qa as well as csv staff.
-- **Research:** Understand the background of the project and the implication and impact of the project on the stakeholders.
-- **Define Objectives:** Know what you need to find out from the specific interview and stakeholder. Have clear focus and goals in mind. E.g. a business person will have different needs than a qa person.
-- **Prepare Questions:** Use this list of questions as a basis, adjust it to the stakeholder’s roles and the project’s context.
+- Identify Stakeholders: Determine who needs to be interviewed, such as business users, project sponsors, managers, QA personnel, and compliance staff.
+- Research: Understand the project's background and its impact on stakeholders.
+- Define Objectives: Clarify what information you need from each stakeholder based on their role.
+- Prepare Questions: Customize the interview questions to fit the stakeholder's role and the project's context.
+- Logistics: Schedule the interview and inform the stakeholder about its purpose and expected duration.
 
-### Conducting the Interview
+### During the Interview
 
-- **Explain the Purpose:** Clearly explain why you are conducting the interview and what you hope to achieve.
-- **Ask Open-Ended Questions:** Encourage detailed responses by asking questions that cannot be answered with a simple “yes” or “no”.
-- **Take Notes:** Document the key points of the conversation as Requirements in Azure DevOps.
+- Explain the Purpose: Start by clearly stating why you're conducting the interview and what you hope to achieve.
+- Establish Rapport: Build a comfortable environment to encourage open communication.
+- Ask Open-Ended Questions: Use questions that prompt detailed responses rather than simple yes/no answers.
+- Active Listening: Pay attention to the stakeholder's answers and ask follow-up questions as needed.
+- Document Responses: Take thorough notes or record the conversation (with permission) for accuracy.
+
+### After the Interview
+
+- Review Notes: Go over your notes promptly to ensure clarity and completeness.
+- Create Requirement Work Items: For each requirement discussed, create a Requirement Work Item in Azure DevOps. +- Follow-Up: If necessary, reach out for clarification on any ambiguous points. +- Validation: Send a summary to the stakeholder to confirm understanding and accuracy. + +## Interview Questions ### General Questions -- Can you describe your role and how it relates to this project? -- What are the main goals you want to achieve with this project? -- Who are the end-users of this product/service? -- What problems or challenges are you currently facing that this project should address? +- Role Understanding: + Can you describe your role and responsibilities within the organization? +- Project Involvement: + How does this project relate to your work or department? +- Objectives: + What are the main goals you want to achieve with this project? +- End-Users: + Who will be the primary users of this system or service? +- Current Challenges: + What problems or challenges are you currently facing that this project should address? ### Functional Requirements -- What specific features or functions do you need the system to have? -- Can you walk me through a typical use case or workflow? -- Are there any specific tasks that the system must perform? +- Features and Functions: + What specific features or functionalities do you need the system to have? +- Processes: + Can you walk me through a typical workflow or use case? +- Tasks: + What tasks must the system enable users to perform? +- Automation: + Are there manual processes that could be automated through this system? ### Non-Functional Requirements -- What performance criteria should the system meet (e.g. speed, reliability)? -- Are there any security requirements or concerns we should be aware of? -- What are the scalability requirements for the system? +- Performance: + What performance criteria should the system meet (e.g., speed, responsiveness)? +- Security: + Are there any security requirements or concerns we should be aware of? +- Scalability: + How should the system handle growth in users or data volume? +- Reliability: + What uptime or availability is expected for the system? +- Compliance: + Are there industry standards or regulations we need to comply with? ### User Interface and Experience -- Do you have any preferences or standards for the user interface design? -- Are there any accessibility requirements we need to consider? -- What kind of user training or support will be necessary? +- Design Preferences: + Do you have any preferences or standards for the user interface design? +- Accessibility: + Are there any accessibility requirements we need to consider? +- User Training: + What kind of training or support will users need? +- Localization: + Will the system need to support multiple languages or regions? ### Data and Integration -- What data needs to be captured, stored, and processed by the system? -- Are there existing systems or databases that this project needs to integrate with? -- What data privacy and compliance requirements must be followed? +- Data Requirements: + What data needs to be captured, stored, and processed by the system? +- Existing Systems: + Are there current systems or databases that need to integrate with this project? +- Data Migration: + Is there existing data that needs to be migrated to the new system? +- Data Privacy: + What data privacy and protection requirements must be followed? ### Constraints and Assumptions -- Are there any budget or time constraints we need to consider? 
-- What assumptions are we making about this project that need to be validated? -- Are there any risks or potential obstacles you foresee? +- Budget and Timeline: + Are there any budget or time constraints we need to consider? +- Technological Constraints: + Are there specific technologies or platforms we must use or avoid? +- Assumptions: + What assumptions are we making that need to be validated? +- Resource Availability: + What resources (personnel, equipment) are available or limited? ### Success Criteria -- How will you measure the success of this project? -- What outcomes are most important to you? -- Are there any key performance indicators (KPIs) that we should track? +- Measuring Success: + How will you measure the success of this project? +- Key Outcomes: + What outcomes are most important to you? +- KPIs: + Are there specific Key Performance Indicators we should track? + +### Risks and Challenges + +- Potential Obstacles: + What risks or potential challenges do you foresee? +- Risk Mitigation: + Do you have suggestions for mitigating these risks? +- Dependencies: + Are there dependencies on other projects or initiatives? + +## Tips for Effective Interviews + +- Be Prepared: + Familiarize yourself with the stakeholder's background and the project's context. +- Build Trust: + Be respectful and professional to encourage honest and open dialogue. +- Clarify and Summarize: + Restate key points to confirm understanding. +- Stay Flexible: + Be prepared to explore new topics that arise during the conversation. +- Avoid Jargon: + Use clear language and explain any necessary technical terms. +- Manage Time: + Keep track of time to cover all essential questions without rushing. +- Seek Permission: + Always ask before recording the interview or sharing sensitive information. +- Follow Ethical Guidelines: + Respect confidentiality and handle all information appropriately. 
diff --git a/resources/diagrams/branching-code-review-process.drawio b/resources/diagrams/branching-code-review-process.drawio
new file mode 100644
index 0000000..ce35b1a
--- /dev/null
+++ b/resources/diagrams/branching-code-review-process.drawio
@@ -0,0 +1,108 @@
[draw.io XML content not shown]
diff --git a/resources/diagrams/branching-code-review-process.png b/resources/diagrams/branching-code-review-process.png
new file mode 100644
index 0000000..9d64d76
Binary files /dev/null and b/resources/diagrams/branching-code-review-process.png differ
diff --git a/resources/diagrams/network.drawio b/resources/diagrams/network.drawio
index 49a88b1..b225ae2 100644
--- a/resources/diagrams/network.drawio
+++ b/resources/diagrams/network.drawio
@@ -1,136 +1,234 @@
[draw.io XML changes not shown]
diff --git a/resources/diagrams/network.png b/resources/diagrams/network.png
index fca6a12..25aa98e 100644
Binary files a/resources/diagrams/network.png and b/resources/diagrams/network.png differ
diff --git a/resources/diagrams/traceability-concept.drawio b/resources/diagrams/traceability-concept.drawio
new file mode 100644
index 0000000..9650de5
--- /dev/null
+++ b/resources/diagrams/traceability-concept.drawio
@@ -0,0 +1,285 @@
[draw.io XML content not shown]
diff --git a/resources/diagrams/traceability-concept.png b/resources/diagrams/traceability-concept.png
new file mode 100644
index 0000000..827aae5
Binary files /dev/null and b/resources/diagrams/traceability-concept.png differ
diff --git a/resources/images/sap-tricks-aenderungshistorie-1.png b/resources/images/sap-tricks-aenderungshistorie-1.png
new file mode 100644
index 0000000..8fddae2
Binary files /dev/null and b/resources/images/sap-tricks-aenderungshistorie-1.png differ
diff --git a/resources/images/sap-tricks-aenderungshistorie-2.png b/resources/images/sap-tricks-aenderungshistorie-2.png
new file mode 100644
index 0000000..27f5cbf
Binary files /dev/null and b/resources/images/sap-tricks-aenderungshistorie-2.png differ
diff --git a/resources/images/sap-tricks-se16n-1.jpg b/resources/images/sap-tricks-se16n-1.jpg
new file mode 100644
index 0000000..8670223
Binary files /dev/null and b/resources/images/sap-tricks-se16n-1.jpg differ
diff --git a/resources/images/sap-tricks-se16n-2.png 
b/resources/images/sap-tricks-se16n-2.png new file mode 100644 index 0000000..3159901 Binary files /dev/null and b/resources/images/sap-tricks-se16n-2.png differ diff --git a/resources/images/sap-tricks-se16n-3.png b/resources/images/sap-tricks-se16n-3.png new file mode 100644 index 0000000..9751404 Binary files /dev/null and b/resources/images/sap-tricks-se16n-3.png differ diff --git a/resources/images/sap-tricks-se16n-4.png b/resources/images/sap-tricks-se16n-4.png new file mode 100644 index 0000000..cadb822 Binary files /dev/null and b/resources/images/sap-tricks-se16n-4.png differ diff --git a/resources/images/sap-tricks-sq0102-1.png b/resources/images/sap-tricks-sq0102-1.png new file mode 100644 index 0000000..3199275 Binary files /dev/null and b/resources/images/sap-tricks-sq0102-1.png differ diff --git a/resources/images/sap-tricks-sq0102-2.png b/resources/images/sap-tricks-sq0102-2.png new file mode 100644 index 0000000..3aa0b45 Binary files /dev/null and b/resources/images/sap-tricks-sq0102-2.png differ diff --git a/resources/images/sap-tricks-sq02-1.png b/resources/images/sap-tricks-sq02-1.png new file mode 100644 index 0000000..102d165 Binary files /dev/null and b/resources/images/sap-tricks-sq02-1.png differ diff --git a/resources/images/sap-tricks-sq02-2.png b/resources/images/sap-tricks-sq02-2.png new file mode 100644 index 0000000..4871efb Binary files /dev/null and b/resources/images/sap-tricks-sq02-2.png differ diff --git a/resources/images/sap-tricks-sq02-3.png b/resources/images/sap-tricks-sq02-3.png new file mode 100644 index 0000000..716097e Binary files /dev/null and b/resources/images/sap-tricks-sq02-3.png differ diff --git a/resources/images/sap-tricks-sq02-4.jpg b/resources/images/sap-tricks-sq02-4.jpg new file mode 100644 index 0000000..25239ac Binary files /dev/null and b/resources/images/sap-tricks-sq02-4.jpg differ diff --git a/resources/images/sap-tricks-theme-1.png b/resources/images/sap-tricks-theme-1.png new file mode 100644 index 0000000..91cf78d Binary files /dev/null and b/resources/images/sap-tricks-theme-1.png differ diff --git a/resources/images/sap-tricks-theme-2.png b/resources/images/sap-tricks-theme-2.png new file mode 100644 index 0000000..a0592c4 Binary files /dev/null and b/resources/images/sap-tricks-theme-2.png differ diff --git a/resources/scripts/ocr.bash b/resources/scripts/ocr.bash new file mode 100755 index 0000000..e1cfa2b --- /dev/null +++ b/resources/scripts/ocr.bash @@ -0,0 +1,24 @@ +#!/usr/bin/env bash + +# Simple script to OCR multiple PDFs using ocrmypdf. +# Usage: ocrpdf.sh input.pdf + +if [ $# -eq 0 ]; then + echo "Usage: $(basename "$0") input.pdf" + exit 1 +fi + +for f in "$@"; do + # Make sure it's a PDF + if [[ "$f" == *.pdf ]]; then + dir=$(dirname "$f") + base=$(basename "$f" .pdf) + out="${dir}/${base}-ocr.pdf" + + echo "Processing $f -> $out" + ocrmypdf --redo-ocr "$f" "$out" + echo "Created: $out" + else + echo "Skipping non-PDF file: $f" + fi +done diff --git a/resources/scripts/release-notes.bash b/resources/scripts/release-notes.bash old mode 100644 new mode 100755 index 7dc8d62..eeeeece --- a/resources/scripts/release-notes.bash +++ b/resources/scripts/release-notes.bash @@ -1,14 +1,46 @@ #!/bin/bash -# Check for the "final" flag in command line arguments -finalFlag=false +# Usage function to display help +usage() { + echo "Usage: $0 [OPTIONS]" + echo "" + echo "Options:" + echo " -f, --final Use the last 'final' tag as the start point." + echo " -n, --newest Display commits from newest to oldest." 
+ echo " -h, --help Display this help and exit." + echo "" + echo "This script generates release notes from git commits based on tags." + echo "By default, it lists commits from the last tag to HEAD, sorted from oldest to newest." +} + +# Check for command line arguments for arg in "$@"; do - if [[ "$arg" == "final" ]]; then - finalFlag=true - break - fi + case "$arg" in + -f|--final) + finalFlag=true + ;; + -n|--newest) + newestFirst=true + ;; + -h|--help) + usage + exit 0 + ;; + *) + echo "Unknown option: $arg" + usage + exit 1 + ;; + esac done +# Adjust sort order based on the flag +if [ "${newestFirst}" = true ]; then + sortOrder="cat" +else + sortOrder="tail -r" +fi + # Get tags sorted by creation date tags=$(git tag --sort=creatordate) @@ -23,7 +55,7 @@ fi len=${#tagArray[@]} # Find the last tag or the last "final" tag based on the flag -if $finalFlag; then +if [ "${finalFlag}" = true ]; then for (( i=len-1; i>=0; i-- )); do if [[ "${tagArray[i]}" == *"final"* ]]; then latestTag="${tagArray[i]}" @@ -48,10 +80,10 @@ echo "## Release Notes from $latestTag to this release" echo "" # Fetch commit logs from the latest tag to HEAD and categorize them -newfeatures=$(git log "$latestTag"..HEAD --pretty=format:"%s" | grep 'new:' | sed 's/new:/-/g' | sort | uniq) -updatedfeatures=$(git log "$latestTag"..HEAD --pretty=format:"%s" | grep 'update:' | sed 's/update:/-/g' | sort | uniq) -fixedfeatures=$(git log "$latestTag"..HEAD --pretty=format:"%s" | grep 'fix:' | sed 's/fix:/-/g' | sort | uniq) -deletedfeatures=$(git log "$latestTag"..HEAD --pretty=format:"%s" | grep 'delete:' | sed 's/delete:/-/g' | sort | uniq) +newfeatures=$(git log "$latestTag"..HEAD --pretty=format:"%s" | grep 'new:' | sed 's/new:/- /g' | sort | uniq | $sortOrder) +updatedfeatures=$(git log "$latestTag"..HEAD --pretty=format:"%s" | grep 'update:' | sed 's/update:/- /g' | sort | uniq | $sortOrder) +fixedfeatures=$(git log "$latestTag"..HEAD --pretty=format:"%s" | grep 'fix:' | sed 's/fix:/- /g' | sort | uniq | $sortOrder) +deletedfeatures=$(git log "$latestTag"..HEAD --pretty=format:"%s" | grep 'delete:' | sed 's/delete:/- /g' | sort | uniq | $sortOrder) # Output formatted commit lists echo "New Features:" @@ -88,3 +120,4 @@ if [ -z "$deletedfeatures" ]; then else echo "$deletedfeatures" fi +echo "" diff --git a/sbom.md b/sbom.md new file mode 100644 index 0000000..0098e2a --- /dev/null +++ b/sbom.md @@ -0,0 +1,169 @@ +# Software Bill of Material (SBOM) + +## Work order + +### Description + +**When:** evaluating and selecting Software Bill of Materials (SBOM) tools for integration into our workflows, + +**As:** a DevSecOps Engineer Team Lead, + +**I want:** + +- To conduct a market overview of available SBOM tools. +- Test and evaluate SBOM solutions through demos within our Azure DevOps environment. +- Build and document reusable pipeline templates for SBOM generation and validation. + +**This ensures:** + +- Compliance with increasing customer demands for SBOM capabilities. +- Streamlined implementation of SBOM generation in our DevOps pipelines. +- Improved security and transparency of our software supply chain. (insofern wir selber Software bereitstellen) + +### Acceptance Criteria + +1. Market Overview: + + - A comprehensive list of SBOM tools and their key features, including license and approx costs (free, open source, payed, enterprise size costs > kostenlos, vertretbar, arschteuer) + - git repo docs-onboarding, neue sbom.md datei + +2. 
Testing & Evaluation: + + - Successful deployment and execution of SBOM tools in our Azure DevOps environment. + - Demos conducted for at least 3 shortlisted SBOM solutions. + +3. Pipeline Templates: + + - Creation of reusable pipeline templates for SBOM generation in Azure DevOps. + - Inclusion of relevant metadata, such as Licenses, CVEs etc. + - git repo cicd-pipeline-library, new sub-folder "sbom", ment-bold.yml verschieben in den neuen Ordner + +4. Documentation: + + - Step-by-step guide for integrating selected SBOM tools in Azure DevOps pipelines alongside cicd template + - Example configurations if possible + +5. Training and Adoption: + + - Team participation in at least one SBOM-related training webinar (e.g., Cybellum Technologies SBOM Webinar) > schau mal, ob du 2 oder 3 Webinars findest, die sinnvoll sind und an denen wir teilnehmen können + - Internal presentation summarizing findings and providing guidance for SBOM adoption > Präsentation bei einer der kommenden XWare GLs im Bereich Know How zu Beginn + +## Market Overview + +Most used from this list: https://spdx.dev/use/spdx-tools/ + +| Name and Link | Key Features | License | Approx Costs | +| ------------- | ------------ | ------- | ------------ | +| [Microsoft's SBOM Tool](https://github.com/microsoft/sbom-tool) |
+| [Microsoft's SBOM Tool](https://github.com/microsoft/sbom-tool) | • **SBOM Generation**: Scans source folders for dependencies and generates SBOMs.<br>• **CI/CD Integration**: Seamless integration with GitHub Actions and Azure DevOps.<br>• **Validation**: Validates SBOMs and redacts sensitive data. | MIT | Open Source |
+| [Syft](https://github.com/anchore/syft) | • **SBOM Creation**: Builds SBOMs for containers, files, and cloud artifacts.<br>• **Multiple Formats**: Supports SPDX and CycloneDX.<br>• **Ecosystem Integration**: Compatible with Anchore's other tools for security analysis. | Apache-2.0 | Open Source |
+| [ScanCode Toolkit](https://github.com/nexB/scancode-toolkit) | • **License Detection**: Scans for open-source licenses and copyrights.<br>• **Component Identification**: Identifies components, vulnerabilities, and origin data.<br>• **Customizable**: Extensible with plugins and tailored scanning options. | Apache-2.0 | Open Source |
+| [SCANOSS](https://www.scanoss.com) | • **Real-Time Scanning**: Detects open-source components during development.<br>• **Comprehensive Detection**: Uses an extensive database for accurate results.<br>• **APIs for Integration**: Offers APIs for workflow integration. | Proprietary | Free, $35K/year, Custom |
+| [Vigilant Ops](https://www.vigilant-ops.com) | • **SBOM Management**: Manages and tracks SBOMs for transparency.<br>• **Vulnerability Analysis**: Identifies risks in software components.<br>• **Compliance Tools**: Ensures adherence to industry standards. | Proprietary | Unknown |
+| [Threatrix](https://threatrix.io) | • **SCA Analysis**: Monitors and analyzes software components.<br>• **Real-Time Updates**: Detects emerging vulnerabilities.<br>• **Detailed Reporting**: Helps manage security and compliance risks. | Proprietary | Unknown |
+| [Black Duck](https://www.blackduck.com) | • **Component Insights**: Tracks open-source licenses and vulnerabilities.<br>• **Policy Automation**: Creates and enforces usage policies.<br>• **Continuous Monitoring**: Monitors for new threats and compliance issues. | Proprietary | Unknown |
+| [OSS Review Toolkit](https://oss-review-toolkit.org) | • **Dependency Scanning**: Automates open-source dependency analysis.<br>• **Policy Evaluation**: Ensures compliance with organizational policies.<br>• **CI/CD Integration**: Fits into existing pipelines. | Apache-2.0 | Open Source |
+| [Manifest](https://www.manifestcyber.com) | • **SBOM Tools**: Manages and generates SBOMs for software.<br>• **Vulnerability Scans**: Identifies risks in the supply chain.<br>• **Compliance Support**: Helps meet regulatory standards. | Proprietary | Unknown |
+| [Lib4SBOM](https://github.com/anthonyharrison/lib4sbom) | • **Library for SBOMs**: Simplifies SBOM creation in various formats.<br>• **Standard Support**: Compatible with SPDX and CycloneDX.<br>• **Development Friendly**: Easy integration with workflows. | Apache-2.0 | Open Source |
+| [GUAC](https://guac.sh) | • **SBOM Aggregation**: Consolidates SBOMs into a unified graph.<br>• **Provenance Tracking**: Tracks the origin of software components.<br>• **Querying**: Provides deep insights into dependencies. | Apache-2.0 | Open Source |
+| [FOSSology](https://www.fossology.org) | • **License Scanning**: Detects and analyzes software licenses.<br>• **Metadata Extraction**: Extracts copyright and component details.<br>• **Custom Workflows**: Supports flexible compliance processes. | GPL-2.0 / LGPL-2.1 | Open Source |
+| [DISTRO2SBOM](https://github.com/anthonyharrison/distro2sbom) | • **Distribution Focused**: Creates SBOMs for Linux distributions.<br>• **Comprehensive Scans**: Analyzes all installed packages.<br>• **Standards Compatible**: Supports SPDX and CycloneDX formats. | Apache-2.0 | Open Source |
+| [CycloneDX](https://github.com/CycloneDX) | • **SBOM Standard**: Defines a standardized SBOM format.<br>• **Extensive Tooling**: Libraries and tools for CycloneDX SBOMs.<br>• **Broad Adoption**: Industry-standard for supply chain transparency. | Apache-2.0 | Open Source |
+| [CAST SBOM Manager](https://www.castsoftware.com/sbommanager) | • **Centralized Management**: Manages SBOMs from various tools.<br>• **Vulnerability Tracking**: Monitors components for security issues.<br>• **Compliance Features**: Generates reports for regulatory requirements. | Proprietary | Free |
+| [Dependency Track](https://dependencytrack.org) | • **Continuous Analysis**: Analyzes SBOMs for vulnerabilities.<br>• **Ecosystem Integration**: Works with CycloneDX SBOMs.<br>• **Comprehensive Monitoring**: Tracks components for new risks. | Apache-2.0 | Open Source |
+| [Trivy](https://trivy.dev) | • **Vulnerability Scanning**: Scans containers, dependencies, and code.<br>• **SBOM Support**: Generates and analyzes SBOMs.<br>• **Broad Compatibility**: Works across multiple platforms and CI/CD tools. | Apache-2.0 | Open Source |
+| [Parlay](https://github.com/snyk/parlay) | • **SBOM Enhancements**: Improves and consolidates SBOM data.<br>• **Integration Ready**: Supports Snyk tools and others.<br>• **Scalability**: Handles large-scale SBOMs efficiently. | Apache-2.0 | Open Source |
+| [Finite State](https://finitestate.io) | • **SBOM Automation**: Automates SBOM creation and management.<br>• **Vulnerability Analysis**: Identifies and mitigates risks.<br>• **Compliance Features**: Meets regulatory requirements. | Proprietary | Unknown |
+| [Checkmarx](https://checkmarx.com/product/sbom/) | • **SBOM Creation**: Generates SBOMs with detailed component analysis.<br>• **Security Focus**: Prioritizes identifying vulnerabilities.<br>• **Policy Compliance**: Ensures adherence to internal policies. | Proprietary | Unknown |
+| [Qwiet](https://qwiet.ai) | • **Real-Time Scans**: Monitors open-source components during CI/CD.<br>• **AI-Driven Analysis**: Leverages AI for threat detection.<br>• **Comprehensive Reporting**: Details vulnerabilities and compliance. | Proprietary | Unknown |
+| [Snyk](https://snyk.io) | • **SBOM Support**: Integrates SBOM generation with its security tools.<br>• **Vulnerability Scans**: Identifies threats in open-source and proprietary code.<br>• **Policy Compliance**: Assists in maintaining secure supply chains. | Proprietary | Unknown |
+| [SBOM Observer](https://sbom.observer) | • **Visualization**: Visualizes SBOM data for better understanding.<br>• **Collaboration**: Designed for team use with access controls.<br>• **Multi-Tier Plans**: Offers flexible subscription options. | Proprietary | €49/user/month, €69/user/month, Custom |
+| [SOOS](https://soos.io) | • **Affordable Security**: Provides low-cost vulnerability analysis.<br>• **SBOM Tools**: Creates and manages SBOMs efficiently.<br>• **Developer Focus**: Tailored for small to medium teams. | Proprietary | $0/month, $90/month, Custom |
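+
+For a quick local sanity check of the open-source candidates before the demos, generating an SBOM from a source folder looks roughly like this (a sketch; flags may differ between tool versions):
+
+```bash
+# Syft: scan a directory and emit an SPDX JSON SBOM
+syft dir:. -o spdx-json > sbom-syft.spdx.json
+
+# Trivy: scan the filesystem and emit a CycloneDX SBOM
+trivy fs --format cyclonedx --output sbom-trivy.cdx.json .
+```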
+
+## Testing & Evaluation
+
+| Name and Link | Result |
+| ------------- | ------ |
+| [Microsoft's SBOM Tool](https://github.com/microsoft/sbom-tool) | Simple and easy to install and use. Very good result: every package was recognized, including license and vulnerability information. With the [SBOM Tool Azure DevOps Extension](https://marketplace.visualstudio.com/items?itemName=rhyskoedijk.sbom-tool), a very nice graphical report is integrated directly into the pipeline log. |
+| [Syft](https://github.com/anchore/syft) | Simple and easy to install and use. Poor result: binaries were detected instead of packages, with multiple duplicates. The result is difficult to evaluate. No license information. No graphical report provided. |
+| [ScanCode Toolkit](https://github.com/nexB/scancode-toolkit) | Simple and easy to install and use. Poor result: binaries were detected instead of packages. No license or vulnerability information. The result is difficult to evaluate. A graphical report is available via an external tool. |
+
+Further tests are therefore carried out with [Microsoft's SBOM Tool](https://github.com/microsoft/sbom-tool).
+
+## Pipeline Templates
+
+The [SBOM Tool Azure DevOps Extension](https://marketplace.visualstudio.com/items?itemName=rhyskoedijk.sbom-tool) already provides a simple task call with all needed parameters, so no separate pipeline template is required.
+
+## Documentation
+
+### Install Extension
+
+Appropriate permissions, or an authorization request, are required to install the [SBOM Tool Azure DevOps Extension](https://marketplace.visualstudio.com/items?itemName=rhyskoedijk.sbom-tool).
+
+### Use in pipeline
+
+After installation, a task in the pipeline can look like the following example:
+
+```yaml
+- task: sbom-tool@1
+  displayName: 'Generate SBOM Manifest'
+  inputs:
+    command: 'generate'
+    buildSourcePath: '$(Build.SourcesDirectory)'
+    buildArtifactPath: '$(Build.ArtifactStagingDirectory)'
+    enableManifestSpreadsheetGeneration: true
+    enableManifestGraphGeneration: true
+    enablePackageMetadataParsing: true
+    fetchLicenseInformation: true
+    fetchSecurityAdvisories: true
+    gitHubConnection: 'GitHubForSandbox'
+    packageSupplier: 'MyOrganisation'
+    packageName: 'MyPackage'
+    packageVersion: '$(Build.BuildNumber)'
+```
+
+A complete example:
+
+```yaml
+jobs:
+  - job: publish
+    steps:
+      - task: DotNetCoreCLI@2
+        displayName: 'Publish project'
+        inputs:
+          command: 'publish'
+          publishWebProjects: true
+          arguments: '--output "$(Build.ArtifactStagingDirectory)"'
+
+      - task: sbom-tool@1
+        displayName: 'Generate project SBOM manifest'
+        inputs:
+          command: 'generate'
+          buildSourcePath: '$(Build.SourcesDirectory)'
+          buildArtifactPath: '$(Build.ArtifactStagingDirectory)'
+          enableManifestSpreadsheetGeneration: true
+          enableManifestGraphGeneration: true
+          enablePackageMetadataParsing: true
+          fetchLicenseInformation: true
+          fetchSecurityAdvisories: true
+          gitHubConnection: 'GitHub Advisory Database Connection'
+          packageSupplier: 'MyOrganisation'
+          packageName: 'MyPackage'
+          packageVersion: '$(Build.BuildNumber)'
+
+      - task: PublishBuildArtifacts@1
+        displayName: 'Publish artifacts'
+        inputs:
+          PathtoPublish: '$(Build.ArtifactStagingDirectory)'
+          ArtifactName: 'drop'
+          publishLocation: 'Container'
+```
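+
+For local experiments outside a pipeline, the same manifest can also be generated with the `sbom-tool` CLI. A minimal sketch, assuming the build output lives in `./drop` and using placeholder package metadata (check the tool's help for the exact flags of your version):
+
+```bash
+# -b: build drop folder (the manifest is written here)
+# -bc: source folder to scan for components
+# -pn/-pv/-ps: package name, version, and supplier (placeholders)
+# -nsb: namespace base URI (placeholder)
+sbom-tool generate -b ./drop -bc . -pn MyPackage -pv 1.0.0 -ps MyOrganisation -nsb https://example.com/sbom
+```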
+
+## Training and Adoption
+
+Possible webinars:
+
+- https://jfrog.com/webinar/creation-of-your-software-bill-of-materials-sbom/
+- https://www.mend.io/resources/webinars/sboms-a-critical-tool-for-modern-organizations/
+- https://www.medcrypt.com/private/webinars-conferences/i-have-an-sbom-now-what
+- https://openchainproject.org/news/2024/10/01/coming-soon-webinar-on-sbom-visualization
+- https://www.cybeats.com/blog/5-key-takeaways-from-microsoft-and-googles-webinar-on-sbom
diff --git a/stages.md b/stages.md
new file mode 100644
index 0000000..2a8f508
--- /dev/null
+++ b/stages.md
@@ -0,0 +1,11 @@
+# Stages
+
+**Ampere (Traffic Router):** The conductor ensuring that the flow of development, testing, and deployment processes is directed to the correct stage efficiently and effectively.
+
+1. **Volt (Development):** This is the initial stage where new features and fixes are developed and tested. It is the foundation of your application, just as the volt is a fundamental unit of electrical potential.
+
+2. **Var (Staging):** In this stage, code is rigorously tested in an environment that mimics the production setting. Var, the unit of reactive power, resonates with this stage's role in ensuring that the system will react effectively under various conditions.
+
+3. **Watt (Production):** The final stage, where the application is live and accessible to end users. Named after the unit of power, this stage is where the system's full capabilities are utilized.
+
+This naming scheme maintains a consistent theme while capturing the essence of each stage.
diff --git a/traceability-concept.md b/traceability-concept.md
new file mode 100644
index 0000000..5731277
--- /dev/null
+++ b/traceability-concept.md
@@ -0,0 +1,23 @@
+# Traceability Concept
+
+![Traceability Concept](resources/diagrams/traceability-concept.png)
+
+## Description of the layers
+
+**Stakeholder Requirements**: It is crucial to understand that we always start with the stakeholder requirements, for which regulations are an important input. This is the customer's point of view and defines “what” should be done.
+
+**System Requirements**: The system requirements are the answer to the stakeholder requirements. They describe “how” it is going to be implemented. Quality requirements, often called non-functional requirements, are part of the system requirements. They are stated as testable requirements, e.g. the latency must be less than 200 ms. Furthermore, we also add potential risks at this level.
+
+**Functional Specification**: ... to be continued
+
+**Configuration Specification**: ... to be continued
+
+## Implementation in Azure DevOps
+
+We are following the CMMI process template in Azure DevOps.
+
+Regulations are mapped as requirement work items in Azure DevOps that affect other requirements. In that way, we see all regulations that affect a requirement and all requirements that are affected by a regulation. This is bidirectional traceability.
+
+All requirements are listed in a use-case-specific document. The requirements are linked to the regulations and the tests, and the tests are linked to the requirements. This is bidirectional traceability.
+
+This results in a traceability matrix that documents the requirements from regulation through to testing.
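+
+One possible way to script these links is the Azure CLI with the Azure DevOps extension. A minimal sketch with placeholder work item IDs; the exact link type names depend on the process template (“Affected By” and “Tested By” are CMMI link types):
+
+```bash
+# Link requirement 123 to the regulation that affects it (456); IDs are placeholders
+az boards work-item relation add --id 123 --relation-type "Affected By" --target-id 456
+
+# Link requirement 123 to the test case that verifies it (789)
+az boards work-item relation add --id 123 --relation-type "Tested By" --target-id 789
+```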
diff --git a/versioning.md b/versioning.md
index 1e98b50..ec6b637 100644
--- a/versioning.md
+++ b/versioning.md
@@ -64,3 +64,23 @@ bash resources/scripts/release-notes.bash
 ```

 As we don't want to rely 100% on the script output and want to have the possibility to add some manual notes, we will add the release notes to the `CHANGELOG.md` file.
+
+To have the script at hand whenever you need it, you can add the folder where it is located to your `PATH` environment variable. This is an example of an Ansible snippet for macOS-based systems, assuming you have cloned this repository into the `Development` subfolder of your home directory:
+
+```yaml
+- name: Add release-notes script to path
+  lineinfile:
+    path: "/Users/{{ macos_user }}/.zshrc"
+    line: 'export PATH=$PATH:/Users/{{ macos_user }}/Development/docs-onboarding/resources/scripts'
+    state: present
+```
+
+And this would be the equivalent for Windows:
+
+```yaml
+- name: Add release-notes script to path
+  win_path:
+    elements:
+      - C:\Users\{{ windows_user }}\Development\docs-onboarding\resources\scripts
+    state: present
+```
diff --git a/vms-and-lxcs.md b/vms-and-lxcs.md
new file mode 100644
index 0000000..a493682
--- /dev/null
+++ b/vms-and-lxcs.md
@@ -0,0 +1,97 @@
+# Basic Know-How about VMs and LXC containers
+
+## Microsoft Azure compatible Linux Distributions
+
+We love Debian, but it's not well supported by Microsoft Azure. Thus we go mainstream once in a while and use Ubuntu if VMs must be connected to Azure.
+
+## User Management
+
+- Limit root login to the console only, as an emergency fallback. You can then log in via the Proxmox console.
+- Create an `ansible` super user with sudo rights; allow SSH access by keys only. This user is used for maintenance and configuration.
+- Create a normal user `debian` with restricted privileges; also allow SSH by keys only. This one can be used for normal system tasks.
+- Disallow password-based SSH logins for all users besides root.
+- Periodically review SSH logs for unauthorized access attempts.
+
+## IaC vs. CaC
+
+Infrastructure as Code (IaC) is how we deploy virtual bare metal. We are using Terraform for that.
+
+Configuration as Code (CaC) is how we configure the VMs and LXC containers and install software. We are using Ansible for that.
+
+We are defining the boundary between IaC and CaC as follows:
+
+- IaC is responsible for the VMs and LXC containers, the network, and the storage.
+- IaC ends as soon as the VMs and LXC containers are up and running.
+- SSH keys are installed by IaC.
+- CaC is responsible for the software installed on the VMs and LXC containers.
+- CaC uses the SSH keys installed by IaC to connect to the VMs and LXC containers.
+
+## IaC Terraform Proxmox Provider
+
+The [Proxmox Terraform Provider](https://github.com/Telmate/terraform-provider-proxmox) is not mature enough yet. Thus we use the [Proxmox VE Helper-Scripts](https://community-scripts.github.io/ProxmoxVE/scripts).
+
+The remaining text in this chapter consists of notes and references in case the provider matures and we switch in the future.
+
+Our hypervisor is Proxmox, which is based on Debian. We are using the [Proxmox cloud-init](https://pve.proxmox.com/wiki/Cloud-Init_Support) template for Ubuntu.
+
+We are using small server images to keep the attack surface small. The cloud-init template is a server Ubuntu image with cloud-init installed. Get the URL from the Ubuntu website and download it to the Proxmox server's local storage for ISO images. Ubuntu website link: [https://cloud-images.ubuntu.com/releases/](https://cloud-images.ubuntu.com/releases/). We are going for the file ending in `*server-cloudimg-amd64.img`.
+
+We are also using Ubuntu for the LXC containers. For that, we use the latest Ubuntu standard LXC template, which you can download via the Proxmox web interface.
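+
+For reference, turning a downloaded cloud image into a reusable Proxmox template roughly follows the steps below. This is a sketch based on the linked Proxmox cloud-init documentation; the VM ID `9000`, the storage name `local-lvm`, and the image release are placeholders:
+
+```bash
+# Download the Ubuntu server cloud image (release URL is a placeholder)
+wget https://cloud-images.ubuntu.com/releases/noble/release/ubuntu-24.04-server-cloudimg-amd64.img
+
+# Create an empty VM and import the image as its system disk
+qm create 9000 --name ubuntu-cloudinit-template --memory 2048 --net0 virtio,bridge=vmbr0
+qm importdisk 9000 ubuntu-24.04-server-cloudimg-amd64.img local-lvm
+qm set 9000 --scsihw virtio-scsi-pci --scsi0 local-lvm:vm-9000-disk-0
+
+# Attach a cloud-init drive and make the imported disk bootable
+qm set 9000 --ide2 local-lvm:cloudinit --boot c --bootdisk scsi0 --serial0 socket --vga serial0
+
+# Convert the VM into a template that new VMs can be cloned from
+qm template 9000
+```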
+
+## SSH keys
+
+SSH keys are managed via the approach described in the [infra-terraform-sshkeyvault](https://xwr.visualstudio.com/jambor.pro/_git/infra-terraform-sshkeyvault) repository. As of now, we create them one by one with the provided scripts.
+
+## Create an LXC container
+
+- Make use of the [Azure Naming Tool](https://app-azurenamingtool-dev-bnhfgbdgafeqh2gf.switzerlandnorth-01.azurewebsites.net/) to get a suitable name for the LXC container. We use the same schema as for virtual machines, e.g. `vm-mal-dev-opr-1`.
+- Create a new SSH key according to the [infra-terraform-sshkeyvault](https://xwr.visualstudio.com/jambor.pro/_git/infra-terraform-sshkeyvault) repository. Use a name from the naming tool, e.g. `kvs-mal-dev-opr-1`.
+- Search for a pre-defined template or the latest Debian / Ubuntu empty template: [Proxmox VE Helper-Scripts](https://community-scripts.github.io/ProxmoxVE/scripts)
+- Review the script and check that you understand it and that no malicious code is in it. (ha ha, we all do that, right?)
+- Execute the script on the Proxmox server's shell via the web interface. SSH is not advised for that.
+- Use advanced settings like the example below.
+
+```bash
+  🧩 Using Advanced Settings on node prd-proxmox-2
+  🖥️ Operating System: debian
+  🌟 Version: 12
+  📦 Container Type: Unprivileged
+  🔐 Root Password: ********
+  🆔 Container ID: 101
+  🏠 Hostname: vm-mal-dev-opr-1
+  💾 Disk Size: 64 GB
+  🧠 CPU Cores: 1
+  🛠️ RAM Size: 2048 MiB
+  🌉 Bridge: vmbr0
+  📡 IP Address: dhcp
+  🌐 Gateway IP Address: Default
+  📡 APT-Cacher IP Address: Default
+  🚫 Disable IPv6: yes
+  ⚙️ Interface MTU Size: Default
+  🔍 DNS Search Domain: Host
+  📡 DNS Server IP Address: Host
+  🏷️ Vlan: 7
+  📡 Tags: ;
+  🔑 Root SSH Access: yes
+  🔍 Verbose Mode: yes
+```
+
+- **Important:** add the public SSH key to the LXC in the process to enable SSH via key.
+- If the container exposes an HTTP(S) service, put Traefik in front of it if you want to access it from outside. See the [Proxmox VE Helper-Scripts](https://community-scripts.github.io/ProxmoxVE/scripts) for examples.
+
+If you cannot choose Ubuntu as the distribution but must connect the system to Azure, create an empty Ubuntu LXC instead and install the desired service on top of it.
+
+- Create an LXC within the Proxmox web interface and use the latest Ubuntu LTS template.
+- **Important networking note:** using IPv6 DHCP causes the network to stop working, as the lease does not seem to be renewed. Keep IPv6 static; IPv4 can use DHCP.
+- Ensure that you set the right vnet ID according to the [networking instructions](network.md).
+- SSH into the LXC container, making use of the SSH key.
+- Install whatever you need to install. Preferably use Ansible for that.
+
+## Create a VM
+
+- ...
+
+## Add new resource to Ansible repository
+
+We are maintaining VMs and LXCs with Ansible. Add the newly created VM or LXC to the Ansible inventory [infra-ansible-serverconfiguration](https://xwr.visualstudio.com/jambor.pro/_git/infra-ansible-serverconfiguration).
diff --git a/welcome.md b/welcome.md
index 8c5ad41..6768dac 100644
--- a/welcome.md
+++ b/welcome.md
@@ -1,7 +1,3 @@
 # Welcome to the team

-Welcome message and introduction to the wiki.
-
-## Team Structure
-
-Breakdown of the team's structure and roles.
+Welcome to the XWare Azure DevOps documentation repository. This repository serves as a comprehensive resource for all things related to DevOps practices within our organization. Here, you will find detailed guides, best practices, and documentation to help you navigate and excel in your DevOps journey.