diff --git a/analysis/README.md b/analysis/README.md
index ca3dff5..dc2f65c 100644
--- a/analysis/README.md
+++ b/analysis/README.md
@@ -1,48 +1,71 @@
# CNCF TechDocs Analysis for OSS Projects
-This directory contains:
+## Purpose
-- Analyses of the technical documentation for selected CNCF incubating and graduated software projects.
-- Tools (templates, analysis criteria, background information) to enable a mid- to senior-level technical writer to perform an analysis independently with some support from the CNCF tech docs staff.
+The goals of a CNCF technical documentation analysis are to:
-## Project Analyses
+- Examine the current project technical documentation and website against the CNCF's analysis framework, as described in the doc analysis [criteria](./criteria.md).
+- Compare the documentation against the current or proposed maturity level for the overall project.
+- Recommend a program of key improvements with the largest return on investment. These improvements are documented as *recommendations* in the analysis document and expanded in a companion [implementation plan](./implementation-template.md) and [issues backlog](./umbrella-issue-template.md).
-There are two rounds of projects:
+## Contents
-1. Analyses **0001** - **0007** are a first round of projects completed as "assessments" through the CNCF Help Desk. The file `000N-projectname.md` file is the sole artifact of the assessment in each case. The last one was added in May 2023.
-2. Subsequent analyses were commissioned starting in November 2023. Each has its own directory, `00NN-projectname`, containing three analysis artifacts:
- - `projectname-analysis.md` evaluates the project documentation and provides comments and recommendations in a manner very similar to the first round of tech doc assessments. This document is based on the analysis template and accompanying criteria developed for the first round.
+In this directory:
+
+- **Project Analyses**: `analysis` contains analyses of the technical documentation for selected CNCF incubating and graduated software projects.
+- **Analysis Tools**: `analysis-tools` contains instructions, templates, analysis criteria, and background information to enable a mid- to senior-level technical writer to perform an analysis independently with some support from the CNCF tech docs staff.
+
+### Project Analyses
+
+Completed analyses are contained in the `analysis` directory.
+
+There are two rounds of projects, *Round 1* and *Round 2*.
+
+#### Round 1
+
+Analyses **0001** - **0007** are a first round of projects completed as "assessments" through the CNCF Help Desk. The `000N-projectname.md` file is the sole artifact of the assessment in each case. The last one was added in May 2023.
+
+#### Round 2
+
+Subsequent analyses were commissioned starting in November 2023. Each has its own directory, `00NN-projectname`, containing three analysis artifacts:
+ - `projectname-analysis.md` evaluates the project documentation and provides comments and recommendations in a manner very similar to the Round 1 tech doc assessments. This document is based on the analysis template and accompanying criteria developed for Round 1.
 - `projectname-implementation.md` provides concrete actions that can be implemented by project contributors. Its focus is on specific, achievable work that will have a strong positive impact on document effectiveness.
 - `projectname-issues.md` is a backlog of improvement actions, meant to be entered as GitHub Issues, derived from `projectname-implementation.md`.
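+
+For example (the index `0042` and the project name *widget* are hypothetical placeholders, not a real analysis), a Round 2 analysis directory would be laid out as:
+
+```text
+analysis/0042-widget/
+├── widget-analysis.md        # evaluation against the CNCF criteria, with recommendations
+├── widget-implementation.md  # actionable improvement plan
+└── widget-issues.md          # issue backlog, ready to enter as GitHub issues
+```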
-## Analyst Tools
+### Analysis Tools
-Read and follow the guidelines in the `analyst-tools` directory to perform a documentation analysis for CNCF. These guidelines provide:
-- A relatively objective set of criteria (a "rubric") for evaluating technical documentation.
-- An attempt the documentation analysis to the current (or proposed) maturity level for the overall project.
-- A consistent set of criteria on which to evaluate existing documentation and website content, infrastructure, and support.
-- Emphasis on effective documentation that serves all users associated with the project.
+Templates and instructions for doing the analyses are contained in the `analysis-tools` directory.
-### How-to
+#### Audience
-This document contains a general discussion of the CNCF Tech Docs analysis program and instructions for requesting, writing, and consuming an analysis.
+This directory is primarily for members of the CNCF TechDocs team, including contractors or consultants, who need to conduct or assist with an analysis of a CNCF open-source project's technical documentation. Readers in other roles can also benefit from understanding the guidelines in this directory:
-### Criteria
+- **Project maintainers** can learn how improved technical documentation can increase the effectiveness of the project software, speed adoption, and improve user satisfaction.
+- **CNCF Foundation members** can learn what benefits can (and cannot) be expected of a documentation improvement effort.
+- **Members of other open-source software foundations** can use the analysis tools as a model, in whole or in part, for their own documentation improvement processes. (Please contact the Cloud Native Computing Foundation to discuss licensing and permission.)
+- **Project contributors** can learn what factors go into improving technical documentation and what is expected of contributors who work on project documentation issues.
-This document describes the criteria used to evaluate a project's technical documentation and website. These criteria are also referred to as a "rubric" elsewhere in this repo.
+#### Contents
-### Templates
+Use the guidelines and templates in this directory to perform a documentation analysis for CNCF. These materials provide:
+- A relatively objective set of criteria (a "rubric") for evaluating existing documentation and website content, infrastructure, and support.
+- An attempt to make the documentation analysis appropriate to the current (or proposed) maturity level for the overall project.
+- Emphasis on effective documentation that serves all users associated with the project.
-These are templates for the analysis artifacts.
+##### How-to
-#### Analysis Template
+`howto.md` contains instructions for requesting, writing, and consuming an analysis.
-This is the main analysis template, based on the work of the original 2021-23 tech docs assessments.
+##### Criteria
-#### Implementation Plan
+`criteria.md` describes the criteria used to evaluate a project's technical documentation and website. These criteria are also referred to as a "rubric" elsewhere in this repo. The criteria are unchanged between the first and second rounds of analyses.
-The implementation plan is an intermediate step between the analysis and the issues backlog, meant as an aid to digesting the analysis recommendations into actionable issues.
+##### Templates
-#### Issues
+These are templates for the analysis artifacts.
-This is the final backlog of recommended changes to the documentation, meant to be transferred more or less directly into the GitHub Issues of the project documentation repo.
\ No newline at end of file
+- **Analysis Template**: `analysis-template.md` is the main analysis template, based on the work of the original 2021-23 tech docs assessments.
+- **Implementation Plan**: The implementation plan, represented in `implementation-template.md`, is an intermediate step between the analysis and the issues backlog, meant as an aid to digesting the analysis recommendations into actionable issues.
+- **Issues**: This is the final backlog of recommended changes to the documentation, meant to be transferred directly into the GitHub Issues of the project documentation repo. There are two templates:
+  - `issue-template.md` is a template for individual issues that can be used to create issues in GitHub.
+  - `umbrella-issue-template.md` can be used to create an umbrella issue in GitHub, and can also be used as a template for a `_PROJECT_-issues.md` document to be included in the analysis pull request.
\ No newline at end of file
diff --git a/analysis/analysis-tools/New CNCF tech doc artifacts.md b/analysis/analysis-tools/New CNCF tech doc artifacts.md
deleted file mode 100644
index 90e27e2..0000000
--- a/analysis/analysis-tools/New CNCF tech doc artifacts.md
+++ /dev/null
@@ -1,54 +0,0 @@
-# New CNCF tech doc artifacts
-
-## Existing
-
-How-to
-Template
-Criteria
-
-1. Revise template to match what I've done
-2. Explain how to build out the umbrella and sub-issues
-3. Create another folder for the process
-
-## Artifacts:
-README: preamble, theory, objectives, audience, lay out the program. Then link to:
-
-How-to: checklist or recipe. CNCF requirements -- what does Nate need?
-
-Template: update.
-
-Criteria: how do you evaluate?
-
-Scoring/ranking - purposes:
-- Appearance of objectivity
-- Maturity level of software
-- Shame maintainers into taking doc seriously
-- Expected
-
-Scoring: needed or not?
-
-| Reason | Importance | Notes |
-| --- | --- | --- |
-| Objectivity | | Gives at least the appearance of an objective evaluation (that's why I started using the term "rubric", btw) |
-| Objectivity, appearance of | | Gives at least the appearance of an objective evaluation (that's why I started using the term "rubric", btw) |
-| Shame maintainers | | |
-| Match software maturity | | Agree that for incubating and graduated projects |
-| Expected | | Quantitative analysis vs. Qualitative? |
-| | | |
-
-Evaluate on effort vs impact?
-
-Documentation for Developers
-
-## Audiences:
-1. Maintainers - what's in it for me?
-2. Analyst - doing the analysis work - what is "done"?
-3. Foundation (CNCF) - keep the customer satisfied - should address expectation.
-4. Foundation (other) - want to be able to use as a model for other orgs.
-5. Contributors - writers who pick up the work -- need to define what we expect of these people.
-
-## Repo Outline
-
-- **cncf-techdocs**
-  - **docs**: Documentation
-  - 
\ No newline at end of file
diff --git a/analysis/analysis-tools/analysis-template.md b/analysis/analysis-tools/analysis-template.md
new file mode 100644
index 0000000..6d2cbd3
--- /dev/null
+++ b/analysis/analysis-tools/analysis-template.md
@@ -0,0 +1,476 @@
+---
+title: _PROJECT_ Documentation Analysis
+tags: _PROJECT_
+created: YYYY-MM-DD
+modified: YYYY-MM-DD
+author: _NAME_ (@_HANDLE_)
+---
+
+
+# Introduction
+
+This document analyzes the effectiveness and completeness of the [_PROJECT_][project-website] open source software (OSS) project's documentation and website. It is funded by CNCF as part of its overall effort to incubate, grow, and graduate open source cloud native software projects.
+
+According to CNCF best practices guidelines, effective documentation is a prerequisite for program graduation. The documentation analysis is the first step of a CNCF process aimed at assisting projects with their documentation efforts.
+
+## Purpose
+
+This document was written to analyze the current state of _PROJECT_ documentation. It aims to provide project leaders with an informed understanding of potential problems in current project documentation. A second document, `_PROJECT_-implementation.md`, outlines an actionable plan for improvement. A third document, `_PROJECT_-issues.md`, enumerates a backlog of issues to be added to the project documentation repository. These issues can be taken up by contributors to improve the documentation.
+
+This document:
+
+- Analyzes the current _PROJECT_ technical documentation and website
+- Compares existing documentation against the CNCF's standards
+- Recommends a program of key improvements with the largest return on investment
+
+## Scope of analysis
+
+The documentation discussed here includes the entire contents of the website, the technical documentation, and documentation for contributors and users on the _PROJECT_ GitHub repository.
+
+The _PROJECT_ website and documentation are written in [Markdown, reStructuredText, other] and are compiled using the [Hugo, Docusaurus, Sphinx, other] static site generator with the [Docsy, other] theme and served from [the Netlify platform, other]. The site's code is stored on the _PROJECT_ GitHub repo.
+
+**In scope:**
+- Website: _PROJECT-WEBSITE_
+- Documentation: _PROJECT-DOC-URL_
+- Website repo: _PROJECT-DOC-REPO_
+- _[Other; might include a demo server, governance site, or other relevant repositories]_
+
+**Out of scope:**
+- Other _PROJECT_ repos: _[In general, do not include sub-projects or related "ecosystem" projects]_
+
+
+## How this document is organized
+
+This document is divided into three sections that represent three major areas of concern:
+
+- **Project documentation:** concerns documentation for users of the _PROJECT_ software, aimed at people who intend to use the project software
+- **Contributor documentation:** concerns documentation for new and existing contributors to the _PROJECT_ OSS project
+- **Website:** concerns the mechanics of publishing the documentation, and includes branding, website structure, and maintainability
+
+Each section begins with summary ratings based on a rubric with appropriate [criteria][criteria-doc] for the section, then proceeds to:
+- **Comments**: observations about the existing documentation, with a focus on how it does or does not help _PROJECT_ users achieve their goals.
+- **Recommendations**: suggested changes that would improve the effectiveness of the documentation.
+
+An accompanying document, [`_PROJECT_-implementation.md`][implementation-doc], breaks the recommendations down into concrete actions that can be implemented by project contributors. Its focus is on drilling down to specific, achievable work that can be completed in constrained blocks of time. Ultimately, the implementation items are decomposed into a series of [issues][issues-doc] and entered as issues in the project's documentation repository on GitHub.
+
+
+## How to use this document
+
+Readers interested only in actionable improvements should skip this document and read the [implementation plan][implementation-doc] and [issues list][issues-doc].
+
+Readers interested in the current state of the documentation and the reasoning behind the recommendations should read the section of this document pertaining to their area of concern:
+
+- [Project documentation][project-heading]
+- [Contributor documentation][contributor-heading]
+- [Website and documentation infrastructure][website-heading]
+
+Examples of CNCF documentation that demonstrate the analysis criteria are linked from the [criteria][criteria-doc] specification.
+
+
+### Recommendations, requirements, and best practices
+
+This analysis measures documentation against CNCF project maturity standards and suggests possible improvements. In most cases there is more than one way to do things. Few recommendations here are meant to be prescriptive. Rather, the recommended implementations represent the reviewers' experience with how to apply documentation best practices. In other words, borrowing terminology from the lexicon of [RFCs][rfc-spec], the changes described here should be understood as "recommended" or "should" at the strongest, and "optional" or "may" in many cases. Any "must" or "required" actions are clearly denoted as such, and pertain to legal requirements such as copyright and licensing issues.
+
+
+# Project documentation
+
+
+_PROJECT_ is a **graduated** project of CNCF. This means that the project should have [*very high*][criteria-doc] standards for documentation.
+
+_PROJECT_ is an **incubating** project of CNCF. This means that the project should be [*developing*][criteria-doc] professional-quality documentation alongside the project code.
+
+| Criterion | Rating (1-5) |
+| --- | --- |
+| Information architecture | (rating value) |
+| New user content | (rating value) |
+| Content maintainability | (rating value) |
+| Content creation processes | (rating value) |
+| Inclusive language | (rating value) |
+
+
+## Comments
+
+
+The following sections contain brief assessments of each element of the Project Documentation rubric.
+
+
+### Information architecture
+
+The overall structure (pages/subpages/sections/subsections) of your project documentation. We evaluate on the following:
+
+* Is there high level conceptual/“About” content?
+* Is the documentation feature complete? (i.e., each product feature is documented)
+* Are there step-by-step instructions (tasks, tutorials) documented for features?
+* Are there any key features which are documented but missing task documentation?
+* Is the “happy path”/most common use case documented?
+* Does task and tutorial content demonstrate atomicity and isolation of concerns? (Are tasks clearly named according to user goals?)
+* If the documentation does not suffice, is there a clear escalation path for users needing more help?
(FAQ, Troubleshooting) +* If the product exposes an API, is there a complete reference? +* Is content up to date and accurate? + +### New user content + +New users are the most avid users of documentation, and need content specifically for them. We evaluate on the following: + +* Is “getting started” clearly labeled? (“Getting started”, “Installation”, “First steps”, etc.) +* Is installation documented step-by-step? +* If needed, are multiple OSes documented? +* Do users know where to go after reading the getting started guide? +* Is your new user content clearly signposted on your site’s homepage or at the top of your information architecture? +* Is there easily copy-pastable sample code or other example content? + + +### Content maintainability & site mechanics + +As a project scales, concerns like localized (translated) content and versioning become large maintenance burdens, particularly if you don’t plan for them. + +We evaluate on the following: + +* Is your documentation searchable? +* Are you planning for localization/internationalization with regards to site directory structure? Is a localization framework present? +* Do you have a clearly documented method for versioning your content? + + +### Content creation processes + +Documentation is only as useful as it is accurate and well-maintained, and requires the same kind of review and approval processes as code. + +We evaluate on the following: + +* Is there a clearly documented (ongoing) contribution process for documentation? +* Does your code release process account for documentation creation & updates? +* Who reviews and approves documentation pull requests? +* Does the website have a clear owner/maintainer? + + +### Inclusive language + +Creating inclusive project communities is a key goal for all CNCF projects. + +We evaluate on the following: + +* Are there any customer-facing utilities, endpoints, class names, or feature names that use non-recommended words as documented by the [Inclusive Naming Initiative](https://inclusivenaming.org) website? +* Does the project use language like "simple", "easy", etc.? + + +## Recommendations + + + +### Information architecture + +### New user content + +### Content maintainability & site mechanics + +### Content creation processes + +### Inclusive language + + +# Contributor documentation + + +_PROJECT_ is a **graduated** project of CNCF. This means that the project should have [*very high*][criteria-doc] standards for documentation. + +_PROJECT_ is an **incubating** project of CNCF. This means that the project should be [*developing*][criteria-doc] professional-quality documentation alongside the project code. + +| Criterion | Rating (1-5) | +| --- | ----------------- | +| Communication methods documented | (rating value) | +| Beginner friendly issue backlog | (rating value) | +| “New contributor” getting started content | (rating value) | +| Project governance documentation | (rating value) | + + + + +## Comments + + + +The following sections contain brief assessments of each element of the Contributor Documentation rubric. + + + +### Communication methods documented + +One of the easiest ways to attract new contributors is making sure they know how to reach you. + +We evaluate on the following: + +* Is there a Slack/Discord/Discourse/etc. community and is it prominently linked from your website? +* Is there a direct link to your GitHub organization/repository? +* Are weekly/monthly project meetings documented? Is it clear how someone can join those meetings? 
+* Are mailing lists documented? + +### Beginner friendly issue backlog + +We evaluate on the following: + +* Are docs issues well-triaged? +* Is there a clearly marked way for new contributors to make code or documentation contributions (i.e. a “good first issue” label)? +* Are issues well-documented (i.e., more than just a title)? +* Are issues maintained for staleness? + +### New contributor getting started content + +Open source is complex and projects have many processes to manage that. Are processes easy to understand and written down so that new contributors can jump in easily? + +We evaluate on the following: + +* Do you have a community repository or section on your website? +* Is there a document specifically for new contributors/your first contribution? +* Do new users know where to get help? + +### Project governance documentation + +One of the CNCF’s core project values is open governance. + +We evaluate on the following: + +* Is project governance clearly documented? + +## Recommendations + + + +### Communication methods documented + +### Beginner friendly issue backlog + +### New contributor getting started content + +### Project governance documentation + + +# Website and infrastructure + + +_PROJECT_ is a **graduated** project of CNCF. This means that the project should have [*very high*][criteria-doc] standards for documentation. + +_PROJECT_ is an **incubating** project of CNCF. This means that the project should be [*developing*][criteria-doc] professional-quality documentation alongside the project code. + +| Criterion | Rating (1-5) | +| --- | ---------------- | +| Single-source for all files | (rating value) | +| Meets min website req. (for maturity level) | (rating value) | +| Usability, accessibility, and design | (rating value) | +| Branding and design | (rating value) | +| Case studies/social proof | (rating value) | +| SEO, Analytics, and site-local search | (rating value) | +| Maintenance planning | (rating value) | +| A11y plan & implementation | (rating value) | +| Mobile-first plan & impl. | (rating value) | +| HTTPS access & HTTP redirect | (rating value) | +| Google Analytics 4 for production only | (rating value) | +| Indexing allowed for production server only | (rating value) | +| Intra-site / local search | (rating value) | +| Account custodians are documented | (rating value) | + + + + +## Comments + + + +The following sections contain brief assessments of each element of the Website and documentation infrastructure rubric. + + + + +### Single-source requirement + +Source files for _all website pages_ should reside in a single repo. +Among other problems, keeping source files in two places: +- confuses contributors +- requires you to keep two sources in sync +- increases the likelihood of errors +- makes it more complicated to generate the documentation from source files + +Ideally, all website files should be in the **website repo** itself. +Alternatively, files should be brought into the website repo via [git +submodules][git-submodules]. + +If a project chooses to keep source files in multiple repos, they need a clearly +documented strategy for managing mirrored files and new contributions. + +### Minimal website requirements + +Listed here are the minimal website requirements for projects based on their [maturity level][maturity-level], either incubating or graduated. (These are the only two levels for which a tech docs analysis can be requested.) 
+| Criterion | Incubating Requirement | Graduated Requirement |
+| --- | --- | --- |
+| [Website guidelines][website-guidelines] | All guidelines satisfied | All guidelines satisfied |
+| Docs analysis (this document) | Requested through CNCF [service desk][cncf-servicedesk] | All follow-up actions addressed |
+| **Project doc**: stakeholders | Roles identified and doc needs documented | All stakeholder needs identified |
+| **Project doc**: hosting | Hosted directly | Hosted directly |
+| **Project doc**: user docs | Comprehensive, addressing most stakeholder needs | Fully addresses needs of key stakeholders |
+
+[git-submodules]: https://git-scm.com/book/en/v2/Git-Tools-Submodules
+[website-guidelines]: ../../resources/website-guidelines-checklist.md
+[maturity-level]: https://github.com/cncf/toc/tree/main/process#ii-stages---definitions--expectations
+[cncf-servicedesk]: https://servicedesk.cncf.io
+
+### Usability, accessibility and devices
+
+Most CNCF websites are accessed from mobile and other non-desktop devices at least 10-20% of the time. Planning for this early in your website's design will be much less effort than retrofitting a desktop-first design.
+
+* Is the website usable from mobile?
+* Are doc pages readable?
+* Are all / most website features accessible from mobile -- such as the top-nav,
+  site search and in-page table of contents?
+* Might a [mobile-first] design make sense for your project?
+
+[mobile-first]: https://developer.mozilla.org/en-US/docs/Web/Progressive_web_apps/Responsive/Mobile_first
+
+Plan for suitable [accessibility][] measures for your website. For example:
+
+* Are color contrasts significant enough for color-impaired readers?
+* Are most website features usable using a keyboard only?
+* Does text-to-speech offer listeners a good experience?
+
+It is up to each project to set their own guidelines.
+
+[accessibility]: https://developer.mozilla.org/en-US/docs/Web/Accessibility
+
+### Branding and design
+
+CNCF seeks to support enterprise-ready open source software. A key aspect of
+this is branding and marketing.
+
+We evaluate on the following:
+
+* Is there an easily recognizable brand (logo + color scheme) that clearly identifies the project?
+* Is the brand used across the website consistently?
+* Is the website’s typography clean and well-suited for reading?
+
+### Case studies/social proof
+
+One of the best ways to advertise an open source project is to show other organizations using it.
+
+We evaluate on the following:
+
+* Are there case studies available for the project and are they documented on the website?
+* Are there user testimonials available?
+* Is there an active project blog?
+* Are there community talks for the project and are they present on the website?
+* Is there a logo wall of users/participating organizations?
+
+
+### SEO, Analytics and site-local search
+
+SEO helps users find your project and its documentation, and analytics helps
+you monitor site traffic and diagnose issues like page 404s. Intra-site search,
+while optional, can offer your readers site-focused search results.
+
+We evaluate on the following:
+
+* Analytics:
+  - Is analytics enabled for the production server?
+  - Is analytics disabled for all other deploys?
+  - If your project uses Google Analytics, have you migrated to GA4?
+  - Can Page-not-found (404) reports easily be generated from your site
+    analytics? Provide a sample of the site's current top-10 404s.
+* Is site indexing supported for the production server, while disabled for + website previews and builds for non-default branches? +* Is local intra-site search available from the website? +* Are the current custodian(s) of the following accounts clearly documented: + analytics, Google Search Console, site-search (such as Google CSE or Algolia) + + +### Maintenance planning + +Website maintenance is an important part of project success, especially when project maintainers aren’t web developers. + +We evaluate on the following: + +* Is your website tooling well supported by the community (i.e., Hugo with the + Docsy theme) or commonly used by CNCF projects (our recommended tech stack?) +* Are you actively cultivating website maintainers from within the community? +* Are site build times reasonable? +* Do site maintainers have adequate permissions? + +### Other + +* Is your website accessible via HTTPS? +* Does HTTP access, if any, redirect to HTTPS? + +## Recommendations + + + +### Single-source requirement + +### Minimal website requirements + +### Usability, accessibility and devices + +### Branding and design + +### Case studies/social proof + +### SEO, Analytics and site-local search + +### Maintenance planning + +### Other + + + + +[project-website]: _PROJECT-WEBSITE_ +[project-doc-website]: _PROJECT-DOC-URL_ +[criteria-doc]: ./criteria.md +[implementation-template]: ./implementation-template.md +[issues-template]: ./issue-template.md +[umbrella-template]: ./umbrella-issue-template.md +[implementation-doc]: ./_PROJECT_-implementation.md +[issues-doc]: ./_PROJECT_-issues.md +[project-heading]: #project-documentation +[contributor-heading]: #contributor-documentation +[website-heading]: #website +[rfc-spec]: https://www.rfc-editor.org/rfc/rfc2119 +[website-guidelines]: ../../resources/website-guidelines-checklist.md + diff --git a/analysis/analysis-tools/criteria.md b/analysis/analysis-tools/criteria.md index f74d25e..138ad68 100644 --- a/analysis/analysis-tools/criteria.md +++ b/analysis/analysis-tools/criteria.md @@ -137,10 +137,12 @@ Examples: ### Single-source requirement -Source files for _all website pages_ should reside in a _single_ repo. -Otherwise, having source files in two places will confuse contributors (who -won't know which file(s) to update) and you'll run the risk of losing updates -— [as has happened already][otel-changes-lost]. +Source files for _all website pages_ should reside in a single repo. +Among other problems, keeping source files in two places: +- confuses contributors +- requires you to keep two sources in sync +- increases the likelihood of errors +- makes it more complicated to generate the documentation from source files Ideally, all website files should be in the **website repo** itself. Alternatively, files should be brought into the website repo via [git @@ -149,9 +151,6 @@ submodules][]. If a project chooses to keep source files in multiple repos, they need a clearly documented strategy for managing mirrored files and new contributions. -[otel-changes-lost]: https://github.com/open-telemetry/opentelemetry.io/issues/673 -[git submodules]: https://git-scm.com/book/en/v2/Git-Tools-Submodules - ### Minimal website requirements Listed here are the _minimal_ website requirements for projects based on their @@ -220,6 +219,7 @@ Plan for suitable [accessibility][] measures for your website. For example: It is up to each project to set their own guidelines. 
[accessibility]: https://developer.mozilla.org/en-US/docs/Web/Accessibility
+
### Branding
CNCF seeks to support enterprise-ready open source software. A key aspect of
diff --git a/analysis/analysis-tools/howto.md b/analysis/analysis-tools/howto.md
index fdaff04..f4c56d9 100644
--- a/analysis/analysis-tools/howto.md
+++ b/analysis/analysis-tools/howto.md
@@ -1,51 +1,111 @@
-# CNCF Project documentation assessments
+# CNCF TechDocs Analysis How-To
+
+## Audience
+
+This document is for members of the CNCF TechDocs team, including contractors or consultants, who need to conduct or assist with an analysis of a CNCF open-source project's technical documentation.
-**Who this document is for:** Members of the CNCF Techdocs team. This document provides guidance and a template on executing documentation assessments for CNCF projects.
## Purpose
-The aim of a documentation assessment is to:
+The goals of a CNCF technical documentation analysis are to:
+
+- Examine the current project technical documentation and website against the CNCF's analysis framework, as described in the doc analysis [criteria](./criteria.md).
+- Compare the documentation against the current or proposed maturity level for the overall project.
+- Recommend a program of key improvements with the largest return on investment. These improvements are documented as *recommendations* in the analysis document and expanded in a companion [implementation plan](./implementation-template.md) and [issues backlog](./umbrella-issue-template.md).
+
+
+## Doing a Tech Docs Analysis
+
+The tech docs analysis consists of some repository bookkeeping (Prerequisites), followed by three overall tasks:
+
+1. Write the analysis document: Evaluate the existing project documentation with respect to the project maturity level (or proposed maturity level, if the analysis is associated with an upgrade request). Identify gaps with respect to the CNCF criteria. Write general recommendations to close the largest and most important gaps.
+2. Write the implementation plan: Decompose the recommendations into specific improvement suggestions. These can be additions or revisions to the docs; reorganization; website infrastructure changes; or any other work that will close the gaps. Make suggestions specific (if you propose reorganizing a section, for example, provide an outline) but provide enough information that a contributor could solve the problem differently if they have a different idea (making it clear that your proposed outline is only one possible reorganization).
+3. Write the issue backlog.
+
+Finally, there are follow-up steps including creating GitHub issues and a pull request, and getting approval from project maintainers.
+
+### Prerequisites
+
+This process assumes you have some familiarity with GitHub repositories and pull requests (PRs).
+
+1. Clone the [CNCF tech docs repository](https://github.com/cncf/techdocs).
+1. Create a branch for the analysis.
+1. In the new branch, create a directory for the analysis in the CNCF tech docs /analysis directory. Name the directory `00NN-_PROJECT_`, where *NN* is the next index available in the directory (check for PRs as well, if someone else is working on tech doc analyses), and where _PROJECT_ is a short but not abbreviated project name. For example, for Kubernetes _PROJECT_ would be *kubernetes*, not *k8s*.
+1.
Copy and rename the analysis doc templates from the `/analysis/analysis-tools` directory as follows: `analysis-template.md` > `_PROJECT_-analysis.md`; `implementation-template.md` > `_PROJECT_-implementation.md`; and `umbrella-issue-template.md` > `_PROJECT_-issues.md`. + + +### Writing the Analysis document + +Edit `_PROJECT_-analysis.md` and follow these steps to complete the first step, the analysis: + +1. Define the scope of the analysis. Edit "Scope of analysis" to reflect URLs and repositories included and excluded from the analysis. +1. Review the in-scope URLs and repositories for compliance with the rubric criteria. Note any gaps, as well as any areas that exceed criteria or are exceptionally well executed. I find it easiest to do this separately for each of the three areas of concern (project doc, contributor doc, website), making a pass through the documentation once for each section (Information architecture, New user content, Content maintainability, etc.). Don't worry about a numerical score during this step; instead, note how the documentation complies, or not, with each criterion with respect to the project maturity level (or proposed maturity level, if the analysis is part of a petition for upgrade). Write comments to note the most important gaps and best-executed features of the documentation. +1. Assign ratings to each criterion based on your comments and compliance with the maturity level expectations in the rubric. The ratings are self-explanatory. Keep in mind that "needs improvement" or "meets standards" is with respect to the current (or proposed) maturity level. +1. Write recommendations. The template implies that you'll do this for every criterion; the "Recommendations" headings mirror the "Comments" headings. However, if some alternative framework makes more sense, use that. For example, it might be that two or three of the product documentation criteria are improved by reorganizing the documentation. In this case, rather than repeat the recommendation to reorganize in each section, write a single recommendation and explain how it improves all the areas. This is the first step in moving from the analysis to specific, actionable, time-bound backlog items. + +#### General tips + +Things to keep in mind while doing the analysis: +- Look for: + - Quick wins – low-effort changes that would have a major impact. + - Large, systemic issues that you can organize a number of issues around. + - The two or three most important issues that impede documentation effectiveness. + - Anything the project does exceptionally well. We can call these out as examples in other evaluations! +- Don't get bogged down in detail when writing comments. Include enough detail that you can describe how to implement a suggested solution. A sentence or two is sufficient for most issues. +- Keep in mind the overall goal of the technical documentation: to make its users more effective. Focus on issues that get in the way of that goal. +- It is not necessary to come up with a recommendation in every case, especially for elements that are satisfactory or if a recommendation would result in minimal improvement. +- Don't worry about grammar and style if they don't affect documentation effectiveness. If writing style impedes understanding, make a note; otherwise move on. An exception: Insist that tasks and procedures be written as clear, step-by-step instructions. 
+- Some of the criteria, especially around contributor documentation, project governance, and website infrastructure, are essentially check-boxes. These you can quickly investigate, note, and move on. Spend more time analyzing the effectiveness of the project documentation. + +#### Common issues + +Many common issues seem to come about because open-source software documentation is written by software developers while they are writing the software. This results in documentation that: +1. Is organized around features, not users and use cases. +2. Explains technical concepts well, including architecture and design decisions. +3. Contains complete reference information for APIs, CLIs, and configuration parameters. +4. Has missing or incomplete user-oriented "how-to" explanations and operational procedures. + +You may or may not find the following issues in your analysis, but it's worth keeping them in mind: +- Ambiguity around user roles. +- Missing or unclear task-based documentation. +- Assumptions about the reader's level of knowledge. +- Organization that buries or scatters important information, especially tasks and procedures. +- Missing or unclear new-user workflows. + +### Writing the implementation plan -- Measure against the CNCF’s standards for documentation -- Recommend areas to improve -- Provide examples of great documentation as reference -- Identify key areas which will net the largest improvement if addressed +Write the implementation plan. Edit the `_PROJECT_-implementation.md` file. -**What an assessment is** +The gist of the implementation plan is to break down the recommendations in the analysis document. This is an intermediate stage between general recommendations and the issues backlog. For small projects and where the recommendations are independent and time-bound, an implementation plan might not be necessary and you can move straight to writing the backlog. -- An overview with specific recommendations -- As short as possible – err on the side of bulleted lists -- Geared at providing actionable feedback (litmus test: could you turn a piece of feedback into a backlog issue with little extra work?) +If you do write an implementation plan, start with recommendations in the analysis document. Rewrite the recommendations, making them more specific, with a suggested (but not mandatory) implementation. For example, if you recommend reorganizing the documentation, provide a suggested outline, along with an explanation of the reason for the reorg. -**What it isn’t** +### Writing an issues backlog -- Full lists of exhaustive solutions for each and every issue with a website -- Fluffy and unspecific -- Judgmental +Write an issues backlog. Edit `_PROJECT_-issues.md`. -## Assessment criteria and examples +The goal of writing an issues backlog is to offer project contributors the opportunity to make the recommended changes. -See [Assessment definitions and reference examples](criteria.md). +Rewrite each action in the implementation plan. If possible, break large actions into smaller issues. Each issue should be: +- Independent. As much as possible, no issue should have another issue as a prerequisite. A contributor should be able to choose an issue, resolve it, and close it without reference to any other issue. +- Time-bound. A contributor should be able to complete an issue in a reasonably short time, say a few hours or a couple of days at most. -**Doing an assessment** +Make the suggested solution even more specific. 
At this point, the issue should almost be a recipe for making the doc improvement, with the caveat that a contributor is not required to implement the solution as suggested in the issue.
-1. Locate a project’s main documentation repository and any community/contributor/org/governance repositories
-Review all documentation available & the website (if present).
+## Next Steps
-2. Note:
+### Including supporting documentation
- - Any quick wins
- - Larger, systemic issues at play
- - Main issues to fix
- - What the project does exceptionally well!
+If you have supporting material that might be helpful to a contributor working on the documentation issues, include it in the directory with the other documents. For example, you might inventory the existing tech doc pages in a spreadsheet; in this case, include a CSV file of the inventory.
-3. Draft the assessment using the template provided
+### Creating a pull request
-4. Send it to the CNCF techdocs team for a review
+If you have not created a pull request with the analysis documents, do so now. Tag project maintainers and CNCF documentation staff, and ask for comments.
-5. Send it to the project maintainers and schedule a zoom meeting to discuss with them in person.
+### Getting contributor feedback
-6. PR to the techdocs repository in this directory for archiving
+If you haven't met with the project's maintainers yet, do so before you create the issues in GitHub. Ideally you'd like to have a Zoom meeting with any interested parties to get feedback on the analysis and implementation plan.
+### Creating GitHub issues
+Enter the backlog issues from the issues document into the project documentation GitHub repository using the format in the `umbrella-issue-template.md` and `issue-template.md` files. Create one GitHub issue per backlog issue, and create an umbrella issue that contains a checklist item for each issue.
diff --git a/analysis/analysis-tools/implementation-template.md b/analysis/analysis-tools/implementation-template.md
new file mode 100644
index 0000000..d73f4d9
--- /dev/null
+++ b/analysis/analysis-tools/implementation-template.md
@@ -0,0 +1,35 @@
+---
+title: Implementing _PROJECT_ Doc Improvements
+tags: _PROJECT_
+---
+
+# Introduction
+
+This document provides actionable suggestions for improving the _PROJECT_ technical documentation.
+
+For an analysis and general discussion of recommendations on _PROJECT_ technical documentation, see [_PROJECT_-analysis.md][].
+
+## Recommendations, requirements, and best practices
+
+This analysis measures documentation against CNCF project maturity standards and suggests possible improvements. In most cases there is more than one way to do things. Few recommendations here are meant to be prescriptive. Rather, recommendations are based on documentation best practices as understood by the reviewers. The recommended implementations represent the reviewers' experience with how to apply those best practices. In other words, borrowing terminology from the lexicon of [RFCs][rfc-keywords], the changes described here should be understood as "recommended" or "should" at the strongest, and "optional" or "may" in many cases. Any "must" or "required" actions are clearly denoted as such, and pertain to legal requirements such as copyright and licensing issues.
+
+The top-level documentation recommendations for this project are:
+
+
+# High-level action 1
+
+## Issue 1
+
+## Issue 2
+
+# High-level action 2
+
+## Issue 1
+
+## Issue 2
+
+
+
+[rfc-keywords]: https://www.rfc-editor.org/rfc/rfc2119
diff --git a/analysis/analysis-tools/issue-template.md b/analysis/analysis-tools/issue-template.md
new file mode 100644
index 0000000..6b8a071
--- /dev/null
+++ b/analysis/analysis-tools/issue-template.md
@@ -0,0 +1,40 @@
+---
+title: _PROJECT_ Issue
+tags: _PROJECT_
+---
+
+
+
+# Overview
+
+
+
+Audience:
+
+Type:
+
+# Context
+
+
+
+This issue tracks recommended changes resulting from an analysis of the _PROJECT_ documentation commissioned by CNCF. The analysis and supporting documents are here: https://github.com/cncf/techdocs/tree/main/analysis/00NN-project/
+
+# Possible Implementation
+
+
+
+Related material in the current doc:
+- https://github.com/_PROJECT_/website/tree/main/content/en/docs/v3.5/tutorials
+
+
+
+Suggested changes:
+
+Use the following outline to write a procedure for each task:
+
+- Prerequisites (bullet list of prerequisite conditions, if any)
+- Procedure
+  1. Step 1 (keep steps discrete and atomic. Put command-line input and expected output in a code block.)
+  2. Step 2 ...
+- Result (optional; describe output or side effects if they're notable or unexpected.)
+
diff --git a/analysis/analysis-tools/template.md b/analysis/analysis-tools/template.md
deleted file mode 100644
index 03c42f0..0000000
--- a/analysis/analysis-tools/template.md
+++ /dev/null
@@ -1,112 +0,0 @@
-# Assessment template
-
-Prepared by: \ ([@add-link-to-your-github-id](https://github.com/cncf/techdocs))
-Date: 2021-mm-dd - -## Introduction - -This document assesses the quality and completeness of a project's documentation and website (if present). - -This document: - -- Measures existing documentation quality against the CNCF’s standards -- Recommends specific and general improvements -- Provides examples of great documentation as reference -- Identifies key improvements with the largest return on investment - - -## How this document works - -The assessment is divided into three sections: - -- **Project documentation:** for end users of the project; aimed at people who intend to use it -- **Contributor documentation:** for new and existing contributors to the project -- **Website:** branding, website structure, and maintainability - -Each section rates content based on different [criteria](criteria.md). - - -## Project documentation - -| Criteria | 1 | 2 | 3 | 4 | 5 | -| --- | --- | --- | --- | --- | --- | -| Information architecture | | | | | | -| New user content | | | | | | -| Content maintainability | | | | | | -| Content creation processes | | | | | | - -Scale: -- 1 = (Is not present or requires significant work) -- 3 = (Is present, but needs work) -- 5 = (Is executed extremely well or no improvement required) - -**Comments** - -_Provide comments for each rating above, 1-2 sentences max, bullet point list_ - -**Recommendations** - -_Provide a list of recommendations to improve in this area_ - - -## Contributor documentation - -| Criteria | 1 | 2 | 3 | 4 | 5 | -| --- | --- | --- | --- | --- | --- | -| Communication methods documented | | | | | | -| Beginner friendly issue backlog | | | | | | -| “New contributor” getting started content | | | | | | -| Project governance documentation | | | | | | - -Scale: -- 1 = (Is not present or requires significant work) -- 3 = (Is present, but needs work) -- 5 = (Is executed extremely well or no improvement required) - -**Comments** - -_Provide comments for each rating above, 1-2 sentences max, bullet point list_ - -**Recommendations** - -_Provide a list of recommendations to improve in this area_ - - -## Website - -| Criteria | 1 | 2 | 3 | 4 | 5 | -| --- | --- | --- | --- | --- | --- | -| Single-source for all files | | | | | | -| Meets min website req. (for maturity level) | | | | | | -| Branding and design | | | | | | -| Case studies/social proof | | | | | | -| Maintenance planning | | | | | | -| A11y plan & implementation | | | | | | -| Mobile-first plan & impl. 
| | | | | |
-| HTTPS access & HTTP redirect | | | | | |
-| Google Analytics 4 for production only | | | | | |
-| Indexing allowed for production server only | | | | | |
-| Intra-site / local search | | | | | |
-| Account custodians are documented | | | | | |
-
-Scale:
-- 1 = (Is not present or requires significant work)
-- 3 = (Is present, but needs work)
-- 5 = (Is executed extremely well or no improvement required)
-
-**Comments**
-
-_Provide comments for each rating above, 1-2 sentences max, bullet point list_
-_Include a list of the top 404s, as reported through analytics or a search console._
-
-**Recommendations**
-
-_Provide a list of recommendations to improve in this area_
-
-
-## Recommendations
-
-_From the recommendations above, lis the top 1-3 concerns for this particular project and expand on them in enough detail that you could either:_
- - _Pass the work off to a contractor or other member of the CNCF techdocs team_
- - _Write a GitHub issue for the project team and place it in the backlog and someone not involved in the docs assessment process could execute on it_
-
diff --git a/analysis/analysis-tools/umbrella-issue-template.md b/analysis/analysis-tools/umbrella-issue-template.md
new file mode 100644
index 0000000..27e2006
--- /dev/null
+++ b/analysis/analysis-tools/umbrella-issue-template.md
@@ -0,0 +1,31 @@
+---
+title: _PROJECT_ Umbrella Issue
+tags: _PROJECT_
+---
+
+# Overview
+
+
+
+
+
+This issue tracks recommended changes resulting from an analysis of the _PROJECT_ documentation commissioned by CNCF. The analysis and supporting documents are here: https://github.com/cncf/techdocs/tree/main/analysis/00NN-project/
+
+The CNCF _PROJECT_ documentation effort is tracked in the CNCF Tech Docs repo:
+https://github.com/cncf/techdocs/issues/NNN
+
+# Issues
+
+This is a list of issues representing the recommended work on the _PROJECT_ website and technical documentation.
+
+## Issue: Item 1
+
+
+
+- [ ] https://github.com/project/repo/issues/NNN
+
+## Issue: Item 2
+