
(Image: agile_process_v1)

Our Agile Process

We executed this challenge with an agile approach, delivering working software every four hours and building in the ability to respond to change. Initially the project was planned over a four-day period, with two sprints and one release each day. We added two sprints (Sprints 8 and 9) at the end to increase the usability of the prototype.

We had a part-time agile coach to help guide our process and encourage whole-team collaboration.

Sprint Zero

Before we began core development, we spent time in Sprint 0 setting up our development environment (including a continuous delivery infrastructure) and working through ideation and human-centered design techniques to understand the value we could create and, ultimately, decide what to build. We also ensured we had a cross-functional team and a productive co-located space to work in.

Releases 1 to 4 took place over a period of four days (two sprints per day), with a release every day. Release 5 took place over a period of two days (two 8-hour sprints), with only front-end developers and a usability tester working those sprints. The purpose of Release 5 was to increase the usability of the site based on feedback from usability testing.

Release 1 (Day 1)

  • Sprint 1 (4 hours)
  • Sprint 2 (4 hours)

Release 2 (Minimum Viable Product) - Day 2

  • Sprint 3 (4 hours)
  • Sprint 4 (4 hours)

Release 3 (Day 3)

  • Sprint 5 (4 hours)
  • Sprint 6 (4 hours)

Release 4 (Day 4)

  • Sprint 7 (4 hours)

Release 5 - front-end development only, to increase usability based on usability testing

  • Sprint 8 (8 hours)
  • Sprint 9 (8 hours)

As we executed, we followed a simple pattern:

  1. Release Planning - Plan each day as a release
  2. Sprint Planning - Plan and execute two 4-hour sprints per day, using whole-team estimation and buy-in
  3. Sprint Demonstration - Demo each Sprint's output, receiving feedback from the team and interested parties
  4. Retrospective - Conduct a retrospective after each release, to adapt our work for the next release

Work was captured as user stories (each expressing a need and its value) and broken down into technical tasks. We estimated each task and captured the planned user stories for each sprint. We also captured each sprint's velocity and used it as an input when planning the next one.
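
The arithmetic behind that feedback loop is simple enough to show. The sketch below is a hypothetical illustration only: it feeds the completed-point totals recorded later in this document (7, 14, and 17 for Sprints 1-3) into a plain rolling average. The team's actual forecasts came from whole-team estimation, not from this formula.

```typescript
// Hypothetical illustration of velocity-informed forecasting.
// The completed-point figures come from the sprint summaries below;
// the simple mean is an assumption, not the team's actual method.

const completedPoints: number[] = [7, 14, 17]; // Sprints 1-3

function suggestedCapacity(history: number[]): number {
  const total = history.reduce((sum, pts) => sum + pts, 0);
  return Math.round(total / history.length); // mean velocity so far
}

console.log(`Suggested capacity for the next sprint: ${suggestedCapacity(completedPoints)} points`);
```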

User stories are archived on our wiki here - https://github.com/booz-allen-agile-delivery/ads-final/wiki/User-Stories

Team Velocity and User Stories Completed

(Chart: Forecast vs. Delivered)

(Chart: Velocity)

Agile Practices Used

  1. Ideation/visioning session
  2. Pair programming and peer review
  3. Kanban board (physical)
  4. Sprint planning and review
  5. Whole-team planning
  6. Timeboxed iterations (4 hrs)
  7. Co-location/osmotic communication
  8. Release planning and review
  9. User stories
  10. Prioritized backlog
  11. API-driven development
  12. Wireframes/mockups
  13. Human-Centered Design (incl. user research, brainstorming, dot-voting, wireframes/mockups, usability testing)
  14. Regular user feedback
  15. Cross-functional team
  16. Retrospectives (after each release)
  17. Frequent demos
  18. Personas
  19. Product Owner
  20. Roadmaps
  21. Agile estimation (story points), whole-team estimation

Sprint 0 Activities

These activities took place prior to beginning primary delivery efforts:

  1. Ideation - activities to decide what to build and how we could deliver value
  2. Establishing continuous integration/delivery infrastructure
  3. Establishing the development environment
  4. Assembling the team
  5. Finding a co-located space
  6. Establishing an initial backlog

Team collaboration

(Photo) One of the back-end developers works with the DevOps team to diagnose a deployment issue. The agile Scrum wall can be seen in the background.

(Photo) In the background, the front-end designer consults with one of the subject matter experts, a 25-year veteran of the FDA, who stopped by to offer suggestions.

(Photo) The team votes on a product name by placing colored sticky notes on the product names of their choice.

(Photo) One of the front-end developers and the DevOps team watch as the back-end team demonstrates one of the new features.

Day 1

Schedule

(Photo) The daily schedule posted in the team room, depicting four-hour sprints followed by all-hands demonstrations and retrospectives.

  • 8:30 AM Breakfast
  • 9:00 AM Sprint 1 begins
  • 1:00 PM Demo and Retrospective
  • 1:00 PM Lunch
  • 2:00 PM Sprint 2 begins
  • 6:00 PM Dinner and Retrospective

(Photo) The team works on the plan for how to get started on the first day.

Sprint 1

Primary target for this first Sprint: User story 001: As a consumer, I want to search and select a drug so that I can see more information about that drug.

(Image: sprint_1_plan)

Planned for this sprint:

  • UI for search (2 pts)
  • Structure a service to access FDA API (2 pts) (sketched below)
  • Mockups/styles for Search & Select (2 pts)
  • Complete the Digital Service Playbook checklist (1 pt)
  • Define git flow (1 pt)

Total planned: 8 points
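
The service that accesses the FDA API is not shown in this document, and the project's back end was Rails, so the sketch below is only an assumption-laden illustration of the idea in TypeScript: it queries the public openFDA drug-label endpoint by brand name. The endpoint path, the `search`/`limit` parameters, and the response fields are drawn from openFDA's published API as best I recall them and should be checked against the current documentation.

```typescript
// Sketch only: look up drug labels on openFDA by brand name.
// Assumed: the api.fda.gov/drug/label.json endpoint, its search/limit
// parameters, and the results[].openfda.brand_name field.

interface LabelResult {
  openfda?: { brand_name?: string[] };
  description?: string[];
}

async function searchDrugLabels(brandName: string): Promise<LabelResult[]> {
  const search = encodeURIComponent(`openfda.brand_name:"${brandName}"`);
  const url = `https://api.fda.gov/drug/label.json?search=${search}&limit=5`;
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`openFDA request failed: ${response.status}`);
  }
  const body = await response.json();
  return body.results ?? [];
}

// Example: print the first brand name of each match.
searchDrugLabels("aspirin").then((results) =>
  results.forEach((r) => console.log(r.openfda?.brand_name?.[0] ?? "(unknown)"))
);
```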

Significant events

  • During this sprint, we started exploring a new way to crowdsource keeping drug labels up to date; the ongoing conversation and design sessions would drive some future stories

At the Sprint 1 Demo:

  • User Story 001 done
  • Drug search bar: you can type into it and it brings up a drug with a brief description
    • Running locally - not deployed yet
  • Completed stories:
    • UI for search (2 pts)
    • Structure a service to access FDA API (2 pts)
    • Mockups/styles for Search & Select (2 pts)
    • Complete the Digital Service Playbook checklist (1 pt)
  • Incomplete stories:
    • Define git flow (1 pt)

(Image: sprint_1_done)

Feedback:

  • Approach Sprint 2 with test-first methods/have more unit tests
  • Run Sprint 2 demo on a "Release" environment

Completed 7 points out of the 8 points forecast

Sprint 2

(Image: sprint_2_plan)

User story 002: As a consumer, I want to see a list of reported adverse effects for a selected drug so I can understand the risk of taking the drug.

Stories planned:

  • Search by brand name for adverse events (1) (sketched after the forecast)
  • Populate list of adverse effects (2)
  • Fix search UI bugs (3)
  • Unit tests for UI (1)
  • 3 environments (3)
  • Napkin usability (2)
  • Wire for "napkin" (dependency) (2)

Forecast: 14 points for this sprint
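
The first two tasks map onto openFDA's adverse-event data. The sketch below is an illustration under the same caveats as before: the drug-event endpoint, its query syntax, and the `patient.reaction[].reactionmeddrapt` field are assumptions based on openFDA's public documentation, and the project's real implementation lived in its Rails back end.

```typescript
// Sketch only: list reported reactions for a brand name via openFDA's
// drug-event endpoint, de-duplicating the reaction terms.

async function reportedAdverseEffects(brandName: string): Promise<string[]> {
  const search = encodeURIComponent(
    `patient.drug.openfda.brand_name:"${brandName}"`
  );
  const url = `https://api.fda.gov/drug/event.json?search=${search}&limit=25`;
  const body = await (await fetch(url)).json();

  const effects = new Set<string>();
  for (const event of body.results ?? []) {
    for (const reaction of event.patient?.reaction ?? []) {
      if (reaction.reactionmeddrapt) effects.add(reaction.reactionmeddrapt);
    }
  }
  return [...effects];
}

reportedAdverseEffects("ibuprofen").then((list) => console.log(list));
```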

At the Sprint 2 Demo:

  • Search by brand name for adverse events (1)
  • Populate list of adverse effects (2)
  • Fix search UI bugs (3)
  • Unit tests for UI (1)
  • 3 environments (3)
  • Napkin usability (2)
  • Wire for "napkin" (dependency) (2)

Completed 14 points

(Image: sprint_2_done)

Day 1 Retrospective

(Image: day_1_retro)

Positive:

  • Lunch
  • Coffee
  • Build pipeline
  • Accommodate scope change
  • SME involvement
  • Forecasts worked
  • People spoke up
  • Demos
  • 4 hour iterations

Negative:

  • Missing a key resource (out sick)
  • The conversation about pivoting may have been disruptive
  • Documentation artifacts were perhaps falling behind

Change:

  • Start promptly at 9 AM
  • Review documents Wednesday afternoon
  • Get desks, monitors (Nico)
  • Get cleanup

Day 2

Day 2 Overall Plan

We planned our overall release, along with some additional functionality that would likely come along on Day 3. Prioritized, this became our guiding plan:

(Image: day_2_plan)

Sprint 3

Focus for Sprint 3: making our Day 1 work smoother, better looking, and easier to use, and beginning the crowdsourcing user story, primarily on the back end.

(Image: sprint_3_plan)

Graphical refinement for user stories 001 & 002 (landing page, search, list of reported effects):

  • Enable search on enter/select (1)
  • Filter autocomplete to reduce duplicates (2) (sketched below)
  • Review site text and title (2)
  • "How to be agile" diagram (2)
  • Revise reported adverse effect API to return top 50 (1)
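
For the duplicate-filtering task, the sketch below shows one plausible approach: collapsing suggestions that differ only in case or surrounding whitespace. The normalization rule is an assumption; the team's actual filter may have used different criteria (for example, grouping by generic name).

```typescript
// Sketch: collapse autocomplete suggestions that differ only in case
// or surrounding whitespace. The normalization rule is an assumption.

function dedupeSuggestions(suggestions: string[]): string[] {
  const seen = new Set<string>();
  const result: string[] = [];
  for (const raw of suggestions) {
    const key = raw.trim().toLowerCase();
    if (key.length === 0 || seen.has(key)) continue;
    seen.add(key);
    result.push(raw.trim());
  }
  return result;
}

// "Advil", "ADVIL " and "advil" collapse into a single suggestion.
console.log(dedupeSuggestions(["Advil", "ADVIL ", "advil", "Aleve"]));
```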

User story 004: As a crowdsourcing user, I want to record adverse effects mentioned on the label, based on a scan/preselection of the label text, using simple clicks, so that I can contribute to the value of the database in this tool.

Tasks planned:

  • Graph mockup (1)
  • Crowdsourcing wires (2)
  • Database/schema for storing label adverse-effect data (1)
  • Describe the FE-BE API for accessing suggested adverse effects from the label (2)
  • API for submitting an adverse effect from the label (2)
  • Implement search of label text, return suggested effects (2) (sketched after the forecast)
  • Rails models + 2 APIs for label adverse effects (1)

Forecasted 19 points
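
To make the label-scanning task concrete, the sketch below shows the general shape of "search the label text and return suggested adverse effects": match the raw label text against a list of known reaction terms. Both the term list and the substring matching are placeholders; the real implementation was built in the Rails back end and its matching logic is not recorded here.

```typescript
// Sketch: suggest adverse effects by scanning label text for known
// reaction terms. The term list and substring match are placeholders.

const KNOWN_REACTIONS = ["nausea", "dizziness", "headache", "rash", "vomiting"];

function suggestAdverseEffects(labelText: string): string[] {
  const haystack = labelText.toLowerCase();
  return KNOWN_REACTIONS.filter((term) => haystack.includes(term));
}

const label =
  "Stop use and ask a doctor if you experience nausea, dizziness, or a rash.";
console.log(suggestAdverseEffects(label)); // ["nausea", "dizziness", "rash"]
```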

(Image: sprint_3_done)

At the Sprint 3 demo:

  • Completed 17 points:
    • Graph mockup (1)
    • Crowdsourcing wires (2)
    • Database/schema for storing label adverse-effect data (1)
    • Describe the FE-BE API for accessing suggested adverse effects from the label (2)
    • API for submitting an adverse effect from the label (2)
    • Rails models + 2 APIs for label adverse effects (1)
  • Incomplete stories:
    • Implement search of label text, return suggested effects (2)

Sprint 4

Plan

(Image: sprint_4_plan)

User story 004: As a crowdsourcing user, I want to record adverse effects mentioned on the label, based on a scan/preselection of the label text, using simple clicks, so that I can contribute to the value of the database in this tool.

Tasks planned:

  • Implement search of label text, return suggested (2)
  • UI to show label text & suggested adverse effects (needs wire) (2)
  • Submit effects from user to backend (2) (sketched after the forecast)
  • UI for submitted state (1)
  • UI in place for text submission (1)
  • Chart add seriousness (2)
  • Persona for crowdsourcing foundation/address (1)
  • Add generic bottle picture (1)
  • Reviewing site text, title (3)
  • Mockup (HiFi) crowdsourcing (2)

Documentation planned:

  • Continuous integration details (1)
  • Digital service play evidence (1)
  • Continuous deployment details (1)
  • Digital Services checklist (1)
  • Continuous delivery details (1)
  • End-to-end continuous delivery w/ tools + products - raft update (1)

Forecasted 24 points
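
For the submission task flagged above, the sketch below illustrates the front-end side of sending a crowdsourced effect to the back end. The `/api/label_adverse_effects` path and the payload shape are hypothetical placeholders; the project's actual Rails endpoint and contract are not documented here.

```typescript
// Sketch: post a crowdsourced adverse-effect confirmation to the back end.
// The endpoint path and payload fields are hypothetical placeholders.

interface EffectSubmission {
  drugBrandName: string;
  effect: string;
  confirmed: boolean; // whether the user confirms the effect appears on the label
}

async function submitEffect(submission: EffectSubmission): Promise<void> {
  const response = await fetch("/api/label_adverse_effects", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(submission),
  });
  if (!response.ok) {
    throw new Error(`Submission failed: ${response.status}`);
  }
}

submitEffect({ drugBrandName: "ibuprofen", effect: "nausea", confirmed: true });
```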

At the Sprint 4 demo:

  • Completed stories:
    • Graph mockup (1)
    • Crowdsourcing wires (2)
    • Database/schema for storing label adverse-effect data (1)
    • Describe the FE-BE API for accessing suggested adverse effects from the label (2)
    • API for submitting an adverse effect from the label (2)
    • Implement search of label text, return suggested effects (2)
    • Rails models + 2 APIs for label adverse effects (1)
  • Incomplete stories:
    • Implementing the search to return suggested adverse effects from the label text

Day 2 Retrospective

(Image: day_2_retro)

Start:

  • Set up Rails environments

Stop:

  • Merges right before demos

Do More:

  • Front-end unit tests
  • Release the Master
  • Front end documentation
  • Peer Review

Do Less:

  • (nothing)

Day 3

Day 3 Roadmap

(Image: day_3_roadmap)

  1. Site text, vision, and message (content); style guide
  2. Finish submitting suggested label adverse effects
  3. Submit a new (text entry) adverse effect
  4. Ensure responsive design
  5. Deploy continuous monitoring
  6. Display crowdsource statistics -> filling pill bottle indicator?

  7. Report and fix bugs

Sprint 5

(Image: sprint_5_plan_with_us)

Primary target for the fifth sprint: continuing to work on user stories 004, 005, and 006.

Planned for this sprint:

  • Intro purpose statement (1)
  • Site text (3)
  • Submit effects from user to BE (rec & test) (2)
  • UI + text for submitted state (2)
  • Determine responsive design concerns (1)
  • Record/publish CI/test metrics (2)
  • Integrate Prometheus (3) (sketched after the forecast)
  • Create alerts (1)
Forecasted 15 points
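
The metrics and Prometheus tasks above are not detailed in this document, so the sketch below only illustrates the general pattern: expose a /metrics endpoint in Prometheus' plain-text exposition format so a Prometheus server can scrape it. The metric name, the port, and the Node/TypeScript stack are illustrative assumptions; the team's actual monitoring wiring may have looked quite different.

```typescript
// Sketch: a minimal /metrics endpoint in Prometheus' text exposition format.
// Metric name, port, and stack are illustrative assumptions.

import { createServer } from "http";

let searchRequests = 0; // e.g. incremented wherever a drug search is served

createServer((req, res) => {
  if (req.url === "/metrics") {
    const body = [
      "# HELP app_search_requests_total Drug searches served.",
      "# TYPE app_search_requests_total counter",
      `app_search_requests_total ${searchRequests}`,
      "",
    ].join("\n");
    res.writeHead(200, { "Content-Type": "text/plain; version=0.0.4" });
    res.end(body);
  } else {
    searchRequests += 1; // stand-in for real application traffic
    res.writeHead(200);
    res.end("ok");
  }
}).listen(9394);
```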

Significant events

  • During this sprint, we decided to shift from the earlier "checkbox" UI approach to a simpler "yes/no" question approach for a better user experience

(Image: sprint_5_mid_sprint_adjust)

So we rewrote User Story 006 as User Story 007 and realized we didn't need User Story 005.

Sprint 5 Demo

  • We did not finish creating the continuous monitoring alerts in Sprint 5 (1 pt)
  • We did complete some unplanned usability testing on our crowdsourcing feature (2 pts)
  • Based on usability feedback, we were able to jump ahead to the (unplanned for this sprint) upcoming task of implementing the simpler "yes/no" UI for our crowdsourcing feature (2 pts)
  • Based on usability feedback, we completed a change to our site title, from MineMed to CrowdMed (1 pt)

(Image: sprint_5_done)

Sprint 6

Sprint 6 Plan

(Image: sprint_6_plan)

Primary target for the sixth sprint: continuing to work on user stories 004 and 007.

Planned for this sprint:

  • Decide on "help" text (1)
  • Show 5 sentences, label text each time (not whole label) (3)
  • Bold/emphasize a found word from the label text (2)
  • Implement usability recs from testing (2)
  • Obscure upcoming questions (2)
  • Deploy site text onto site (2)
  • Define "progress" or "verified" for crowdsourced data (2)
  • Chart mockup sync (2)
  • Research alerts (2)
  • Make load testing available (1)
  • Publish/viz more metrics (2)
  • Wiki doc for Human Centered Design (3)

Forecasted 24 points
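
For the emphasis task in the plan above, the sketch below shows one way to bold a matched reaction term inside a label sentence: wrap each case-insensitive occurrence in <strong> tags before rendering. The escaping and the HTML-based approach are assumptions about how the front end might have done it.

```typescript
// Sketch: emphasize a found term inside label text by wrapping each
// case-insensitive match in <strong> tags. Approach is an assumption.

function escapeRegExp(term: string): string {
  return term.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}

function emphasizeTerm(sentence: string, term: string): string {
  const pattern = new RegExp(`(${escapeRegExp(term)})`, "gi");
  return sentence.replace(pattern, "<strong>$1</strong>");
}

console.log(emphasizeTerm("Nausea and mild nausea were reported.", "nausea"));
// -> "<strong>Nausea</strong> and mild <strong>nausea</strong> were reported."
```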

Sprint 6 Review

15 points done.

(Image: sprint_6_done_2)

Incomplete Sprint 6 items:

(Image: in-progress-after-sprint-6)

Sprint 7 Plan

(Image: sprint 7 plan)

Primary target for the seventh sprint: continuing to work on user stories 004 and 007.

Planned for this sprint:

  • Update "help" text (1)
  • Write value proposition on why to contribute (1)
  • Write instructions on how to report averse effects (1)
  • Fix progress bar on crowdsourcing chart (1)
  • Fix Weird Dates on frequency chart (1)
  • Define "progress" or "verified" for crowdsourced data (2)
  • Overall Page Style (10)
  • Publish/viz more metrics (2)
  • Create Monitoring Alerts (1)
  • Wiki doc for Human Centered Design (3)

Forecasted 23 points

Sprint 7 Review

(Image: sprint 7 done)

23 points done.

Incomplete Sprint 7 items: None

Sprint 8

Sprint 8 Plan

Primary target for this sprint: cleanup and usability.

Forecasted 6 points

  • Landing page updates (1)
  • Update leaderboard (2)
  • Graph: color for icons and legend (1)
  • Fix tree graph (2)

(Image: sprint 8 plan)

Sprint 8 Review

(Image: sprint 8 done)

All tickets successfully completed - 6 story points

Sprint 9

Sprint 9 Plan

(Image: sprint 9 plan)

Sprint 9 Review

(Image: sprint 9 done)