I'm skimming a recent report on federal government websites. Page 12 shows the primary audiences, which range from Federal employees and consumers to businesses and researchers, but no taxpayers.
(I can't copy it, or I would.) Some excerpts, with my comments in brackets:
86% of the live domains and 71% of the domains under development had been updated in the past six months, as of October 2011, when agencies conducted the inventories. [Updating within a 6-month period is a very low threshold.]
Takeaways: [as labeled by the report]
Inconsistency across agencies: The amount of data varied greatly across agencies. Some agencies were able to provide more complete data, while other agencies struggled to develop a clear picture of their web footprint because of decentralized operating units.
Incomplete data: Several agencies did not know the answers to all of the questions, and many noted that this inventory was the first of its kind in their agency.
Decentralization: Nearly all of the agencies alluded to the fact that much of the decision-making with regard to specific domains/websites happens within operating units and not at an agency level.
Varying levels of maturity: Some agencies have clearly set web policies, while many agencies are still working to develop more formal web guidance and governance policies.
Need for more Federal guidance: Many agencies asked for additional guidance and assistance in developing integrated web governance plans and migration processes for their domains.
Dedication to improvement: Nearly all of the agencies made comments to illustrate their dedication to improving web governance and communications at their agency.
Benefits may come at a cost: A few agencies noted that the benefits of integration are extremely important but that integration may come at a cost.
Measurement takeaways:
Lack of consistent performance metrics: Nineteen of the major agencies (79%) reported that they did not use the same performance metrics to consistently evaluate agency websites across the agency; each site used its own combination of methods.
Metrics not standardized: Several agencies commented that even though the same tools are used, the metrics from those tools are not consistently gathered, implemented and applied.
Web analytics is the most commonly used method: Most agencies (10 out of 24) referred to using web analytics tools to measure performance.
[I wish they had collected and published the metrics, or at least noted whether any websites published their metrics.]
Here's the link to the "dialog" website they used to gather public comments.