Item Performance Report

Measure performance based on time logged vs time planned by staff, item or client.

Written by Miel De Rycke
Updated over a year ago

This report brings you one of the key metrics every agency should be looking at: are we using more or less time to do our work than we plan?

The Job Profit & Recovery report measures financial performance at the project level, but it is difficult to pinpoint who in the team is responsible for the actual profit. And while there is a very strong correlation between time recorded and billing, there is no direct connection between the two.

The key metrics this report uses are time logged and time planned against each item.


This report is a companion to the Items Report, using the same code and data. The button at the top of the page lets you toggle between the different modes.

Filters

When you open the report, you can see the collapsible filter side panel. Click the vertical gray bar to show or hide it.

You always start using this report by applying filters to the data from Streamtime. Generally speaking, it is advised to run the performance report on completed items only (if you're only halfway through a specific item, there is no real point measuring performance on it).

Additionally, you'll want to choose a specific timeframe. On the one hand, the average performance over the last 2 weeks may not be very significant, because the timeframe is too short for meaningful results. On the other hand, a 4-year timeframe doesn't take into account a team member's recent performance improvements.

So a good way to start is to find all items with a completion date in the last 12 months.

As soon as you modify the filters, the report automatically starts to fetch and process the data. As you change the filters, existing API calls are aborted and replaced with new searches.

Based on the selected filters, wayahead fetches jobs, job items and job item users (item assignments) from Streamtime.
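
Under the hood, this kind of cancel-and-refetch behaviour is typically built on request cancellation. The sketch below shows one way it can be written; the /api/items endpoint and filter shape are hypothetical, and this is an illustration of the pattern, not wayahead's actual code:

```typescript
// Minimal sketch: abort the in-flight search whenever the filters change.
// The /api/items endpoint and the filter shape are hypothetical.
let controller: AbortController | null = null;

async function search(filters: Record<string, string>): Promise<unknown> {
  controller?.abort(); // cancel the previous request, if still running
  controller = new AbortController();
  const query = new URLSearchParams(filters).toString();
  const response = await fetch(`/api/items?${query}`, {
    signal: controller.signal,
  });
  return response.json();
}
```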

Main View and Tabs

The main report view offers 3 tabs:

  • A summary table by team member

  • A summary table by client

  • A summary table by item name

Staff Summary

This is probably the most important table in the report, and the key metric for each user is the Average Time Used. In the example above, Jeffrey Lebowski is shown to have used an average of 91% of the planned time. That means that on average, Jeff completed his tasks 9% faster than we expected him to. Good for him. Jesus, on the other hand, is using 9% more time than allowed on average.

The table shows the following columns:

  • Staff Name: this column shows the staff name AND the number of items included in the summary. So Jeffrey Lebowski (103) means there are 103 items captured for Jeffrey in this summary. The higher this number, the more meaningful the results. The average performance across 2 tasks doesn't really tell us much.

  • Team name: As defined in the Reporting Preferences, either based on Streamtime staff labels or team names configured in the wayahead Team Preferences.

  • Used Time: Total time logged across all items.

  • Planned Time: Total time planned for this user across all items. Note that this is NOT the same as the total planned time on the items. See Solo vs Shared below.

  • Accuracy: Shows how reliable the total score is, based on the number of items included in the score vs the total number of items found. See Accuracy below for more info.

  • Solo Items: Score for items this person was assigned to alone (also shows the number of items included in the summary).

  • Shared Items: Score for items that were shared between this person and other team members (also shows the number of items included in the summary).

  • Average Time Used: This is the final average score for this person across all tasks. 100% is spot on. Under 100% means this person is faster than planned. Over 100% means this person uses up more time than allowed.

  • Graph: The graph represents the average time used. It is green under 80% and turns yellow between 80% and 100%. Blue means exactly 100%. Over 100%, the graph folds back on itself, so you can clearly see the difference between someone who is 4% over and someone who is 40% over: in the second case, there will be more red on the graph. (A minimal sketch of these calculations follows this list.)
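
For clarity, here is a minimal sketch of how the per-item score, the Average Time Used and the graph colour could be computed. The field names are hypothetical, and the thresholds follow the description above; this illustrates the logic, it is not wayahead's actual code:

```typescript
// Hypothetical item shape: time logged and time planned for one person.
interface ScoredItem {
  loggedMinutes: number;
  plannedMinutes: number;
}

// Per-item score: time used as a percentage of time planned.
const timeUsedPct = (item: ScoredItem): number =>
  (item.loggedMinutes / item.plannedMinutes) * 100;

// Average Time Used: the mean of the per-item scores across qualifying items.
function averageTimeUsed(items: ScoredItem[]): number {
  const scores = items.map(timeUsedPct);
  return scores.reduce((sum, s) => sum + s, 0) / scores.length;
}

// Graph colour, following the thresholds described above.
function graphColour(pct: number): "green" | "yellow" | "blue" | "red" {
  if (pct < 80) return "green";   // comfortably under the planned time
  if (pct < 100) return "yellow"; // close to the planned time
  if (pct === 100) return "blue"; // spot on
  return "red";                   // over budget: the bar folds back on itself
}
```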

Client Summary

This summarises the same items by client name. Just like the Staff Summary, it shows how many items are included in the total score, along with the total logged time vs the total planned time and the total score.

The account manager column represents the Account Manager defined on the company in the Streamtime company list.

Item Summary

The item summary is similar to the client summary. For some agencies this summary will be a lot more useful than for others – it depends on how many different item names you have across all of your projects.

While Streamtime has a master list of items, linked to the accounting codes of Xero, it unfortunately doesn't have an underlying structure that remembers the original item or account code for each item in your jobs. So in this summary, everything is summarised by item name instead. Given that you can name an item whatever you want, it can be difficult to make sense of this info.

To get better data, it is advised to leave the item name as it appears in the master list, and use the item description to distinguish between items. Alternatively, you might put specific codes in front of the item names, e.g. "[DEV] App Development", so you can search the report for all [DEV] items and find a grouped total.

Important Metrics

This is not a straightforward report to render. Streamtime is a very flexible system that lets you set up your jobs in many different ways.

Accuracy

It is not possible to work out the performance for every item in the Streamtime database: the condition is that time was both planned AND used.

Say Jeffrey was planned for 40hrs to work on the Design for a project, but he never recorded any time on it. Then it's impossible to assign him a score for that item.

Similarly, if he spent 40hrs on Design, but no one planned any time for that item (0hrs planned), there's no way to tell how well he did.

If Jeffrey was involved with 100 items, but only 55 of them had both time planned AND used, then the accuracy will show as 55%, meaning 45% of the work he was involved with was unplanned or not executed.
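
In code terms, a sketch of that calculation could look like this (the field names are hypothetical):

```typescript
// Accuracy: the share of a person's items that qualify for a score,
// i.e. items where time was both planned AND logged.
interface Item {
  plannedMinutes: number;
  loggedMinutes: number;
}

function accuracyPct(items: Item[]): number {
  const qualifying = items.filter(
    (item) => item.plannedMinutes > 0 && item.loggedMinutes > 0
  ).length;
  return (qualifying / items.length) * 100; // e.g. 55 of 100 items -> 55%
}
```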

In the example below, Philip and Indie were scheduled to work on Copywriting, but Walter got involved instead. It is impossible to assign a performance score on this item for any of the users. It's the project manager's job to reassign time between users.

Solo vs Shared

Streamtime lets you plan items in different ways. Certain tasks will be assigned to 1 person only, while other tasks are shared with multiple people. In the second case it is possible to assign time to each individual or to assign a global planned value for the team to share.

Art Direction is estimated at 27hrs total, but the time is specifically assigned as 2hrs for Jeffrey, 11hrs for Walter and 14hrs for wayahead. These are reported as solo in wayahead.

In the example below, Artwork is estimated at 24hrs and shared between Donny and Walter. This is reported as shared in wayahead.

If only 1 person was assigned to Artwork, the item would be considered 'solo' regardless of how the time was planned (on the item or for the person).
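
Put as a rule, the classification could be sketched like this, assuming a hypothetical assignment shape with per-person planned time; the real Streamtime data model may differ:

```typescript
// Hypothetical shape for an item assignment (a Streamtime job item user).
interface Assignment {
  userId: number;
  plannedMinutes: number; // 0 when the plan lives on the item, not the person
}

function classify(assignments: Assignment[]): "solo" | "shared" {
  // A single assignee is always solo, however the time was planned.
  if (assignments.length <= 1) return "solo";
  // Multiple assignees: solo when each person has their own planned time,
  // shared when they draw from one global planned value on the item.
  const perPersonPlan = assignments.every((a) => a.plannedMinutes > 0);
  return perPersonPlan ? "solo" : "shared";
}
```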

Staff Details

It's tricky to rely on the performance score without some context. There may be many reasons why a person's score is good or bad, so wayahead gives you as much context as possible to put the performance score into perspective. To get the details, just click a person's name in the Staff Summary table. wayahead will scroll down to the bottom of the page and show a number of extra metrics for each person.

Alternatively, you can click people's names just above this details area (so you don't have to scroll up to the table every time). You can use the mouse to scroll left and right through the names. They are colour coded by team.

Items Reported

This section talks about the accuracy of the user's performance and their overall score. wayahead found 67 items Walter was involved with. Only 52 have both time planned AND logged, so 78% of tasks qualify for performance calculations. That's a 78% accuracy.

Overall, Walter used 15% less time than planned.

Solo vs Shared

The summary below shows you a breakdown of the person's items into tasks that were shared with others vs tasks this person worked on alone (solo).

Out of 18 tasks, 3 were solo (17%) and 15 were shared (83%).

What we see is that the solo tasks were all completed much faster than planned. The shared tasks, on the other hand, were overserviced by an average of 10%.

This is a common thing we see. Once the work is the responsibility of one person, they take it seriously and make sure they deliver on time. However, when the work is shared with (many) others, no one seems to take full responsibility to make sure it isn't overserviced.

I typically compare this with sending an email to 6 people and not getting any replies. But send it to 1 person with 5 others in cc and you'll get your answer.

If you see the results balanced out like this across the board, the advice is to start scheduling time for each individual, rather than planning the time by item.

Breakdown by item name

This one is pretty self-explanatory. This graph visualises which types of items stand out for the selected user. If a person is more efficient on artwork than on art direction, you may decide to let them do more artwork, give them more time on art direction, or send them on a training course to get better at it.

Spread of items

The spread of items is a scatter chart that plots out all items assigned to a specific person. The horizontal axis of the chart shows how large the item was – the further the dot sits to the right, the larger the task. The vertical axis shows whether the person used more or less time than planned.

In an ideal world, all the dots would sit around the 0hr variance middle line. But typically, the dots are scattered both above and below – sometimes you do a little better, sometimes you do a little worse.

But what we're looking out for here is item results that skew the average. In the image below we can see one task in the bottom right corner all by itself. We planned 133hrs of work, but this person only used 25hrs, a whopping 108 hours less than planned!

This could be a job well done. But it could just as well be a job that was cancelled halfway through.

The point is, this one item, with a performance score of 18% of the time used, is skewing this person's average score by a lot.

If the job was cancelled, it may pay off to go back into Streamtime and change the planned time to the 25 hours this person ended up using. That item then scores 100%, and the person's average moves back to a more realistic level.
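
If you want to reason about this numerically rather than visually, a small sketch like the one below (hypothetical field names again) flags items whose variance from the plan is large enough to skew the average:

```typescript
// Flag items whose variance (logged minus planned) is unusually large,
// mirroring the lone dots on the scatter chart.
interface PlottedItem {
  name: string;
  plannedHours: number;
  loggedHours: number;
}

function outliers(items: PlottedItem[], thresholdHours = 50): PlottedItem[] {
  return items.filter(
    (item) => Math.abs(item.loggedHours - item.plannedHours) > thresholdHours
  );
}

// The example item (133hrs planned, 25hrs logged) has a variance of -108hrs
// and would be flagged immediately.
```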

List of items

Finally, the report also offers a list of the processed data, so you can find out how a specific score came about.

It shows the job name, client name and item name. The Shared? icon shows whether a task was solo or shared. Then you can see the Logged and Planned time as well as the final score for each item.

The total score for a staff member, client or item is the average of all the items in the summary.

Note that some numbers in this area of the report are underlined with a dotted line.

Click these numbers to filter the list by the clicked type: all items, reported items only, solo items only or shared items only.
