We’re excited to bring Transform 2022 back in person on July 19 and virtually July 20-28. Join AI and data leaders for insightful conversations and exciting networking opportunities. Register today!
Software engineering management platform Jellyfish has launched what it calls the industry’s “first comparative benchmarking tool,” one that lets engineering leaders see how they are performing relative to other companies.
Jellyfish Benchmarks, as the product is called, is built on the company’s own internal data, which it aggregates when engineering teams choose to share their anonymized data with the wider pool.
Founded in 2017, Jellyfish’s core mission is to align the activities of engineering teams with the business objectives of companies. It does this by analyzing numerous technical “signals” drawn from developer tools such as issue trackers and source code management platforms, as well as project management tools. It’s all about identifying what teams are working on, tracking the progress they’re making, and assessing how individual teams and employees are performing.
Introducing aggregated, industry-wide engineering data adds more context to the mix, allowing companies to compare and contrast their internal numbers with those of their peers across industries.
So, what kind of benchmarks does Jellyfish offer now? Users can access over 50 metrics, including time invested in growth, issues resolved, deployment frequency, pull requests merged, coding days, incident rate, and mean time to repair (MTTR), among many others.
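To make one of those metrics concrete: MTTR is typically computed as the average elapsed time between an incident opening and its resolution. A minimal sketch of that calculation (the incident records here are hypothetical, and this is not Jellyfish’s actual implementation):

```python
from datetime import datetime, timedelta

def mttr_minutes(incidents):
    """Mean time to repair: average incident duration in minutes.

    `incidents` is a list of (opened, resolved) datetime pairs.
    """
    total = sum((resolved - opened for opened, resolved in incidents), timedelta())
    return total.total_seconds() / 60 / len(incidents)

# Hypothetical incident log: one 90-minute outage, one 45-minute outage
incidents = [
    (datetime(2022, 5, 1, 9, 0), datetime(2022, 5, 1, 10, 30)),
    (datetime(2022, 5, 3, 14, 0), datetime(2022, 5, 3, 14, 45)),
]
print(mttr_minutes(incidents))  # 67.5
```

On its own, a number like 67.5 minutes says little; the point of benchmarking is supplying the peer data that tells a team whether that figure is good or bad.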
“Importantly, Jellyfish includes benchmarking for how teams allocate or invest their time and resources – this helps teams understand how they compare in terms of time invested in innovation, support work, or keeping the lights on, for example,” Jellyfish head of product Krishna Kannan told VentureBeat.
At the time of writing, about 80% of Jellyfish customers choose to share their anonymized data with the benchmarking datasets, and only those that do will be able to benefit from the new product. The general idea: to get a little, you have to give a little.
“When Jellyfish customers come on board, they will be given the opportunity to leverage industry benchmarks based on anonymized datasets from other Jellyfish customers – customers who sign up will have their data anonymized and added to the Jellyfish customer benchmarking pool,” Kannan said. “In the rare instances where customers opt out of this capability, their dataset will not be added, nor will they be able to use benchmarking as a feature.”
While software development teams arguably have access to more technical data than ever before, that data alone doesn’t always reveal how well teams are actually performing on an ongoing basis – they may be doing well compared to their own historical numbers, yet still performing well below par compared to companies elsewhere. This is the core problem that Jellyfish Benchmarks is trying to tackle.
It’s worth noting that Jellyfish rival LinearB offers something similar in the form of engineering benchmarks spanning nine metrics. However, those stats are presented as a static chart on the company’s website rather than being incorporated directly into the platform, where companies could compare themselves against industry standards on a percentile basis.
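The percentile-basis comparison mentioned above is, in principle, a simple calculation: given an anonymized pool of peer values for a metric, a team’s percentile is the share of peers it outperforms. A minimal sketch under that assumption (the pool values are invented, and this is not how either vendor necessarily implements it):

```python
from bisect import bisect_left

def percentile_rank(value, pool):
    """Return the percentage of pool values strictly below `value`."""
    ordered = sorted(pool)
    return 100.0 * bisect_left(ordered, value) / len(ordered)

# Hypothetical anonymized peer pool: deployment frequency (deploys/week)
pool = [2, 3, 5, 5, 8, 10, 12, 15, 20, 30]
print(percentile_rank(12, pool))  # 60.0 – ahead of 60% of peers
```

Whether a higher percentile is better depends on the metric: a team wants a high percentile for deployment frequency but a low one for MTTR or incident rate.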
Jellyfish, for its part, targets dozens of metrics, opening up the tool’s appeal to a wider range of use cases.
“The reality we found is that different teams want to optimize different metrics depending on their product, stage, business goals and so on,” Kannan said. “That’s why we’ve included benchmarking for the metrics that matter most to our customers.”
VentureBeat’s mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Learn more about membership.