Ask HN: How to Estimate the Development Effort for a Feature? Seeking Advice

3 points by alexliu518 13 days ago

Hi everyone! I'm a web developer looking to get better at estimating the development effort for new features. When we receive a request for a new feature, accurately estimating the time and effort required can be challenging.

I'm curious to know how you all approach this task. What methods or tools do you use to ensure your estimates are as accurate as possible? Also, how do you avoid underestimating or overestimating the effort involved?

I'd love to hear about your strategies and any advice you have to share. Thanks in advance for your help!

tra3 12 days ago

Have a look at "Software Estimation: Demystifying the Black Art" by Steve McConnell [0]

I found this quote really illuminating:

> When executives ask for an “estimate,” they’re often asking for a commitment or for a plan to meet a target. The distinctions between estimates, targets, and commitments are critical to understanding what an estimate is, what an estimate is not, and how to make your estimates better.

It's a pretty deep book that discusses everything around estimating:

- how to estimate and what things get missed during estimation

- why estimates get missed

Highly recommended.

[0]: https://www.amazon.com/Software-Estimation-Demystifying-Deve...

solardev 12 days ago

I like to try to build tiny MVPs (or micro prototypes) during the planning process itself. If someone hands me a big project, I'll take a few days to break it down into bite-sized pieces and try to make a super simple prototype for each piece.

For example: what user flows are required (login, checkout, change settings, blah blah)? How many screens does each one take? How long does it take me to build 2 screens and the connection between them? I'll try it, time myself, and then multiply that out.

For the API, what endpoints are required? What's the most difficult one, like auth? Try to implement a simple version of that. Do the same for another, easier endpoint. Average them out. Buffer for shared infrastructure config time (all the CI/CD stuff, IAM, the dozen Docker and yaml files...). But still try to build a couple of small ones and then extrapolate from that.

Add in multipliers for tests, project management (especially if there's Agile bullshit and lots of Jira), QA, UI polish, accessibility, security testing, etc.

Whatever you end up with, add at least 20% on top of it.
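
To make the arithmetic concrete, here's roughly the shape of that calculation as a sketch. Every number and task name below is a made-up placeholder, not a recommendation:

    # Sketch of the "prototype, time it, extrapolate" math.
    # All numbers and task counts are hypothetical placeholders.
    hours_per_screen = 6         # measured by timing a 2-screen prototype
    hours_easy_endpoint = 3      # measured on a simple endpoint
    hours_hard_endpoint = 10     # measured on the hardest one (e.g. auth)

    screens, easy_endpoints, hard_endpoints = 12, 8, 2
    base = (screens * hours_per_screen
            + easy_endpoints * hours_easy_endpoint
            + hard_endpoints * hours_hard_endpoint)

    base += 24                   # flat buffer for CI/CD, IAM, Docker/yaml config

    # Multipliers for tests, PM/Jira overhead, QA, UI polish, accessibility, security...
    for multiplier in (1.3, 1.15, 1.1, 1.1):
        base *= multiplier

    estimate = base * 1.2        # and at least 20% on top of all that
    print(round(estimate), "hours")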

It's always a negotiation. If you come in with a number on the high side, it sets more realistic expectations and might get you additional resources. If you come in low, you'll just end up cutting corners and disappointing everyone in the end.

Document the hell out of everything, present it as a retrospective of the process itself, and use it to improve the estimation and development process going forward (this, IMHO, is the real value, more than any single estimate in and of itself). It lets you identify bottlenecks and the "what I didn't know I didn't know" pain points, which are different for every dev, i.e. the things they ended up missing their estimates on by a lot.

Mostly, the thing to keep in mind is that the software estimation process is itself an algorithm that can be debugged, refactored, and improved over time.

alxmng 12 days ago

I find it helpful to first assess uncertainty.

1 = I’ve done this before

2 = Someone else in the organization has done this before

3 = Someone outside the organization has done this before

4 = Nobody has done this before

1 and 2 can be quickly estimated by referring to past work.

3 and 4 must be broken down into small tasks that are estimated.

And when breaking things down into tasks to estimate, keep in mind that coding is only around 1/4 of the work to produce software. There are tests, documentation, revisions, planning, and communication.
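
As a rough sketch of how this can be applied (the mapping and the 4x factor are just illustrations of the idea, not hard numbers):

    # Illustrative only: pick an approach from the uncertainty level,
    # then scale raw coding time because coding is roughly 1/4 of the work.
    APPROACH = {
        1: "estimate directly from your own past work",
        2: "ask the colleague who did it and reuse their actuals",
        3: "break it into small tasks; research how others solved it",
        4: "break it down and budget an exploration spike first",
    }

    def total_effort(coding_hours):
        # tests, documentation, revisions, planning, communication...
        return coding_hours * 4

    print(APPROACH[3])
    print(total_effort(10), "hours")   # 10h of coding -> ~40h of real work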

  • AnimalMuppet 11 days ago

    For 4 (and sometimes 3), you need a mini-project for exploration. "After 2 months of work, I'll be able to give you an estimate." Management never wants to hear that, but sometimes it's the truth.

Leftium 12 days ago

EBS: Evidence Based Scheduling [1]

I've never used EBS myself; only read about it...
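
My understanding of the core idea: track the estimate/actual ratio ("velocity") on tasks you've already finished, then Monte Carlo the remaining work by sampling those velocities. A rough sketch of that (my reading of the article, not something I've run in practice):

    import random

    # Velocity = estimated / actual for finished tasks,
    # e.g. estimated 4h, took 8h -> velocity 0.5. Numbers are made up.
    velocities = [0.6, 1.0, 0.5, 0.9, 0.45, 1.1]
    remaining_estimates = [4, 8, 2, 16, 6]   # hours, your current estimates

    def one_simulation():
        # Divide each estimate by a randomly chosen historical velocity.
        return sum(est / random.choice(velocities) for est in remaining_estimates)

    runs = sorted(one_simulation() for _ in range(10_000))
    print("50% confidence:", round(runs[len(runs) // 2]), "hours")
    print("90% confidence:", round(runs[int(len(runs) * 0.9)]), "hours")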

A client required project estimates, but only gave vague requirements that constantly ended up changing anyway. And they always asked me to work on stuff outside the scope of the contract (help debug this, add that feature, etc.).

Also they balked if I ever estimated anything over two weeks.

So what we ended up doing:

1. I gave my best overall estimate, but only contracted for two weeks at a time.

2. After two weeks, we'd re-evaluate the situation and repeat.

This worked pretty well for us.

---

[1]: https://www.joelonsoftware.com/2007/10/26/evidence-based-sch...

  • Leftium 12 days ago

    Another thing I observed was that building something (at least) twice is a valid development process.

    1. The first time, the client hammers out their (often changing) requirements and you get a pretty good understanding of the problem that actually needs to be solved. So the main objective of the first build is actually just gathering the requirements. You also uncover issues that may have been missed by just "thinking" about the requirements.

    2. The second time almost never takes as long to develop, so the time to build that first version is a good hard upper limit. The second version usually takes significantly less time because it involves copy-pasting a lot of the original code and just refining it.

genezeta 13 days ago

Mostly experience.

And unfortunately it's mostly ad-hoc experience. That is, my particular experience won't generally be applicable to your particular situation. You have a different team, different preparation, skill level... You're using different tools, workflows, methods. Your projects and features can be different from mine. You get the point.

But fortunately it's experience that can be fairly easily acquired and organized if you make the effort. You only have to measure and follow through on your estimates and their accuracy. You start by making mostly educated guesses. Then you go ahead, do the work, and compare the estimates with the time you actually took. You correct for the next estimate.

Loop through this several times and you generally end up with better estimates with each iteration. If you do it in an organized and consistent way, obviously.
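
In practice this can be as simple as keeping a log of estimated vs. actual time and deriving a personal correction factor. The numbers below are only an illustration:

    # Keep (estimated, actual) pairs per task; correct the next raw estimate.
    history = [(8, 13), (3, 4), (16, 21), (5, 9)]   # hours, illustrative

    correction = sum(actual for _, actual in history) / sum(est for est, _ in history)

    next_raw_estimate = 10
    print(f"corrected: {next_raw_estimate * correction:.1f} hours")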

Some of the keys for making this work, in no particular order:

- Break down larger tasks into smaller ones and estimate the small ones. It's easier to correctly estimate a smaller task. Also, the errors should generally be smaller.

- Following the previous point: make sure you account for "hidden tasks". That is, you need to consider those parts of the work you just assume tacitly and make them explicit: setup, infrastructure, bureaucracy, etc. You need to write down all tasks when you break down large tasks into smaller ones. You can, to some extent, account for this through an added safety margin, like multiplying by 1.2 or whatever. If you do that, I'd say start by multiplying by a larger factor (1.5 or even 2) and, in time, with the added experience and better ability to spot hidden tasks, reduce that factor accordingly.

- Involve the people doing the actual work in the estimation. Don't have one person do the estimate and a different one do the work. Also, don't let deadlines become estimates. I mean, you may not be able to avoid a deadline, but do a proper estimation anyway. If not for anything else, at least for the consistency of the estimation process.

- Give and use actual estimations. An estimation is not a single number, but a pair of numbers (sometimes even 3 or 4). You may organize your estimations as a "duration + confidence level" (e.g. 3 days with a confidence of 80%), as a "lower and upper bound" (e.g. at least 3 days, at most 5), or as "durations at various levels of performance" (e.g. if all goes very well 3 days, normally 4 days, if X thing happens then 6 days). Different styles work better for different people and circumstances. I like the last one because it helps bring up potential hidden tasks. But whatever you do, try to stick to the idea that a single-number estimate is just useless.
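
For instance, that last style could be recorded as something like this (the field names and numbers are just an illustration):

    # Illustrative record of a "durations at various levels of performance" estimate.
    estimate = {
        "best_case_days": 3,     # if all goes very well
        "normal_days": 4,        # the expected path
        "if_X_happens_days": 6,  # the known risk materializes
    }
    # Report the whole thing (or at least the range), never one number.
    print(estimate["best_case_days"], "to", estimate["if_X_happens_days"], "days")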