🍩 Doughnut Reader 🍩

Libyear

libyear.com

Crier1002

116 comments

11 days ago

As a person who works on automated dependency updates (depshub.com), the libyear indicator is often not very useful. There are several other indicators to consider, such as release frequency, update type (major/minor/patch), the dependency's criticality for your project, etc.

Instead of solely focusing on reducing the libyear for your projects, a better approach is to minimize the steps needed to keep your project reasonably up to date. For instance, think about managing 20 PRs weekly to update various package.json packages versus 1 PR for critical dependencies when necessary.

It's important to note that updating dependencies is not a consistent task that can be done at the same pace all the time. Expect varying update volumes and complexities that may need attention at different times. Setting a fixed configuration for, let's say, 10 updates per week may not be effective, as it could lead to dealing with unnecessary updates regularly (e.g., aws-cli, which has almost daily releases).

Finding the right balance between keeping your project up to date and spending too much time on dealing with dependencies is the hardest part here that doesn't have a 100% right answer yet.

semanser

11 days ago

I've been doing this work for a while and been pushing my org to move from updating the "base layer" every 2-4 years (Ubuntu LTS versions) down to every six months (each NixOS release).

I think one thing I've increasingly found is that it's important to set up the infrastructure for parallel building— it's not realistic to have a flag day twice a year, and it's not realistic to try to test everything first "on a branch". If you can have a transitional period of a few weeks where the product outputs (containers, dev environments, whatever) are consistently available in both the old and new flavour, then you can invite people to try the updated thing, while still having an escape lane to keep using the older thing if stuff turns out to be broken in a way that's beyond their capacity to correct in the moment.

mikepurvis

11 days ago

I'm interested in knowing more about how companies typically gauge the freshness of their codebase's dependencies. Putting all the nuances/details aside, I think we can all agree that a codebase where most dependencies are over 8 years old is a pretty clear indication that it's way overdue for an update, right?

I was thinking: would it be helpful to keep track of how far behind each dependency is in terms of minor, patch, and major updates? But this seems a bit too complex to explain to management. I'm trying to figure out the best way to explain this to management so they understand why it's important to stay current. Any ideas on how we can measure improvements? Maybe we should agree on a few key factors to track our progress and see if we're getting better or worse.

Crier1002

10 days ago

> I was thinking, would it be helpful to keep track of how far behind each dependency is in terms of minor, patch, and major updates?

This is exactly what I've added for depshub.com, and people seem to like it a lot. It just gives you better visibility across all of your connected repositories about what the current status of each dependency is and how the major vs. minor vs. patch ratio changes over time. While it's still a naive metric, it's the easiest to understand and visualize - and as a result, the one that is used the most.

> Any ideas on how we can measure improvements?

- Quantitative: spend as little time as possible (hours/month) on keeping everything relatively up to date.
- Qualitative: no open CVEs, no outstanding major updates for core libraries and tools.
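For illustration, the major/minor/patch breakdown mentioned above is cheap to compute. Here's a naive sketch that ignores pre-release tags and real semver edge cases; the package names and versions are made up:

```python
def update_type(current: str, latest: str) -> str:
    """Classify the gap between two plain x.y.z versions (no pre-release tags)."""
    cur = [int(p) for p in current.split(".")]
    new = [int(p) for p in latest.split(".")]
    for label, c, n in zip(("major", "minor", "patch"), cur, new):
        if n != c:
            return label
    return "none"

# Made-up snapshot of a project's dependencies:
deps = {"react": ("17.0.2", "18.2.0"), "lodash": ("4.17.20", "4.17.21")}
print({name: update_type(cur, new) for name, (cur, new) in deps.items()})
# {'react': 'major', 'lodash': 'patch'}
```

Tracking how that ratio shifts over time is the visibility part: a growing "major" share usually means upgrade friction is accumulating.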

semanser

10 days ago

if it ain't broke

fragmede

10 days ago

Isn't "we are 5 major releases behind" more obvious than "two libyears"?

> If your system has a one year old dependency and a three year old dependency, then your whole system is four libyears old.

How far down the tree do we go? Either fully, which means a project with 365 one-day-old dependencies is 1 libyear old. Or not at all, as the Rails example suggests, in which case if I have a wrapper around Rails that I pin to an old Rails version, anyone using my wrapper would have an older Rails but fewer libyears?

There is no single answer to all of this, because it's too complex to boil down to a single number. But I think it's a bit odd to introduce a whole new thing that doesn't measure at all what's changed.

IanCal

11 days ago

I had the same immediate reaction that you did, but the key here is that the age of dependencies is not absolute but calculated against the latest release. So a one-day-old dependency with no succeeding version is 0 libyears old, not 1/365.24. Likewise for a decade-old release, if it is in fact the latest release.

It’s about the time delta between the used version and the latest available version.

eadmund

11 days ago

But this still means that you're either on the latest version or 39 libyears behind (if you count all dependencies of dependencies of dependencies...)

thih9

11 days ago

Adding transitive dependencies might be trickier, because then you can't tell which of your direct dependencies to upgrade.

It might be more useful to assign each direct dependency's transitives to it; then you can do the maths to figure out whether updating a direct dependency actually has an effect (e.g. two direct dependencies that both pull in the same 50-libyear dependency aren't worth upgrading unless you upgrade both of them).
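Sketching that idea with invented numbers: a transitive dependency's libyears only go away once every direct dependency pinning it has been upgraded.

```python
# Invented data: one ancient transitive dep pinned by two direct deps.
transitive_libyears = {"old_lib": 50.0}
pinned_by = {"old_lib": {"direct_a", "direct_b"}}

def libyears_removed(upgraded_directs: set) -> float:
    """Libyears eliminated: counts a transitive dep only if all its parents upgrade."""
    return sum(age for dep, age in transitive_libyears.items()
               if pinned_by[dep] <= upgraded_directs)  # <= is a subset test

print(libyears_removed({"direct_a"}))              # 0.0: still pinned by direct_b
print(libyears_removed({"direct_a", "direct_b"}))  # 50.0
```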

robertlagrant

11 days ago

I think each dependency is only counted once, even if it’s recursively included many times.

tlb

11 days ago

> There is no single answer to all of this, because it's too complex to boil down to a single number.

Having a single number that is a rough measure is still useful, though perhaps a (weighted?) average would be more useful.

Anyway, I just downloaded the Python tool (called "libyear" on PyPI), ran it, and quite quickly found three dependencies in my project that were over 2 years behind. That was helpful and I would use it again.

calpaterson

11 days ago

They're tracking different related things. I run a startup in this space and we track: aggregate libyear of your direct dependencies; total # of direct dependencies with libyear > 2; # of direct dependencies at least one major version behind; dependencies that have been abandoned by the maintainer.

I think the top-line aggregate libyear number is helpful to monitor over time to get a general sense of the slope of your technical debt. If the number is trending upwards then your situation is getting worse and you're increasing the chance you find yourself in an emergency (i.e., a CVE comes out and you're on an unsupported release line and need to go up major versions to take the patch).

Tracking total # of major versions behind gets at the same thing but it's less informative. If you're on v1 of some package that has a v2 but is actively releasing patches for the v1 line that should be a lower priority upgrade than some other package where your v1 line hasn't had a release in 5 years.
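For illustration, those four numbers fall out of a simple dependency inventory. Everything below is invented data, not actual tooling; note how an abandoned dependency can sit at 0 libyears, which is why it's tracked separately:

```python
from datetime import date

# Invented inventory: name -> (used release date, latest release date,
#                              major versions behind, abandoned by maintainer?)
deps = {
    "left-pad": (date(2015, 3, 1), date(2015, 3, 1), 0, True),
    "requests": (date(2020, 6, 1), date(2024, 1, 1), 0, False),
    "old-orm":  (date(2018, 1, 1), date(2023, 1, 1), 2, False),
}

def libyears(used, latest):
    return (latest - used).days / 365.25

aggregate = sum(libyears(u, l) for u, l, _, _ in deps.values())
stale = sum(1 for u, l, _, _ in deps.values() if libyears(u, l) > 2)
behind_major = sum(1 for _, _, m, _ in deps.values() if m > 0)
abandoned = [n for n, (_, _, _, a) in deps.items() if a]

print(round(aggregate, 1), stale, behind_major, abandoned)
# 8.6 2 1 ['left-pad']
```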

stevepike

11 days ago

It feels like it just has so many weird edge cases. A stable 2.3 branch that hasn't changed while the 1.2 branch has major security issues punishes you for not using the 1.x version.

A regularly updated 1.x branch for docs/security looks like you're doing fine even though the project is on 3.x and deprecating soon.

Perhaps as a vague guide to point to potential issues, sure.

IanCal

10 days ago

I think it only counts direct dependencies.

d-k-bo

11 days ago

So that means that the way to game this would be to create a wrapper project (e.g. called "external_deps") for all of my dependencies, and then have my actual project depend on that one. So I'm externalizing all of my tech debt onto external_deps and as long as I never make a new release of that, my main project will always be 0 libyears old.

falcor84

11 days ago

Even if you could, why would you? It'd be the same as, say, forking projectX to myprojectX and saying you're at the latest version.

You're the one who benefits from knowing you're behind, not someone else.

riffraff

11 days ago

Wouldn't it be easier to "game" the tool by just not using it?

alanbernstein

11 days ago

People already do that with vendoring, too. There was a post on HN the other week about someone "collateralizing" their tech debt by vendoring a Rust dependency with no update path to remove its CVE/deprecation/obsolescence warnings: https://news.ycombinator.com/item?id=39827645

A metric like this can't keep you honest (just about no matter how you design it, people will find loopholes), but it can help honest people document their needs.

WorldMaker

10 days ago

> So that means that the way to game this would be to create a wrapper project (e.g. called "external_deps") for all of my dependencies, and then have my actual project depend on that one

No, that kind of chicanery doesn't actually manipulate the metric.

In this case, the libraries you want to use become transitive dependencies, but if your code uses those transitive dependencies, then your project still depends on them.

gwbas1c

11 days ago

Then the Rails example on the main page is wrong.

IanCal

10 days ago

Look up Goodhart's Law.

PokestarFan

11 days ago

What's nice about the "libyear" concept is that it exposes the cost of indiscriminately pulling in 3rd party libraries.

I.e., it discourages someone from pulling in a library just to use one tiny function that they could recreate in 10-15 minutes.

gwbas1c

11 days ago

That sounds entirely orthogonal to the goal, which isn't a good thing for a metric.

IanCal

10 days ago

I can't see this as being particularly useful. Libraries under active development will have larger, more frequent releases (but a small number of libyears out of date), whereas mature software with only occasional bug fixes may be tens of libyears out of date.

barbegal

11 days ago

This also wrongly assumes that newer is better.

For example, CLDR changed the UK abbreviation for September from "Sep" to "Sept" and broke a lot of code as libraries used newer versions of the data https://unicode-org.atlassian.net/browse/CLDR-14412

rrwo

11 days ago

If you're on a 3 year old version of the library because the library introduced a change which you will never be able to adopt so you're forever stuck on the 3 year old version, you're in a much worse position than if you're just 3 versions behind because you haven't taken the time to upgrade yet. As such, libyears become an optimistic measure of badness in that situation.

mort96

11 days ago

What if the library's new features aren't useful to your project and don't fix any bug you might hit in your use case?

prmoustache

11 days ago

If you're going to audit your dependencies sufficiently to know that then you don't need a tool like this anyway?

OJFord

11 days ago

A tool like that won't replace auditing dependencies.

The total age of dependencies tells you nothing useful.

rrwo

11 days ago

Nor did I claim it would. As I said, if you are auditing your dependencies like that then you don't need it; it's not going to give you any extra information.

If you're not, and very many people are not, then total age of dependencies is a decent low-effort approximation for the probability of bug fixes affecting parts of dependencies that you're using.

OJFord

11 days ago

What if security fixes are useful to your project?

mort96

11 days ago

I count security fixes among "bugs that you would hit in your use case".

I don't care about CVEs that only affect functions my app does not use.

prmoustache

11 days ago

Why are you in a worse position?

That depends on the changes to the library since, and how and where the library is used.

Suppose I regularly generate a CSV file, all ASCII, where all the rows are integers or fixed precision numbers. I have a ten year old CSV library that processes that file, and has worked without any problem for ten years.

I have no interest in updating the library. Updates can introduce downtime but provide no improvement. In fact, they introduce a slight performance hit because of new features that I don't need. There is also the risk that an update will introduce bugs, and then I'll have to spend time diagnosing them and coming up with a fix.

Now let me reverse this: suppose there are two libraries to do the same task, A and B. They don't have the same features, but for your use case, they are both easy to use and do exactly what you need.

A was first released in the 1980s and was last updated five years ago. It's still maintained and is available in most Linux distributions.

B was first released three years ago and has had 20 updates since, 18 of which included fixes for security issues that don't affect A. (The website for A is regularly updated to indicate that it has been tested and these issues do not affect it.)

Are you better off using A or B?

rrwo

11 days ago

> Why are you in a worse position?

Because, in general, as you drift behind, the friction of upgrading will increase.

You might not need to update today, but you're not in control of external events that may force your hand (sudden critical security vulns).

> Are you better off using A or B?

In this contrived example, it depends.

growse

8 days ago

I see the overall point as not treating every dependency as something that needs upgrading.

Any library that is effectively a dataset could fall into this as well: if you want to freeze your environment at a specific reference point and only update the actual moving parts, the libyear measurement won't be for you.

This reminds me of interface software that keeps old versions of some libraries to emulate the original behavior, but in a controlled and isolated way.

makeitdouble

11 days ago

On the other hand, it was only somewhat recently that CLDR acknowledged that languages with noun inflection exist, so it’s kind of a wash. E.g. in Russian, Ukrainian, and Belarusian (at least) you use the nominative of the month’s name in May 2024, but the genitive in 9 May 2024, etc., rendering most older allegedly-localized software that used a generic list of month names ungrammatical.

mananaysiempre

11 days ago

You might, but I didn't.

My immediate thought looking at this number was not that it should be minimized, but that there ought to be a sweet-spot range: a number below which it probably shouldn't go, and a number above which it definitely shouldn't.

pydry

11 days ago

It's always context-dependent. Take Lisp languages. For Common Lisp, when I see a library that was last released or updated 10 years ago, I'm thinking it's probably as feature-complete as it's ever going to be, and otherwise perfectly fine. The same for Emacs Lisp? I'm thinking it's generations out of date and has a solid chance of not working anymore. The difference here is between a battle-tested, standardized (ANSI) platform (CL) and a fast-evolving one (Emacs).

TeMPOraL

11 days ago

[deleted]

11 days ago

What was the reasoning behind that change?

BlueTemplar

11 days ago

I've tried out some of the libraries, and it looks like they do calculate the difference between the installed version and the last (stable) release. If a dependency hasn't seen a release in ten years, those ten years don't count against the dependency drift. This is exactly what I would want.

However, they only check openly accessible (i.e. OSS) dependencies. If one of those hasn't seen a release in ten years, I would look for an alternative.

fdw

11 days ago

Agreed. If the dependency is under active development then it should be only counted as being behind if there is a newer version released for that dependency. The libyear should be calculated as "latest version's release date" - "currently used version's release date".

What complicates this is deciding whether the dependency is under active development or not. If it's EOL'd then you still want libyears to accumulate, even if you use the latest released version. I guess comparing to an end-of-life date would then make sense, but it's probably harder to keep track of.

planede

11 days ago

If a project is not under active development it may just be "done". How many minor version bumps per year does left-pad need?

__alexs

11 days ago

Maybe anything so simple that it can be considered "done" shouldn't really be an external dependency in the first place.

planede

11 days ago

Do you write your own JSON parser?

__alexs

11 days ago

Is there a JSON parser library that is not under active development?

planede

11 days ago

Agreed. Freshness is not a good metric to quantify the quality of a library.

Aldipower

11 days ago

Anecdotally, the Python tool reported 0 for one of my dependencies that was up to date, even though it hadn't seen a release in at least a year.

A more accurate (but more unwieldy to measure) metric would be to count the lines of code changed between the version used and the most recent stable version. (I think this is what commenter amelius implied?) It wouldn't quite capture the nature of the changes made, but it would decouple the metric from the dubious assumption that all libraries are developed at exactly the same pace.
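A rough sketch of that, using `git diff --shortstat` between release tags. The repo path and tag names are assumptions; only the parsing half is exercised below:

```python
import re
import subprocess

def parse_shortstat(line: str) -> int:
    """Sum insertions and deletions from a `git diff --shortstat` summary line."""
    nums = re.findall(r"(\d+) (?:insertion|deletion)", line)
    return sum(int(n) for n in nums)

def churn_between(repo: str, used_tag: str, latest_tag: str) -> int:
    """Changed lines between the release tag you use and the latest one."""
    out = subprocess.run(
        ["git", "-C", repo, "diff", "--shortstat", used_tag, latest_tag],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_shortstat(out)

print(parse_shortstat(" 3 files changed, 120 insertions(+), 15 deletions(-)"))  # 135
```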

yunruse

11 days ago

I don't think that lines of code is a good metric here. A few lines of code can fix a major security issue in parts of a dependency that you actually use, while thousands of lines of code can just add new features that you are not using anyway, otherwise you would have upgraded already.

planede

11 days ago

Well, it's up to you to interpret the libyears how you like. It doesn't have to be that more libyears is worse, but it will mean that you're missing X libyears of security updates (and also X libyears of potential new bugs).

stavros

11 days ago

Well, no, you can't. Adding them together implicitly asserts that libyears from different dependencies are equivalent.

edflsafoiewq

11 days ago

Yes, and that's the point of the metric. The observation is that libyear is a dimension, so libyears are additive.

If you value some library years more, and some less, then weight the sum.

It's like saying there is no point in natural numbers, because when you count apples, some of the apples might be rotten.

bmacho

11 days ago

It's saying you can't add different units together.

edflsafoiewq

10 days ago

Yes it does. And you can say "this project has more libyears, so it's more mature".

stavros

11 days ago

That really doesn't follow. A barely maintained mess of a personal project running for 10 years is not more mature than a 7 year project heavily developed and released by a high quality team, with 3 years of stability in the main API and in use around the world.

IanCal

11 days ago

Are you saying that libyears is a pointless metric? That's what your comment implies.

stavros

11 days ago

Broadly, yes.

IanCal

11 days ago

The linked website does not explain what a libyear is. It gives two examples:

> Rails 5.0.0 (June 2016) is 1 libyear behind 5.1.2 (June 2017).

and

> If your system has a one year old dependency and a three year old dependency, then your whole system is four libyears old.

which don't explain much.

I suspect what they meant to say is that Rails 5.0.0 (June 2016) is 1 year (not libyear) behind 5.1.2 (June 2017) and that the "libyear age" of a project is the sum of how old each of its dependencies is. But if so, they should say so clearly somewhere on their page.
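Under that reading, the site's one-plus-three example works out like this (a sketch with made-up dates; "now" is pinned to a fixed date so the numbers are deterministic):

```python
from datetime import date

TODAY = date(2024, 4, 15)  # pretend "now", to keep the example deterministic

# Made-up project: one dependency a year behind, one three years behind.
deps = {
    "dep_one":   (TODAY.replace(year=2023), TODAY.replace(year=2024)),
    "dep_three": (TODAY.replace(year=2021), TODAY.replace(year=2024)),
}

total = sum((latest - used).days / 365.25 for used, latest in deps.values())
print(round(total))  # 4, i.e. "four libyears old", matching the site's example
```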

umanwizard

11 days ago

I guessed the same thing but didn't even notice it was unclear, I just assumed my guess was probably right ...

The concept really does seem obvious, especially since it sounds like "man-year", but it needs better documentation.

eternityforest

11 days ago

What confused me was saying that some version of Rails is some number of libyears behind some other version, when in order to know that you'd need to inspect the dependencies of Rails itself...

umanwizard

11 days ago

Just remember that blindly updating dependencies that no one trustworthy has reviewed is opening yourself up to supply chain attacks.

Blindly upgrading is worse than never upgrading unless you are addressing a specific CVE that impacts you.

Public open source code is code you did not have to write which can be a time saver, but you do not get to skip code review.

If you do not have time to review 2000 dependencies, then you should drop them in favor of simple functions that only do what you need.

lrvick

11 days ago

[flagged]

Culonavirus

11 days ago

Shameless plug:

If your team really wants more dependencies than they have the capacity to review, my team and I are happy to help greatly reduce this risk.

My company (https://distrust.co) has reviewed hundreds of dependencies on behalf of our clients, and some even pay for this as a monthly retainer service.

JS dependency debt is good for my business.

lrvick

11 days ago

It's just an absurd idea that your typical, average "modern JS" devs using React + Redux/Next.js (with a truckload of other deps), for example (as is pretty common these days), will review (or wait for a review of, or pay for reviews of) the dep tree. People have enough on their plate; I'm willing to bet my life that 90% of JS devs don't review their deps. It's preposterous to even suggest it. Just look into the node_modules of these projects. This is done in maybe the top 10% of the most rich AND security-minded businesses out there, mostly in FAANG or the banking/govt sector. Individual devs or small teams? Forget it. I've never worked at a place that did so, or even heard of one.

It may be different with other languages and frameworks that may be moving more slowly or the dep tree is much flatter, but JS moves fast, browsers change often, and the dep trees are massive. Bugfixes and small feature upgrades could easily change a truckload of dep code.

Then, as if it wasn't enough, I'm not even sure how would you review something like an Electron-based app... what, you're gonna review the entire Electron-Chromium-Node chain on top of all the mess mentioned above? Sure :D

Culonavirus

11 days ago

I would double down on your claim and say 99.9% of JS devs don't review dependencies. Also 100 years ago almost zero doctors washed their hands. The majority of people in a profession doing something does not make it any less negligent.

My customer base is almost exclusively companies in industries like Fintech where giving strangers the ability to push any code they want to production is undesirable.

In one case, the customer has their own internal team review all dependencies -and- has my team review them, and only adds a dependency to the allow list when it gets a thumbs up from both.

lrvick

11 days ago

What guarantees do you offer?

playingalong

11 days ago

We are not a cyber insurance firm, so we don't offer any guarantees or compensation if you get compromised. We operate the same as any security auditing firm.

One of our clients reviews all their dependencies in-house with their own team and has us be the second pass for extra assurance. They have also tested our team in the past by not revealing vulnerabilities they have already identified to be sure we are indeed reviewing the way we say we do.

Security bugs are everywhere because very few internet rando library authors have infosec experience, so you are unlikely to get a clean report on any given dependency tree, but /usually/ the items are low to medium risk and can be chased upstream or easily patched out.

Sometimes, however, we find really problematic ones like CVE-2018-9057, which impacts you if you have ever used it even once.

Also, there is an economy of scale here: if one of our other clients asked for a review of a React dependency last week, we can copy over our opinion and consume fewer hours on the clock.

lrvick

11 days ago

libyear assumes that constant development happens all the time in a library and that software has to change and grow constantly. Some libraries are just mature and don't need to change. 5.0.0 could have been released 5 years ago, and 5.0.1 today with just changes to the docs. It doesn't mean that it took 5 libyears to develop and release the new version. This is the type of thinking the xz attackers used when trying to pressure the original author into adding more official maintainers, because he wasn't merging and updating code fast enough: "You're not merging and releasing our code fast enough, therefore you're doing a bad job."

What would other, similarly useful measures be? Lines of code, or story points? Maybe even the number of tests added?

agilob

11 days ago

That is an argument for libyears, not against them. If they only updated the docs, why haven't you upgraded already? If anything, projects using stable libraries like this could very easily reach 0 libyears, because there are no breaking changes when upgrading.

quassy

11 days ago

It's for and against at the same time; it depends on your software upgrade roadmap. I think it's just a silly number that is correct and incorrect at the same time, but probably good enough at a large scale, just like story points.

agilob

11 days ago

I went through this project before.

For some applications it might be of great use, but for a vast and complex application architecture, the libyear metric can only oversimplify the complexity of dependency management: compatibility issues, updates and security patches, etc.

I noticed that it focuses only on the age of dependencies without considering other factors like how critical the update is, how stable it is, the improvements in newer versions, etc.

avi_vallarapu

11 days ago

It's a tradeoff between a simple, easily quantifiable measure and a more subjective, more complex, and more accurate one.

Libyear seems like a decent starting measure if there is no appetite for something more in-depth, IMO and YMMV.

liampulles

11 days ago

> Libyear seems like a decent starting measure if there is no appetite for something more in-depth

Maybe, but couldn't measuring, and thus reacting to, a bad measure be worse than doing nothing?

KingMob

11 days ago

Sure, it's almost like you need to exercise judgement when selecting metrics and planning your work.

verve_rat

11 days ago

Right, but that means it's not a decent starting measure.

To me, "decent starting measure if there is no appetite for something more in-depth" sounds like "just drop it in, it's enough to get started, we'll figure out the rest later", but that could be temporarily harmful.

Exercising judgment is the opposite of that, no? Then you're going into depth.

KingMob

10 days ago

At a previous company we had this big, filterable web-app matrix which listed every project dependency; the neat thing was that you could tag dependencies to add weight and importance.

Initially I thought it would need to be more complex, but it was more than enough.

Raed667

11 days ago

What's nice about the "libyear" concept is that it exposes the cost of indiscriminately pulling in 3rd party libraries.

For example, I recently went through a project to bring 3rd party dependencies up-to-date. I noticed that we were using a very old mathematics and statistics library.

On closer inspection, we were only using one function from the library to calculate the mode of a list of numbers. Looking at the library's source code, it was about 50 lines.

Now, a decent programmer can recreate such a function in 1-3 hours, including a unit test. This is what we did instead of including the dependency.

A naive programmer might think that sucking in the 3rd party library "saves" 1-3 hours, but that's not the case: Every few years someone will audit the libraries we're using as part of a security audit, or a legal audit to make sure that we're in compliance with open-source libraries. The stats library will incur a 1-2 hour cost in each audit.

Furthermore, every 1-4 years we'll need to update the library, because changes in the programming language, runtime, OS, etc. mean we'll need at minimum a recompile or similar tweak to take advantage of some new language feature or constraint. The 3rd party dependency could add 1-4 hours to such a project.

Thus, because libyear shows an increasing cost associated with the library, it's easier to explain why it's better to spend 1-3 hours writing a simple function (and unit test) than to bring in a 3rd party library to do the same thing.

gwbas1c

11 days ago

One step away from rediscovering voltime.

Time flows faster in periods of high volatility and slower in periods of low volatility. Instead of measuring time directly it should be adjusted by things like changes committed, LOC added/removed, CVEs opened/closed, etc.
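A toy sketch of that idea: weight each year you're behind by that year's activity relative to the average, instead of counting calendar time flat. The activity numbers (say, commits per year) are invented.

```python
activity = {2021: 40, 2022: 5, 2023: 120}  # invented commits-per-year data

def volyears(years_behind: list) -> float:
    """Calendar years behind, each scaled by that year's activity vs the mean."""
    mean = sum(activity.values()) / len(activity)
    return sum(activity[y] / mean for y in years_behind)

print(round(volyears([2022]), 2))        # a quiet year barely counts
print(round(volyears([2021, 2023]), 2))  # busy years dominate the score
```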

baq

11 days ago

It matters a lot what kind of application the library is for. Being a couple of years behind in mobile application development usually means you have to spend a week piecing together a development environment to get the crap to build, so you can even start the library upgrade process.

A couple of years hither or thither with grey-haired Java libraries matters very little. There might be some vulnerabilities, but you probably know about them and have workarounds, and sometime next year it's likely you'll be allowed a month or two to do 'life cycle management' in the dependency stack.

cess11

11 days ago

It's nice to have a name for this, and for it to be quantifiable. I could see this on some kind of product dashboard - maybe automatically generated.

But, while I appreciate the need for simplicity, I also wonder if it would be wise to scale dependencies by how prevalent they are in the codebase. For example, if I'm using a five-year-old version of React but the library I use to convert temperature units is up to date, then that's bad. But if I'm using the latest React and the conversion lib is old, then that's less bad.

Probably feature creep though...

andyjohnson0

11 days ago

The features that you described are somewhat close to what I'm trying to build with depshub.com. Dependency visibility is still a major problem in any engineering team that cares about dependencies, and it's often very hard to say if a project is moving in the right direction in terms of updates. Some teams just completely ignore the fact that they need to update dependencies, but this usually comes with the consequence of "updating ASAP because we need X feature or Y bugfix."

All the major tools (Dependabot, Renovate) for keeping dependencies up to date treat all dependencies equally, when in reality there are always core libraries (e.g., react) and everything else. While trying to keep *everything* up to date is extremely challenging, what I'm trying to do is find a balance between what needs to be updated and when (using static code analysis, different data sources, AI, etc.) and automate it in a simple manner.

semanser

10 days ago

They've succeeded in creating a single metric that's easy to calculate, but IMHO, it fails to be very useful for common use cases.

Basically, it just uses the difference between the date the library version you are using was released and the current date, if there's a newer release available.

E.g., if you are using a library that has been unchanged at 1.0.0 for the last 10 years, you'll be 0 libyears behind that whole time. Then one day the developers of that library release 1.0.1. One minute after that hits the package repositories, you are immediately 10 libyears behind.

This makes it pretty useless as a metric for tracking how outdated an application really is. E.g., as an ops/SRE/security person, I'd want to be able to run this on a product team's code and get a single number that tells me whether they're reasonably up to date or seem to be ignoring their dependencies and letting technical debt pile up. A team could've been on the ball, keeping every dependency updated daily for years, but if I use libyear to evaluate them right after that 10-year-old dependency updates, it's going to look like they've been negligent.

I have an open issue on the Python implementation (which ironically(?) hasn't had any commits in three years) asking for clarification: https://github.com/nasirhjafri/libyear/issues/35

thraxil

11 days ago

I'm the developer of depshub.com (for automated dependency updates using AI), and even though a single metric alone isn't valuable, having some indicators and metrics is very useful when you have more than one repository. Being able to quickly see whether your repositories are getting better or worse over time helps you understand whether and when dependency updates should be prioritized in the first place. There are a few core metrics that I've built into the product (major vs. minor vs. patch ratio, security updates, etc.), and it's one of the most used features to date.

semanser

10 days ago

This is a hugely misleading indicator to rely on. Not only does it sum up all the time you are behind across dependencies, which is confusing to start with, but it also leaves the end user puzzled about what to do with this information.

Let's assume my software project is 120 "libyears" behind. What's next? What risks am I exposed to? What should I do?

Think of the notorious Python 2 vs. Python 3 split. It's 2019 and my software project has Python 2 as a dependency. My team has assessed that migrating to v3 will require another year of dealing with all the breaking changes. While brainstorming, we are thinking from a risk and cost-benefit perspective. Time per se is relevant only in the context of the effort required to perform the migration.

From the supply chain security standpoint, I could not care less about time either. If I am using library X at version 1.2.3 and it ticks all the boxes, has no performance impact, has 0 problems and 0 vulnerabilities (including the results from public, third-party, and internal code audits), I will continue using it even if version 2 is out, especially if upgrading requires a reassessment of risks and some code refactoring due to breaking API changes.

If I want to automate my dependency management I will rely on tools that will tell me about my risks or potential missed benefits from the newer versions. Time will be taken into account only in terms of time needed for mitigating the risks directly impacting my piece of software.

nrvn

11 days ago

> If I am using library X of version 1.2.3 and it ticks all the boxes, has no performance impact, has 0 problems, 0 vulnerabilities (including the results from public, third party and internal code audits) I will continue using it even if version 2 is out, especially if it requires reassessment of risks and some code refactoring due to breaking API changes.

What happens if the library that you're using is completely fine on its own (think React 18) but is a core cross-dependency for tons of other libraries in your project? No library or framework should be considered in isolation. Otherwise, you can end up in a situation where you can't use some other tools or libraries because of one dependency that is quite out of date.

semanser

10 days ago

It's also worth reading https://chaoss.community/kb/metric-libyears/. As noted elsewhere, regularly updated libraries, or a library that infrequently pushes out large breaking changes, won't be reflected well in libyear, but it's still worth having as a way to gauge out-of-date-ness.

I've been using that alongside some other metrics for providing insights into how behind teams are on updates

jamietanna

11 days ago

> I've been using that alongside some other metrics for providing insights into how behind teams are on updates

What are some other metrics that you are using? I am working on a product that is helping to keep dependencies up to date and would love to integrate some of these things in the product.

semanser

11 days ago

The explanation could be better, but I really like the idea.

It punishes you for not updating your dependencies and for having too many direct dependencies. But it doesn't punish you for indirect dependencies (that you have little control of), or libraries that are "done" (since it compares to the newest stable release, not the current year). A sensible balance.

Maybe one could write a browser extension to display the libyear of GitHub pages?

BoppreH

11 days ago

Can we also have libloc (metric based on #lines of code)?

amelius

11 days ago

Behold, you will see a single-line library soon! (Depends on the language support.)

lifthrasiir

11 days ago

Don't tell the corporate about it, but using charcount/80, excluding newlines and whitespace is the _improved_ pseudoscience.

Additionally, excluding 'imports', namespacing, and other boilerplate helps too.
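Tongue-in-cheek or not, the "improved" metric is easy to sketch (a toy implementation; the import/boilerplate exclusion is left out):

```python
def pseudo_sloc(source: str) -> float:
    """Toy 'improved SLOC': count non-whitespace characters and divide by 80,
    pretending every 80 meaningful characters make one line of real code."""
    meaningful = [c for c in source if not c.isspace()]
    return len(meaningful) / 80

code = "def add(a, b):\n    return a + b\n"
print(pseudo_sloc(code))  # well under one "line" of real code
```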

impulsivepuppet

11 days ago

Don't make me wave the Diffusion of Innovations chart at you all:

https://en.wikipedia.org/wiki/Diffusion_of_innovations

libyear is an opinionated metric that prioritises less well tested software. Meanwhile, companies pay a lot of money for RHEL and other products that promise a stable environment that freezes specific (major) releases of software for years - and also promises backports of any necessary security fixes, without those pesky new features and breaking changes that come with using bleeding-edge releases.

Different people, projects, organisations, all have different risk appetites. We need all of them working together; late adopters wouldn't have the stability they crave if early adopters didn't exist to test the crazy broken fresh software.

While everyone needs to manage dependencies, there's no one right way to do it, so everyone does it their own way. The only thing we can probably agree on is that doing _no_ maintenance on dependencies is a bad thing.

amiga386

11 days ago

> Without those pesky new features and breaking changes that come with using bleeding-edge releases.

This is a popular counterargument whenever people talk about keeping everything up to date. What people should consider, though, is keeping everything *relatively* up to date: not always on the latest version, but still not very far from the latest release.

GitHub, Stack Overflow, etc., are full of data about the potential issues of updating library X to version Y, and usually you find this out when it's too late: either you've hit an error in production, or you're in the middle of an update and discover that there are issues with the version you want to use.

Exploring these data points is still a pretty much untapped area, and this is something that I'm trying to explore with my product at depshub.com, which updates dependencies automatically in a smarter and more autonomous way.

I would be happy to see more people working in this area since it's clear that there is a problem that needs to be solved and unfortunately the current status quo is "while everyone needs to manage dependencies, there's no one right way to do it, so everyone does it their own way."

semanser

10 days ago

This is a fantastic way to encourage churn and ensure developer job longevity.

noobermin

11 days ago

A great metric for job security, not necessarily for better software. Unless it concerns security or correctness, I prefer an old but audited version of a library any day over a newer version that I have to audit again.

A metric like this will be loved by PMs and loathed by developers who have to leave a known, sane state, update and deal with the fallout later on.

jlg23

11 days ago

While granting that libyear is clear about what it measures, I still think it measures the wrong thing. What should we be measuring?

I have some ideas for my projects, but I don't have the answer for your project.

Semantic versioning ain't the only game in town for sure, and I'm not anchoring on it as the best or only way.

But I will say this: once you have figured out what is important to measure, build metrics for that. You almost certainly will need to factor in supply chain security, and probably some recency metrics for the hardware platforms you deploy to. This could look like a weighted score, perhaps. But it is unreasonable to expect libyear or semver to do that for you.

xpe

11 days ago

This assumes that code that just works incredibly well and hasn’t needed an update in years is inferior to frothy untested code or broken code constantly requiring bug fix PRs.

api

11 days ago

I agree with other comments that it's not a perfect measure but it's a solid step in the right direction from having no metrics at all.

The libs we're measuring up to could have their own libyears to upgrade, but we can only control what's in our hands.

Sometimes a small security patch is worth more than a major version bump of features, so I consider measuring the time instead of major versions a benefit.

tomaszsobota

11 days ago

I guess that having to rely on metrics to begin with means the battle is lost and you have no control over the code you are using.

Maybe we should stop boilerplating everything and write the actual code we need. For the most part, software uses only a tiny fraction of the capabilities of any given library.

Maybe before trying to limit our lag in updating unlimited levels of library dependencies, we should focus first on capping the number of dependency levels. For example, a project would use a maximum of two levels of library dependencies, and you would have to rewrite those that have too many.

The JavaScript ecosystem, for instance, is totally unmanageable as I see it. We just pretend we have a bit of control, but in reality nobody really knows what code is being executed, and this is sad.

prmoustache

11 days ago

That is better, I agree. I'd take a lean set of dependencies any day, but it becomes increasingly difficult the more velocity the project gains.

Suddenly less and less is considered core, and it's easier than ever to 'outsource' to external libs to save time. Or is it rather that the project gains more velocity because of that?

> We just pretend we have a bit of control but in reality nobody knows what code is executed really and this is sad.

True, and this is slowly starting to be the case with other languages too. With Python it can be so bad that even attempting to 'build' and run the same project a year later may well fail. Much like what I'm used to with JavaScript projects by now.

tomaszsobota

11 days ago

I’ve been wanting to make a similar indicator for the work I do at endoflife.date. IMO, a realistic upgradability metric would be linked to the “hardness of the upgrade to a supported version”, which is much harder to answer and quite contextual in how you are running each dependency.

But libyear is a good metric to have as prior art in the field.

captn3m0

11 days ago

If you're using libraries from 10 years ago, maybe it's because you used good tools.

derrida

11 days ago

I propose a different metric, version points: subtract the version number of the currently used library from the version number of the latest release. Translate to semver first if possible. Also, a release addressing a CVE adds an extra ten million points.
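A toy sketch of this scoring scheme; the per-component weights and the function name are my own assumptions, since the comment only specifies the subtraction and the CVE penalty:

```python
def version_points(current: tuple[int, int, int],
                   latest: tuple[int, int, int],
                   cve_releases: int = 0) -> int:
    """Toy 'version points': weighted semver distance, plus ten million
    points for every skipped release that addressed a CVE."""
    d_major = latest[0] - current[0]
    d_minor = latest[1] - current[1]
    d_patch = latest[2] - current[2]
    return d_major * 10_000 + d_minor * 100 + d_patch + cve_releases * 10_000_000

print(version_points((1, 2, 3), (2, 0, 0)))                  # one major behind
print(version_points((1, 2, 3), (1, 2, 4), cve_releases=1))  # one patch behind, but it fixes a CVE
```

With weights like these, a single skipped CVE fix dwarfs any amount of ordinary version lag, which is the point of the joke.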

thih9

11 days ago

This is such a good idea. Another one I would like to have is a single number representing legacy code. I'm not sure exactly what, but is there a way to quantify the code that people are writing workarounds for?

newswasboring

11 days ago

There are tools[0] that show you which users touched which files/modules and how many of those users (if any!) are still in the company.

Doesn't necessarily tell you which code is legacy – perhaps a function is just so solid that there was no reason to touch it in years. But I've found such analysis helpful, and it can give you warning signs about what knowledge is being lost in the team and which parts of your own codebase have become unknown territory.

[0] I know of CodeScene but suppose there are others

yreg

11 days ago

As a worthless junior dev, thank you for the post. I'm seeing a general sentiment of reluctance toward the introduction of yet another dubious metric in an environment where "software quality" has been hijacked to mean something else.

This led me to question how good it is to judge a project by its age + last commit (+ project size/complexity + funding/community), as this is what I do in practice. I agree that SemVer isn't really designed to be human-readable and is a rather meaningless/deceiving metric, given the divergent practices of different developers.

impulsivepuppet

11 days ago

Very rarely do I so vehemently disagree with a particular argument in software. This idea epitomizes much of what is wrong in the industry.

We can all agree security updates are essential, but a lot of libraries are “done” from a functional perspective for a majority of their existing use cases.

Yes, updates can be needed because interfaces between programs break, standards evolve in backward-incompatible ways, performance improvements can be made, etc. But many of the updates I see are changes for the sake of change.

You could use a 5 year old version of React for example, and modulo some set of security fixes if any, you could have a robust application.

Sometimes software is just done. We are better off for accepting that idea. Get us off the update hamster wheel and stop the enshittification.

sudo_bang_bang

11 days ago

I just added libyear support to ocicl for Common Lisp.

atgreen

9 days ago

The metric should be called libtime, and measured in pcyear.

bmacho

11 days ago

So, how many libyears was that xz dependency out of date?

Libyears are meaningless. A library either has known vulnerabilities or it doesn't. When it doesn't, old is often better than new.

thriftwy

11 days ago

Super idea :)

betimsl

11 days ago


nice metric!

nikolayasdf123

11 days ago