The moment GitHub moved from Rails to React is when the spark went out for me. The web experience doesn't feel nearly as good. Perhaps from the offices in SF it's indistinguishable from the old SSR architecture, but every interaction feels like a trip to a geostationary satellite when using it from TX.
I simply need my tools to not run like shit. There are things that you can no longer do with GitHub that you once could. For example, reviewing 500-file PRs might be an "anti-pattern" in the average HN commenter's mind, but there are a lot of real-world scenarios at actual businesses where this comes up, and there's literally nothing you can do about it other than have tools that don't suck at the job.
I've been working on a self-hosted GitHub alternative in my spare time. The goal is to have a lightweight public instance for discoverability and community over reasonably sized projects and to also offer a freeware download of the same for private and other self-hosting needs. The entire focus is on the simplest possible thing. Absolutely zero energy into AI, LLMs, client-side rendering circuses, etc. All that really matters to me is code/issues/pulls - primarily in B2B SaaS contexts. I'm just trying to run a small team and manage a codebase for 1-100 customers. I don't need to run an entire war and I certainly don't need my tools to entertain me with an endless stream of shiny distractions.
If you force-push a new initial commit, the previous commits are still available on GitHub if you know the commit hash, like GitHub does, right? At least in a PR you would see a clickable old commit hash.
Yes. It can also be shown in the Activity tab or accessed and scanned for secrets or personal information via the API. See https://news.ycombinator.com/item?id=44452623
Couldn't you ask GitHub to remove that information? Wouldn't laws like the GDPR allow you to ask for that?
In my experience, no, they won't help with GDPR takedowns. The only way to make things unavailable is actually to file a DMCA notice against any URL that you want hidden. This was actually the recommended approach from GitHub when I asked them about this. Absurd.
> The only way to make things unavailable is actually to file a DMCA notice
Is it costly to do?
> In my experience, no they won't help with GDPR takedowns
I would have expected that I could say "This is my code, hence this is my data, and I want you to remove my data from your website". I wonder how hard it is to file a complaint to the EU and see what happens.
I'm all for self-hosting like this or using less known services.
But what about discoverability? Is there any good search engine for code? These days I often find more interesting projects when searching code on GH rather than by name or description on repos.
> I'm all for self-hosting like this or using less known services.
I'm all for the idea of self-hosting in abstract, but in practice I don't want to spend my life looking after services.
I'm all for the idea of growing your own rice, but in practice I'd rather pay someone else to do it for me.
I'm not trolling, I'm just emphasizing a super super important aspect which HNers tend to minimize, which is that self-hosting is work and even people who are technically able to do it might not want to.
It's been about three years since I put the upgrade commands into a cron job and stopped looking after the server. It hasn't broken so far and is up to date. I think it only works with dumb and simple software stacks, though.
I suppose that the TFA author is lowering their load by not having to respond to inane issues, which is also work.
Otherwise, yes, non-essential things are best delegated, especially when it's free. Some essential things also have to be delegated, if doing them yourself takes too much (like growing rice). The latter works best when the essential thing is a commodity with a wide choice of suppliers (like rice).
Past the initial setup, really there's not much to look after outside of the odd issue here and there which can usually just be a "go restart container" solution.
In the age of Docker & co., people really overestimate how much effort it is to self-host a service. It's probably the generation that never ventured outside of AWS managed services.
In practice, if you do not require 99.99% uptime SLAs, all you do is connect every 6 months to do a docker pull and update the image.
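That routine can even be a single cron entry. A hypothetical sketch for a compose-managed forge (the path, schedule, and service layout are all assumptions):

```
# m h dom mon dow  command
0 4 1 */6 *  cd /srv/forgejo && docker compose pull && docker compose up -d
```

This pulls the latest image and recreates the container at 04:00 on the first of every sixth month, roughly the cadence described above.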
You can set up mirroring to a GitHub repo, with a repo description like "This is a read-only mirror of <other non-github repo>". People find the project on GitHub and from there are directed to your self-hosted Gitea/GitLab etc.
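At its simplest, the mirroring is just a `git push --mirror` run from a cron job or post-receive hook. A minimal sketch, with a local bare repo standing in for the GitHub remote (for real use the URL would be something like git@github.com:you/project.git):

```shell
# Stand-in for the empty mirror repo you'd create on GitHub:
git init --bare /tmp/github-stand-in.git

# Your canonical, self-hosted repository:
mkdir -p /tmp/canonical
cd /tmp/canonical
git init -q
git -c user.name=Me -c user.email=me@example.com commit --allow-empty -m "initial"

# Mirror every ref; in practice this last line goes in cron or a post-receive hook.
git remote add github /tmp/github-stand-in.git
git push --mirror github
```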
There's a project idea.
One issue that comes with leaving GitHub is a higher barrier to contributing. The author appears to see this as a nice filter, but it may not make sense for you. With a self-hosted forge, a new contributor will need to:
a) Sign up for an account on your forge: Do contributors really want another account? Does your captcha/email verification actually work (I've encountered ones that don't)? There are also forges that require you to ask for an account, which is another hurdle.
b) Send an email: Configuring `git send-email` is alien to many contributors and may not even be doable in some corporate environments (OAuth2 with no app passwords allowed). Diverging from this is error-prone and against social norms which the contributor may not even be aware of (until they get flamed in the mailing list). You are also giving up automated CI which is a big part of the contributor feedback loop.
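For reference, the "incantation" in question is a one-time SMTP configuration. A hedged sketch — the host, port, and user below are placeholders for your provider's values, and it's written to a throwaway file here (drop `--file` to write your real ~/.gitconfig):

```shell
CFG=/tmp/sendemail-demo.gitconfig
git config --file "$CFG" sendemail.smtpServer     smtp.example.com
git config --file "$CFG" sendemail.smtpServerPort 587
git config --file "$CFG" sendemail.smtpEncryption tls
git config --file "$CFG" sendemail.smtpUser       you@example.com

# A patch series for the last three commits would then go out with:
# git send-email --to=devel@lists.example.com HEAD~3..HEAD
```

Simple enough once done, but it is one more thing standing between a drive-by contributor and their first patch.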
To be clear, going independent does indeed work for small personal projects (do not care much about contributions) as well as established ones (large incentive for new contributors to jump over hoops), and I'm fully aware that a lot of HNers do not see the need for those "niceties" provided by GitHub. But I feel that people often underestimate the barriers that they are putting up.
I believe the slightly higher barrier is a feature, and a good filter for low quality spam.
On the other hand, if I've spent the time and effort to write a patch for public release, I have no issue jumping through hurdles to see it published, whether that means creating an account or learning the correct incantation for git send-email. Usually the thing that stops a contribution is finding the time and will to prepare a PR for review; compared to that effort, creating an account is trivial.
The way I see it, using a distributed VCS like git benefits from having a distributed ecosystem. Putting everything in Microsoft's hands for them to train their commercial AI product on your code is a little reckless and short-sighted. And we could do with fewer silos.
It's also not a wasted skill. Once you've learned how to `git send-email` for the sake of one project, you're now prepared to do the same for others.
I have had my ebbs and flows with GitHub. I think one of the difficult things about moving away from it is the loss of network effects for open source projects. If you have no public repos or don't care about contributions or discovery, this doesn't matter.
I know the author is accepting patches, but a would-be contributor may never even learn that the forge exists if their search for projects starts with a GitHub search.
I personally agree with the author that worthwhile contributors will not be stopped by the fact that the project is not on GitHub.
Why would you think worthwhile contributors are immune to being deterred by all the extra barriers?
Just my opinion based on my experience. "All the extra barriers" in this case being "learning how to send a patch over email".
I would argue that fewer of those "worthwhile contributors" will find the software to begin with when you reduce discoverability.
Depending on the use case and who is using it, the easiest option for self-hosting git is a `git init --bare` command on a server you can SSH into.
Of course it has no interface, etc., but it works for pushing and sharing commits, and there is nothing to maintain other than SSH credentials.
That's like going back into a cave to protest modern housing development
> This is not a technical, but a social problem.
Indeed, and you've offered no solution by erecting a lot of social barriers
Note that it's possible to self-host SourceHut and Forgejo.
What's unique to GitHub is not code hosting, UI or CI. Rather it is GitHub stars.
People trust a project with 5K stars on GitHub.com more than a self-hosted project.
VCs fund startups based on the star history.
Big companies decide which FOSS projects to use based on the star history.
Good thing you can buy stars online!
Why do people trust these useless metrics? Ah yes because reading the code and deciding if it's any good is difficult.
> Ah yes because reading the code and deciding if it's any good is difficult.
No one has infinite time in the universe to read the code of all alternatives before deciding which one is best for their use case.
GitHub stars are a filtering mechanism.
Most engineers, when given 5 projects with the following star count - 5K, 2K, 500, 200, 100, will only evaluate the code of the first two projects.
> Most engineers, when given 5 projects with the following star count - 5K, 2K, 500, 200, 100, will only evaluate the code of the first two projects.
Then most engineers are not doing their job properly, if you ask me.
It is incredibly difficult to assess the quality of projects in depth.
I tried doing this for Open-VSX extensions for handling justfile:
https://open-vsx.org/?search=justfile&sortBy=relevance&sortO...
Go by number of downloads? skellock's extension wasn't updated in 5 years, is marked archived on GitHub, and doesn't accept any fixes (and there's a handful of reported problems that could use fixing). So while its README suggests that it once had a lot of effort put into it, it's not the choice that will grow with you.

kokakiwi's has the next-most downloads, but its website and git repository are self-hosted on a site that has been gone since December 2020, so also 5 years of staleness. I suppose that's another way to archive your extension.
nefrob's has fewer downloads yet, but got a single 5-star review (Open-VSX doesn't have a lot of active users), the GitHub repo was updated 3 months ago, and it seems alive. Also, the parser itself seems simplistic and admits to not getting things perfect.
wolfmah's has been inactive for more than one year, and it contains a single commit.
It was even less obvious with Typst: There are 12 results for the keyword "typst", and the leading extension didn't have many downloads at all -- I can see now that it's #2 for downloads.
Altogether, downloads / stars / reviews are a great way to get an honest answer in many cases when there aren't commercial interests at play, since there are fewer incentives to game metrics.
But… there are commercial interests at play.
The first 2 projects are most likely to be VC backed and do a rug pull in the next 5 years.
> The first 2 projects are most likely to be VC backed and do a rug pull in the next 5 years.
The first two are more likely to get VC backing. Rug pull is a false assumption.
100% guaranteed when VCs are involved that something nasty will happen to the project.
edit: lol you're an angel investor and I guess you feel attacked. I stand by what I said.
> 100% guaranteed when VCs are involved that something nasty will happen to the project.
I presume you don't use any VC-backed projects like Google, Apple, Amazon, Facebook, etc.
Those are the absolute worst examples you could provide. Every single one of those companies has been in the headlines for how demonstrably worse they have gotten over the years, even being hauled before Congress in some cases.
You have proven your adversary's point exactly.
I mean, if the opposition is GitHub training on your stuff - or any AI, for that matter - self-hosting doesn't save you. This feels like a classic "tech thinking it can solve a politics battle". If you're making a self-determined ethical decision then I get it though.
I also don't get why you wouldn't go for something like Forgejo/Gitea, or if you're really dead set on the email pattern like the author seems to, Sourcehut.
I think the sourcehut guy is not the kind of person I would rely on. I followed his blog for a while and now I've stopped.
I moved my personal stuff to codeberg. There's also the advantage that fewer people have accounts so I get fewer pull requests that change thousands of lines and fail every single test.
Of course I could be wrong, predicting the future is difficult.
The codeberg.page domain, which projects can use to publish their documentation, ended up on some spam list, so to link to ltworf.codeberg.page in my email signature I must link to ltworf.github.io, which then redirects to the real one.
Imagine the noise if github ended up on a spam list.
I know this is going to sound silly, but how does one set up their own git server?
https://git-scm.com/book/en/v2/Git-on-the-Server-Setting-Up-...
It's far simpler than you may think. (At least if you already own a VPS or a dedicated server.)
Basically: (1) add a git user, (2) configure .ssh/authorized_keys, and that's it!
To add a bit more security, you may also want to edit /etc/shells to add git-shell as a login shell, then change the git user to use that shell. That's entirely optional (and it actually makes it less trivial to add extra repos, so for single-user repos you may want to delay doing that).
In all, the entire set of official instructions can be printed on like a single double-sided Letter / A4 piece of paper.
P.S. Actually, I guess it's 3 single-sided pages, but that's only because you'd be printing the example SSH key twice, which takes a quarter of an A4 each time, thus, the entire official instructions take almost 3 single-sided pages total.
Private repos: git init --bare in your homedir on the server, and then push/pull using user@hostname:path/from/homedir/to/repo.git as remote url.
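Sketched out (simulated locally here with a plain path; over SSH the clone URL would be the user@hostname:projects/app.git form described above):

```shell
# "Server" side: the whole server is one bare repository.
mkdir -p /tmp/home-git/projects
git init --bare /tmp/home-git/projects/app.git

# "Client" side: clone, commit, push.
git clone /tmp/home-git/projects/app.git /tmp/app-checkout
cd /tmp/app-checkout
git -c user.name=Me -c user.email=me@example.com commit --allow-empty -m "first commit"
git push origin HEAD
```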
Optionally, public repos: Set up git-daemon in some /var directory and do the git init dance there. Also touch the git-daemon-export-ok file to whitelist each repo for public serving. chown the repository directory to your own user so you can write there freely, while git-daemon can read and serve the contents to the world.
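The public-serving setup above, sketched out (/tmp/git-public stands in for e.g. /var/lib/git, and the daemon itself is left commented out since it runs as a long-lived service):

```shell
mkdir -p /tmp/git-public
git init --bare /tmp/git-public/app.git

# Only repos containing this marker file are served anonymously:
touch /tmp/git-public/app.git/git-daemon-export-ok

# Keep write access for your own user while the daemon only reads:
# chown -R you:you /tmp/git-public/app.git

# Run the daemon (under systemd or a supervisor in practice):
# git daemon --base-path=/tmp/git-public --reuseaddr
# Clients would then clone with: git clone git://yourhost/app.git
```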
git works over ssh. If you have an account you can ssh into, you have a git server. That's literally all that the git@(whatever) you set your remotes to is under the hood.
Everything is just dumping UI over a file system and executing command line stuff.
If you want that too, take a look at Forgejo or GitLab, both of which I use for self-hosting. The setup is similar to setting up any other server that requires user accounts, a database, and exposed ports.
As a lightweight solution (no web UI), there's Charm's Soft Serve[1]. Their product byline: "A tasty, self-hostable Git server for the command line."
[1] https://github.com/charmbracelet/soft-serve
I've been running rgit, which is a nicer frontend than cgit. If you really want SSH, you can either use a full-fledged server like Gitea/Gogs/Forgejo etc. or run something on top of your existing SSH daemon like gitolite, which manages authorized_keys for you.
This is also a good reminder that you technically don’t need a “server”. If you and your friend want to collaborate, you could in principle pull from their computer (and vice versa), if you have (ssh) access.
brew install git (guessing, not a Mac user)
sudo apt-get install git
sudo dnf install git
(Etc)
Re macOS: if you have Xcode w/ command line tools installed, it will already have git installed
What's the server included with those packages? I thought those were just the command line utilities.
True. Git works over ssh natively. You set up a repo in your homedir using `git init --bare`, say in a repo subdir. Then `git clone user@server:subdir` will clone that repo on another machine and allow push and pull over ssh.
In other words, the server is sshd.
Enjoy!
A standard git package includes gitweb, a web interface for your repos. For a more featureful web frontend, you have choices such as Forgejo. As others have said, you don't need a server; ssh access is all that's needed.
whispers "Psst, it's the same!"
I beg to differ, as those by themselves will not give you a server as was requested. The examples upthread all include additional software, such as sshd, gitea, etc., for a reason. Necessary != sufficient.
My question was posed because I believe the poster did not read the original request carefully.
You don't need gitea for git and every Linux server has ssh, this doesn't make any sense.
If you really think you can claim that "every machine with git installed has an ssh server enabled" (and is somehow Linux to boot), and you have convinced yourself that makes sense, I don't know what to tell you other than that your experience is shockingly narrow.
I am specifically talking about Linux, did you misread?
My experience is managing over a hundred thousand VMs at once across multiple countries, and I have managed large fleets of tens or hundreds of thousands for decades. This is old hat.
My desktop runs Linux. My laptop runs Linux. My phone runs Linux. The phone is the only one without a running SSH server. I'm pretty sure the openssh server is also a default package in Fedora Workstation; it's just got the service disabled by default unless you flip a checkbox in the GNOME settings.
Windows Server also supports SSH, and has several shell options.
I couldn't answer for any Apple device.
I guess my experience is narrow, though.
Hey, how many git servers do you know not running Linux? That's an interesting question!
Edit: Curious what you think both GitLab and GitHub run on, and if those have ssh enabled or not.
Easy mode is: deploy Gitea.
Posts like "I'm leaving Facebook forever!", "Netflix has ads now!", or "Twitter is ruined!" always come off as angsty and self-important.
This one's no different. I didn't even know Copilot was the issue until halfway through -- by then, I'd stopped caring.
If Copilot is your dealbreaker, fine. But don't bury the lead. Just say what you're switching to and why it's better. That's the useful part.
Bury the lede is the correct phrase. Otherwise I think you are spot on.
Not noticing the spelling issue at first, I read your comment as "You also buried the lede. Otherwise I think you are spot on".
Interesting. Seems both are now acceptable.
https://www.merriam-webster.com/wordplay/bury-the-lede-versu...
The etymology is:
> A deliberate misspelling of lead, originally used in instructions given to printers to indicate which paragraphs constitute the lede, intended to avoid confusion with the word lead which may actually appear in the text of an article. Compare dek (“subhead”) (modified from deck) and hed (“headline”) (from head).
Further:
> In 1990, the American author and journalist William Safire (1929–2009) was still able to say: “You will not find this spelling in dictionaries; it is still an insiders' variant, steadily growing in frequency of use. […] Will lede break out of its insider status and find its way into general use? […] To suggest this is becoming standard would be misledeing […] But it has earned its place as a variant spelling, soon to overtake the original spelling for the beginning of a news article."