

Somewhere else this was posted, the author stated they wrote it themselves but took grammar corrections from Grammarly. That probably created the AI vibes.
They already got a lot of flak for it there.


The secret ingredient that makes chocolate taste good is the tears of the slave kids harvesting it. /s


Curl has a limited buffer, and bash reads a line and executes it before reading the next line.
So first you need a command that takes time when executed: a delay, downloading a big file, or waiting for user input all work. Next you fill up the buffer with your normal script, maybe some comments, etc.
Now the server can detect whether the stream stalls after the first few kB.
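Roughly how the server side of that trick could look, as a toy sketch (the port, padding size and the 2-second threshold are made up for illustration; this is not any specific project's code):

```python
# Toy sketch: guess whether the client pipes the response into a shell
# by measuring how long sendall() blocks. A shell that hits the early
# `sleep` stops reading, TCP backpressure builds up, and the send stalls;
# a plain download drains the padding almost immediately.
import socket
import time

SCRIPT = b"#!/bin/sh\nsleep 5\n" + b"# padding to fill curl, pipe and socket buffers\n" * 100000

def serve_once(port=8080):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", port))
    srv.listen(1)
    conn, _ = srv.accept()
    conn.recv(4096)  # read and ignore the HTTP request
    conn.sendall(b"HTTP/1.0 200 OK\r\nContent-Type: text/plain\r\n\r\n")

    start = time.monotonic()
    conn.sendall(SCRIPT)                      # blocks while the shell is busy sleeping
    stalled = time.monotonic() - start > 2.0  # crude threshold, illustration only

    if stalled:
        conn.sendall(b"echo tail sent only to piped shells\n")
    else:
        conn.sendall(b"echo tail sent only to plain downloads\n")
    conn.close()
    srv.close()

serve_once()
```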


It’s even worse. The server can detect if you are piping it straight into a shell or just downloading the file. It can then send different scripts based on that.


Most people expect a domain to work without adding 8080 as a port number in the URL. Hell, I’d say a majority don’t even know that it’s possible.
That’s not going to happen with the current tech. Maybe something comparable will come out with those specs.
There is also METAFONT from the TeX family, whose version number slowly converges to e.
As a specific example: it is used in control loops to accurately describe your system. Once you have an accurate description, it becomes trivial to design the PID controller that manages it. Going from open loop to closed loop is as simple as adding a +1 in your equation, for example.
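A quick sketch of where both e and that +1 show up, using the standard unity-feedback textbook form (which may not be exactly the setup meant here):

```latex
% First-order plant: its step response is an exponential in e.
G(s) = \frac{K}{\tau s + 1}
\quad\Longrightarrow\quad
y(t) = K\left(1 - e^{-t/\tau}\right)

% Open loop vs. closed loop with unity feedback:
% the only difference is the "+1" in the denominator.
L(s) = C(s)\,G(s), \qquad
T_{\text{closed}}(s) = \frac{C(s)\,G(s)}{1 + C(s)\,G(s)}
```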


Consider using Caddy. It is much simpler to set up, and all the required headers get set automatically.
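For reference, a minimal Caddyfile for fronting a local service looks something like this (the domain and backend port are placeholders); Caddy takes care of TLS and the usual proxy headers such as X-Forwarded-For on its own:

```
# Hypothetical Caddyfile; replace the domain and backend port with your own.
example.com {
    reverse_proxy localhost:8080
}
```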
The GPL isn’t “you must contribute your changes back”.
It just grants a user, and only a user, the right to get a copy of the source that is buildable.
A diff tree or forced upstreaming is not required. You don’t even have to provide a copy to the original author unless they are a user!
There is always the option of Waydroid to get Android apps running on Linux. It’s not a great solution, but it works as a first stopgap measure to use services only available as apps.


Oh, around that time he was also part of a German party assembly. You know, the party whose top politicians say things like “The Nazis weren’t that bad” or “The problem with the Holocaust was that it didn’t finish.” That was probably an accident and a misunderstanding too.
https://www.tagesschau.de/inland/bundestagswahl/parteien/musk-afd-wahlkampfshow-100.html


In a project I’m in, there are 20 commits just labeled .. The only reason I haven’t slapped them silly is that they left before I started.


A 3D map, for example?


No. Rate limiting doesn’t work because they use huge IP ranges to crawl. Each individual IP doesn’t look bad; they just use several thousand of them.
Using the API would require some basic changes on their side. We don’t do that here. If they wanted that, they could run their own instance and would even get notified about changes. No crawling required at all.
You might want to open issues for the things you found.


Probably because the creator had a blog post that got shared around at a point in time when this exact problem was resonating with users.
It’s not always about being first but about marketing.
Git was made for the Linux kernel, and the kernel is pretty much only text files. For the complete decentralisation git achieves, easy diffing and merging operations need to be defined. It works well for what it was made for.
Large files don’t work well with git, as it always stores the whole history on your drive.
For files that are large and not mergeable, SVN works better, and that is fine. You need constant online connectivity as a trade-off though.
Some build tools offer the option to define a dependency as a git path plus commit, or as a local path. That works quite well but is in the end just a workaround.
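pip is one example of such a tool; it accepts a dependency pinned to a git commit, or taken from a local path, directly in a requirements file (the package name, URL and commit hash below are placeholders):

```
# requirements.txt sketch; all names here are hypothetical.
somepkg @ git+https://example.com/someorg/somepkg.git@0123456789abcdef0123456789abcdef01234567
# Or the same dependency from a local checkout instead:
# somepkg @ file:///home/me/src/somepkg
```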


It’s also an option to ensure everyone has the same dev environment.
No TMR.