After getting scammed on Facebook Marketplace, I look at the profiles of sellers, particularly if they don’t have much in way of reviews. That seems more prudent than creepy to me. I’m not stalking anyone and I’m not looking to be their friend.
Is there a better way to do seller verification? It does seem like an information leak to me. Craigslist and eBay don’t share my identification as a potential buyer. I don’t love the marketplace being tied to a social network, but it’s what many people are using these days.
sure, showing up in suggested friends is weird. the way linkedin does it makes more sense: "these people have viewed your profile". what i was picking up on was hiding it outright. while that may be justified in your case, it's also reasonable to let people know.
the only people i would really not want to find out that i look at their profile are spammers and scammers (oh, and stalkers).
so both sides have a fair reason. so i guess, if you can, choose the social network that works the way you prefer.
sneaking up to someone's house and peeping in their windows is creepy. so is camping out in front of their window, even if you're doing it legally from the street.
but that person had to put their info into the website, themselves, by choice, and then chose to let their privacy settings be such that others can view them.
if you pin your photo up to a cork board, don't be surprised if people see it
but the reverse is true too. if you look someone up, don't be surprised if they find out. really, i don't see how that would be a big deal.
with more and more illegitimate tracking being done, informing those being tracked seems a benefit, not a drawback.
there is a difference however between one institution tracking who all the people are that i am looking at, vs the person i am looking at finding out for themselves who is looking at them.
what i understood is that "showing up on their suggested friends list is creepy, and it's an information leak". the way i read that is that they would prefer not to show when someone visited their profile. and that's what i consider creepy.
GPUs do not wear down from being run at 100%, unless they're pushed past their voltage limits or severely overheating.
You can buy a GPU that's been used to mine bitcoin for 5 years with zero downtime, and as long as it's been properly taken care of (or better, undervolted), that GPU functions the exact same as a 5 year old GPU in your PC. Probably even better.
GPUs are rated to do 100%, all the time. That's the point. Otherwise it'd be 115%.
No, you're fundamentally wrong. There's regular wear & tear on GPUs, which all have varying levels of quality, and you'll have blown capacitors (just as with any piece of hardware), but running in a datacenter does not damage them more. If anything, they're better taken care of and will last longer. The difference is scale: instead of one 5090 in a computer somewhere, you have a million of them, so a 1% failure rate quickly becomes a big number. My example mentioned bitcoin mining because, just like datacenters, those GPUs were running in massive farms of thousands of devices. We have the proof and the numbers: running at full load with proper cooling and no overvolting does not damage hardware.
The only reason they're "perishable" is because of the GPU arms race, where renewing them every 5 years is likely to be worth the investment for the gains you make in power efficiency.
Do you think Google has a pile of millions of older TPUs they threw out because they all failed, when chips are basically impossible to recycle ? No, they keep using them, they're serving your nanobanana prompts.
GPU bitcoin mining rigs had a high failure rate too. It was quite common to run them at 80% power to keep them going longer. That's before taking into account that the more recent generations of GPUs seem to be a lot more fragile in general.
Yeah, what's crazy is that most of these companies are making accounting choices that obscure the true cost, by extending the stated useful life of their equipment, in some cases from 3 years to 6. Perfectly legal, and it has the effect of suppressing depreciation expenses and inflating reported earnings.
That's literally GraphQL though? That's the whole idea. This is so weird: clearly the author is aware of GraphQL, but then just decided to reimplement half the backend? Why?
Problem: Claude Code 2.1.0 crashes with Invalid Version: 2.1.0 (2026-01-07) because the CHANGELOG.md format changed to include dates in version headers (e.g., ## 2.1.0 (2026-01-07)). The code parses these headers as object keys and tries to sort them using semver's .gt() function, which can't parse version strings with date suffixes.
Affected functions: W37, gw0, and an unnamed function around line 3091 that fetches recent release notes.
Fix: Wrap version strings with semver.coerce() before comparison. Run these 4 sed commands on cli.js:
CLI_JS="$HOME/.nvm/versions/node/$(node -v)/lib/node_modules/@anthropic-ai/claude-code/cli.js"
# Backup first
cp "$CLI_JS" "$CLI_JS.backup"
# Patch 1: Fix ve2.gt sort (recent release notes)
sed -i 's/Object\.keys(B)\.sort((Y,J)=>ve2\.gt(Y,J,{loose:!0})?-1:1)/Object.keys(B).sort((Y,J)=>ve2.gt(ve2.coerce(Y),ve2.coerce(J),{loose:!0})?-1:1)/g' "$CLI_JS"
# Patch 2: Fix gw0 sort
sed -i 's/sort((G,Z)=>Wt\.gt(G,Z,{loose:!0})?1:-1)/sort((G,Z)=>Wt.gt(Wt.coerce(G),Wt.coerce(Z),{loose:!0})?1:-1)/g' "$CLI_JS"
# Patch 3: Fix W37 filter
sed -i 's/filter((\[J\])=>!Y||Wt\.gt(J,Y,{loose:!0}))/filter(([J])=>!Y||Wt.gt(Wt.coerce(J),Y,{loose:!0}))/g' "$CLI_JS"
# Patch 4: Fix W37 sort
sed -i 's/sort((\[J\],\[X\])=>Wt\.gt(J,X,{loose:!0})?-1:1)/sort(([J],[X])=>Wt.gt(Wt.coerce(J),Wt.coerce(X),{loose:!0})?-1:1)/g' "$CLI_JS"
Note: If installed via a different method, adjust the CLI_JS path accordingly (e.g., /usr/lib/node_modules/@anthropic-ai/claude-code/cli.js).
It's a good idea. Obviously you can't preemptively OCR all images, but having a "context menu -> follow link" that works on QR codes and images with links in them seems totally doable to me.
I think that a separate program might be better, which could be used with any video source in any program, including a QR code made up from multiple pictures or from CSS, or a video, etc. Furthermore, it is not necessarily a URL.
However, it might also be made as a browser extension in case you do not want to use a separate program, or if you want to be able to follow such links directly without going through another program.
The datacenter OS doesn't have to be the same as the developer OS. At my work (of similar scale) the datacenters all run Linux, but very nearly all developers are on MacOS.
MacOSX is a popular choice for dev boxes (if I understand correctly, they provide some pretty good tooling for managing a fleet of machines; more expensive hardware than a Linux dev machine fleet, but less DIY for company-wide administration).
... but Google solves the "A Linux fleet requires investment to maintain" problem by investing. They maintain their own in-house distro.
Not really, it is just a well known outside distro plus internal CI servers to make sure that newly updated packages don't break things. Also some internal tools, of course.
Relative to what the rest of the world does, that is maintaining your own in-house distro.
It's downstream of Ubuntu (unless that's changed) but it's tweaked in the ways you've noted (trying to remember if they also maintain their own package mirrors or if they trust apt to fetch from public repositories; that's a detail I no longer recall).
Elon Musk?!