Hacker News | deepsun's comments

And not a single mention of GitLab. I remember it was a pretty serious contender, sometimes the leader; strange that the author doesn't mention it.

Maybe it's better to pull that dependency's source into your action altogether?

Better to treat it as a dependency still, but audit each new commit/release as it comes in, and pin to the exact last commit id that you verified.

I hadn't previously considered vendoring GHA dependencies, but yes, that might be a good idea. Perhaps not in all circumstances, but for anything that might be at risk of supply-chain compromise, the same arguments that apply to NPM apply to GHA.
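As a sketch of the pinning approach described above (the action name, SHA, and input here are made up for illustration; only the pattern matters):

```yaml
# Hypothetical workflow step: pin to the full commit SHA you audited,
# not a mutable tag like @v4. The trailing comment records the tag
# for human readers.
- uses: some-org/some-action@3df4ab11eba7bda6032a0b82a6bb43b11571feac # v4.0.2
  with:
    some-input: value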

A "follow the law" clause in contracts, IMO, is there so that one party can claim a "breach of contract" against the other.

No, why? Israel would celebrate.

But NONE of the Arab countries really want to help the people of Gaza.


> No, why? Israel would celebrate.

This is directly contradicted by Israel's actions in the Gaza War. Egyptian control of the crossing was not enough, so they took it. https://www.bloomberg.com/news/articles/2024-05-07/israel-ra...


Israel would object to aid and weapons flows into Gaza. It would be fine with Gazans leaving the Strip. The problem is there are currently zero takers globally for a significant Palestinian refugee population, in part, as other comments have mentioned, due to the history of Palestinian refugee populations in the Middle East. (To my knowledge, Palestinian Americans have been fine and productive members of society.)

> To my knowledge, Palestinian Americans have been fine and productive members of society

With a few notable exceptions... A Palestinian-American murdered Bobby Kennedy for being too supportive of Israel.

https://en.wikipedia.org/wiki/Sirhan_Sirhan


If Egypt opened the border, it would mean weapons and bombs flowing from Egypt into Gaza.

That's not something Israel would be excited about.


More like refugees flowing out, which Egypt doesn't want to deal with.

The Palestinians didn't help their cause with Yasser Arafat's Black September uprising in Jordan. Then they topped that off with strong support for Saddam when he invaded Kuwait; the Palestinians in Kuwait were literally betraying Kuwaitis to the Iraqi troops.

Oh, and did I forget Lebanon? They literally fomented the civil war.


I mean "open the border" to allow Gazans to leave for Egypt. But neither Egypt nor any other Arab country is accepting refugees from Gaza.

-- Moshe, why do you keep reading anti-Semitic papers?

-- I just like to hear how powerful and clever we are.

Defense of Israel was the primary justification offered in a recent State Department memo asserting the legal basis for the war with Iran. Unusually, its publication was not announced on social media or to the press, unlike most State Department official pronouncements. Anyway, rather than being opinion, this is (for the present) the official position of the United States government.

https://www.state.gov/releases/office-of-the-legal-adviser/2...


> Defense of Israel was the primary justification offered in a recent State department memo asserting the legal basis for the war with Iran.

It's funnier than that. The justification is "self-defense of its [the USA's] Israeli ally".


No, that's a common myth that a random sequence will eventually see every possible subsequence.

It's easy to construct an infinite, non-repeating sequence of just 1s and 0s that never contains a single 2 or 3.


But that’s not what’s happening here - the model here is that every piece has a uniform probability of being selected next, in which case every possible subsequence appears in the limit.

Yes, but "in the limit" does not mean "will happen".

E.g. it might be that there is never an "I" piece at all, even in an infinite random sequence. Yes, the probability of that happening is exactly zero, but it can still happen.

For example, if we select a number uniformly at random between 0.0 and 1.0, the probability of selecting that particular number is exactly zero. But we still selected it.


While a sequence where one possible subsequence never appears has probability zero in the limit, it’s still a possible random outcome. Incidentally, every concrete infinite sequence has probability zero.
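The "probability zero in the limit, but still possible" distinction can be made concrete with a quick sketch, assuming the simplified model discussed above (7 tetromino types drawn uniformly and independently, which is not how real Tetris randomizers actually work):

```python
# Probability that a uniform i.i.d. stream of the 7 tetromino types
# contains no "I" piece in its first n draws: (6/7)**n.
# Positive for every finite n, but tending to 0 as n grows -- so the
# event "never see an I" has probability zero in the limit without
# ever being literally impossible at any finite prefix.

def p_no_I(n: int) -> float:
    return (6 / 7) ** n

for n in (10, 100, 1000):
    print(n, p_no_I(n))

# Sanity checks: strictly positive, strictly decreasing.
assert p_no_I(1000) > 0
assert p_no_I(1000) < p_no_I(100) < p_no_I(10)
```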

> We don’t know all of the details of human biology.

A tautological statement: "we don't know all the details of X" is true for any X.


That is not a tautology. But you're right it's trivially true.

> In logic and mathematics, it is a statement that is true in every possible interpretation or situation.

... by definition

Tautology day is today because today is tautology day.

... not by "obviousness"

I can't check every stone for moss. We can't know every detail about anything.


Is it also possible to power a laptop through those adapters? PoE++ can deliver up to 100W of power, more than enough for most laptops.

Theoretically yes, practically that hasn't been built yet. I've only seen it for 2.5Gbase-T, and only for 802.3bt Type 3 (51W).

If anyone's aware of something better, I'd be interested too :)

(Then again I wouldn't voluntarily use 5Gb-T or 10Gb-T anyway, and ≈50W is enough for most use cases.)

[ed.: https://www.aliexpress.us/item/3256807960919319.html ("2.5GPD2CBT-20V" variant) - actually 2.5G not 1G as I wrote initially]


Eh.

A lot of laptops won't accept less than 60W.

My work laptop won't accept less than 90W (a modern HP, Core Ultra 7 155H with a low-end discrete GPU).

At first everyone at the office just assumed that the USB-C port wasn't able to charge the PC.


I gotta say, I love my macbooks. Every Apple laptop I've owned that has USB-C ports will happily charge itself from a 5V/1.5A wall charger (albeit extremely slowly).

That hasn’t been my experience. I once tried to charge an M3 MBP via a lower powered wall plug. It was left off over night and the following morning the battery was still at 1%.

Note:

Some devices expect USB-A on the charger side instead of USB-C.

USB-A pumps out 5V/1A (5W) regardless of what's connected to it, then negotiates higher power if available.

A USB C-to-C connection does not supply any power if the receiving device is not able to negotiate it.


My work has a little power strip with a usb-c and usb-a jack on it at every desk. I can charge my phone and iPad just fine with a USB-C cable into the USB-C port, but when I plugged my MacBook Air into it, it says “not charging.” Going into the system information tool I can see it’s only running at 10W. So apparently 10W is not enough to charge, but it’s still at least keeping the battery from draining.

A 20w charger will definitely charge the MacBook, just slowly.


This was a decent USB plug from Anker. I regularly use it to charge things like iPhones and tablets. I knew it wouldn’t supply enough power to run the MBP but thought it should trickle charge the device over night. But it didn’t.

I can’t recall which cable I used though. The cable might have been garbage, but I’m pretty sure I threw out all the older USB cables so they wouldn’t get mixed in with more modern, fully capable ones.


What did it start at?


They probably require higher voltages, but I haven't seen one myself. I usually just charge my laptop with my phone charger; what is it, 18 watts? Don't care, it charges my laptop and the phone that is plugged into it overnight. Why charge at faster speeds when there is no need to?

The laptop charges fine on regular 5V as well.


My Thinkpad T490 will happily take any power provided voltage is high enough (15V+).

The issue might not be the wattage but rather the minimum voltage. (Some?) Macs seem to charge at 15V already; most laptops need 20V.

Coincidentally, the USB-C spec is written such that wattage implies a minimum set of supported voltages:

* ≤15W charger: must have 5V

* ≤27W charger: must have 5V & 9V

* ≤45W charger: must have 5V & 9V & 15V

* (OT but worth noting: >60W: requires "chipped" cable.)

* ≤100W charger: must have 5V & 9V & 15V & 20V

(levels above this are starting to become relevant for the new 240W stuff)

(36W/12V doesn't exist anymore in PD 3.0. There seems to be a pattern with 140W @ 28V now, and then 240W at 48V, I haven't checked what's actually in the specs now for those, vs. what's just "herd agreement".)

Some devices are built to only charge from 20V, which means you need to buy a 45.000001W (scnr) charger to be sure it'll charge. If I remember correctly, requiring a minimum wattage to charge is permitted by the standard, so if the device requires a 46W charger it can assume it'll get 15V. Not sure about what exactly the spec says there, though.

(Of course the chargers may support higher voltages at lower power, but that'd cost money to build so they pretty much don't.)

NB: the lower voltages are all mandatory to support for higher powered chargers to be spec compliant. Some that don't do that exist — they're not spec compliant.
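The tier list above can be sketched as a small lookup; this is a simplified reading of that list, not a full implementation of the PD spec's fixed-supply rules:

```python
# Mandatory voltage rails per USB PD 3.0 charger rating, per the
# tiers summarized above (simplified; the actual PD spec has more
# nuance around current levels and optional rails).

def mandatory_voltages(watts: float) -> list[int]:
    rails = [5]            # 5V is always mandatory
    if watts > 15:
        rails.append(9)    # above 15W: 9V required
    if watts > 27:
        rails.append(15)   # above 27W: 15V required
    if watts > 45:
        rails.append(20)   # above 45W: 20V required
    return rails

print(mandatory_voltages(20))   # -> [5, 9]
print(mandatory_voltages(65))   # -> [5, 9, 15, 20]
```

This also captures the "45.000001W" joke below: any rating strictly above 45W pulls in the 20V rail.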


It's a 3A supply up to 60W (20V x 3A); the 100W level gets upped to 5A at 20V, and the newer EPR levels keep 5A at higher voltages.

Variable-voltage power supplies are usually capped by current, not power. That's because many of the components set maximum current and voltage limits that you must obey independently.

At higher voltages people start accepting higher losses in things like cables, because fire safety becomes a more important concern than efficiency. So the standard relaxes things a little bit.


You're correct but it's irrelevant. My point was that these requirements are in the standard and if you want to put the USB logo on a power brick you need to meet them. And the consumer is intended to be able to rely on them - which was & still is a pretty good idea considering the USB-C cable carnage.

I wish they did something like this for USB-C cables, but it's probably too late.


My laptop has

    $ upower -i $(upower -e | grep BAT)
    [...]
        voltage-min-design:  11.58 V
And I can charge it via USB-C using a 22.5W powerbank @ 12V (HP EliteBook 845 G10.)

I guess that would be out of spec then?

edit: nvm I didn't see the qualifier 'minimum'


  voltage-min-design:  11.58 V
This has nothing to do with USB-C, this is the minimum design voltage of your lithium ion battery pack. In this case, you have a 4-cell pack, and if the cells drop below 2.895V that means they're physically f*cked and HP would like to sell you a new battery. (Sometimes that can be fixed by trickle charging, depending on how badly f*cked the battery is.)

If your laptop's USB-C circuitry were built for it, you could charge it from 5V. (Slowly, of course.) It's not even that much of a stretch given laptops are built with "NVDC"¹ power systems, and any charger input goes into a buck-boost voltage regulator anyway.

¹ google "NVDC power", e.g. https://www.monolithicpower.com/en/learning/resources/batter... (scroll down to it)


Thanks for the write up, I didn't know that.

A Mac mini at home used 4.64W averaged over the last 30 days. Even under load it just sips power.

It can draw a lot more under load? https://support.apple.com/en-gb/103253

I’m sure it can, but even that could be supplied by PoE++, I think?

Mine very rarely exceeds 10W.


Great. So we got EU laws to mandate USB-C chargers, and then we get manufacturers that flout the spirit of the law by rejecting lower wattages.

My laptop refuses to charge from 45W chargers as well, but I can almost understand it.

When plugged into 100W chargers while powered on, it takes ten minutes to gain a single percentage point. Idle in power save may let me charge the thing in a few hours. If I start playing video, the battery slowly drains.

If your laptop is part space heater, like most laptops with Nvidia GPUs in them seem to be, using a low power adapter like that is pretty useless.

Also, 100W chargers are what, 25 euros these days? An OEM charger costs about 120 so the USB-C plan still works out.

Other manufacturers do similar things. Apple accepts lower-wattage chargers (because that's what they sell themselves), but they ignore two power negotiation standards and only support the very latest, which isn't in many affordable chargers, limiting the fast-charge capability for third parties.


Which laptop is that? My Razer with 5070 will take 45W chargers just fine, so do the ThinkPads, my work 16" MacBook and previous Asus Zephyrus with 4070.

I was on a trip a few years ago and had only brought my “compact” 45w usb-c charger since the brick that came with my work ThinkPad (one of the high end 16” screen models, maybe p16?) was enormous. When I plugged it in Windows complained that the charger was insufficient to charge the laptop. I think it at least kept it from draining the battery though. I had to run to Walmart and get a 65w charger which did the job fine.

The idea is that you can use chargers that you have lying around. In an emergency I charged my MacBook Pro with an old 5 or 10W adapter overnight while shut down. I don’t see the reason for flat out refusing a charge. Especially when turned off.

Most laptops will take 45W. There might be some workstations that don't, but even gaming stuff with 5080s will charge on 45W.

The idea of a POE Mac mini makes me happy. It would be a nice way of power cycling it from the switch, tidier than the smart plug I have.

https://hackaday.com/2023/08/14/adding-power-over-ethernet-s...


It's undoubtedly a cool solution, but why do you need to remotely do a hard power cycle? Wouldn't just SSHing in and rebooting be enough?

And when SSH is down because you OOM'd or something else?

I don't really run heavy loads on my home server, so I haven't thought of that

Makes sense, thanks!


With 802.3bt Type 4 (71W delivered, 90W consumed), absolutely achievable with the proper electronics, but would you trust a no-name, fly-by-night NIC not to fry your expensive devices? That's the biggest hurdle. Possibly a company like Apple, Anker, or a similar megacorp or high-trust startup could pull it off.
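For reference, the PoE power budgets mentioned in this thread, summarized as a small table (nominal standard figures: watts sourced at the switch port vs. minimum guaranteed at the powered device after worst-case cable losses):

```python
# PoE budgets per IEEE 802.3af/at/bt: (W at the PSE/switch port,
# minimum W guaranteed at the PD/powered device). Treat as a
# summary of the standards, not a substitute for them.

POE_BUDGETS = {
    "802.3af (Type 1)": (15.4, 12.95),
    "802.3at (Type 2)": (30.0, 25.5),
    "802.3bt (Type 3)": (60.0, 51.0),
    "802.3bt (Type 4)": (90.0, 71.3),
}

for name, (pse_w, pd_w) in POE_BUDGETS.items():
    print(f"{name}: {pse_w}W sourced -> {pd_w}W delivered")
```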

Somewhat, there are a few expensive "PoE to Data + Power" adapters out there

https://www.procetpoe.com/poe-usb-converter/ (some of these are power-only)


PoE Texas sells the most compatible adapters for this use.

https://shop.poetexas.com/products/gbt-usbc-pd-usbc?variant=...

65W 802.3bt and gigabit Ethernet out on the same PD cable.

Also a crude fixed hub for data and a keyboard and mouse for docking laptops:

https://shop.poetexas.com/products/bt-usbc-a-pd?variant=3938...


Doing home automation of lamps, sensors, and speakers via PoE would be great too. It should be faster and more stable than Zigbee/WiFi, with no need to change batteries often.

We used PoE hats for a bunch of Raspberry Pis once. It’s definitely a great idea.

I found a 5GbE one that claimed 60W; it will power a phone but not the low-power laptop I've got here. It probably isn't far off.

I can’t find what you want, but you can buy PoE splitters. PoE in, ethernet and power out.

Surely a matter of time until someone does this…


I think Type 4 tops out at about 71W delivered to the powered device, albeit 90W at the switch port.

Might be a struggle I suspect!


Yes, but look up the prices for PoE switches and you might reconsider.

PoE can be cheap, but usually never cheaper than non-PoE. But if you have a PoE switch and spare ports, it's very nice.

The problem comes when you try to design a large network and need random PoE ports on end devices where you can't home-run a cable back.

I have a Unifi Pro XG 48 PoE and I love it, but I still don't use PoE for everything. The cost of a (non unifi) poe device + the cost of using one of those ports always exceeds a simple power adapter on the other side (if possible).

I think about this a lot.


What's wrong with "webapp to configure a keyboard"? It's the same as "app to configure a keyboard", just with another option to run it straight from a website, without installation.

It made possible when chromium-based browsers (Chrome, Edge, Opera) implemented WebHID API.

Now, if I couldn't download it and run it myself, that would be a different story (vendor lock-in). But I can, so I think WebHID is a godsend.

PS: By the way, the most popular RC (quadcopter/airplane/helicopter) flight controller configurator is https://app.betaflight.com/ . It's a pretty complex tool with tons of bells and whistles.


Remember Jabber/XMPP? At least they tried to interoperate. Google Talk at the beginning had interoperability as its main feature, but Google quickly scrapped that.

UPDATE: some say that's because XMPP was too all-encompassing a standard (if a format allows too much, it loses usefulness, like saying a binary file format can store anything). IMO that's not the reason; they could have just supported their own subset. They scrapped interoperability purely for competitive reasons, IMO.

