I hadn't previously considered vendoring GHA dependencies, but yes, that might be a good idea. Perhaps not in all circumstances, but for anything that might be at risk of supply-chain compromise, the same arguments that apply to NPM apply to GHA.
Israel would object to aid and weapons flows into Gaza. It would be fine with Gazans leaving the Strip. The problem is there are currently zero takers globally for a significant Palestinian refugee population, in part, as other comments have mentioned, due to the history of Palestinian refugee populations in the Middle East. (To my knowledge, Palestinian Americans have been fine and productive members of society.)
More like refugees flowing out, which Egypt doesn't want to deal with.
The Palestinians didn't help their cause with Yasser Arafat's Black September uprising in Jordan. Then they topped that off with strong support for Saddam when he invaded Kuwait; Palestinians in Kuwait were literally betraying Kuwaitis to the Iraqi troops.
Oh, and did I forget Lebanon? They literally fomented the civil war.
Defense of Israel was the primary justification offered in a recent State Department memo asserting the legal basis for the war with Iran. Unusually, its publication was not announced on social media or to the press, unlike most State Department official pronouncements. Anyway, rather than being opinion, this is (for the present) the official position of the United States government.
But that’s not what’s happening here - the model here is that every piece has a uniform probability of being selected next, in which case every possible subsequence appears in the limit.
E.g. it might be that there will never be an "I" piece at all, even in an infinite random sequence. Yes, the probability of that happening is exactly zero, but it can still happen.
For example, if we select a number uniformly at random between 0.0 and 1.0, the probability of selecting any particular number is exactly zero. But we still selected it.
While a sequence where one possible subsequence never appears has probability zero in the limit, it’s still a possible random outcome. Incidentally, every concrete infinite sequence has probability zero.
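To put numbers on it: under the uniform model, the chance that a given piece hasn't appeared after n draws shrinks toward zero but never reaches it for any finite n. A quick sketch, assuming 7 piece types drawn uniformly and independently (the memoryless model discussed above, not a modern "bag" randomizer):

```python
# Chance that the "I" piece never appears in the first n draws, assuming
# 7 piece types chosen uniformly and independently. Positive for every
# finite n; only the limit as n grows without bound is zero.
def p_no_I(n: int) -> float:
    return (6 / 7) ** n

print(p_no_I(10))    # ~0.214
print(p_no_I(1000))  # astronomically small, but not exactly zero
```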
I gotta say, I love my macbooks. Every Apple laptop I've owned that has USB-C ports will happily charge itself from a 5V/1.5A wall charger (albeit extremely slowly).
That hasn’t been my experience. I once tried to charge an M3 MBP via a lower powered wall plug. It was left off over night and the following morning the battery was still at 1%.
My work has a little power strip with a usb-c and usb-a jack on it at every desk. I can charge my phone and iPad just fine with a USB-C cable into the USB-C port, but when I plugged my MacBook Air into it, it says “not charging.” Going into the system information tool I can see it’s only running at 10W. So apparently 10W is not enough to charge, but it’s still at least keeping the battery from draining.
A 20w charger will definitely charge the MacBook, just slowly.
This was a decent USB plug from Anker. I regularly use it to charge things like iPhones and tablets. I knew it wouldn’t supply enough power to run the MBP but thought it should trickle charge the device over night. But it didn’t.
I can’t recall which cable I used, though. The cable might have been garbage, but I’m pretty sure I threw out all the older USB cables so they wouldn’t get mixed in with more capable modern ones.
They probably require higher voltages, but I haven't seen one myself. I usually just charge my laptop with my phone charger; what is it, 18 watts? Don't care, it charges my laptop and the phone plugged into it overnight. Why charge at faster speeds when there is no need to?
Coincidentally, the USB-C spec is written such that wattage implies a minimum set of supported voltages:
* ≤15W charger: must have 5V
* ≤27W charger: must have 5V & 9V
* ≤45W charger: must have 5V & 9V & 15V
* (OT but worth noting: >60W: requires "chipped" cable.)
* ≤100W charger: must have 5V & 9V & 15V & 20V
(levels above this starting to become relevant for the new 240W stuff)
(36W/12V doesn't exist anymore in PD 3.0. There seems to be a pattern with 140W @ 28V now, and then 240W at 48V, I haven't checked what's actually in the specs now for those, vs. what's just "herd agreement".)
Some devices are built to only charge from 20V, which means you need to buy a 45.000001W (scnr) charger to be sure it'll charge. If I remember correctly, requiring a minimum wattage to charge is permitted by the standard, so if the device requires a 46W charger it can assume it'll get 15V. Not sure about what exactly the spec says there, though.
(Of course the chargers may support higher voltages at lower power, but that'd cost money to build so they pretty much don't.)
NB: the lower voltages are all mandatory to support for higher powered chargers to be spec compliant. Some that don't do that exist — they're not spec compliant.
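The table above can be read off as a simple lookup; a sketch of that rule of thumb (my own paraphrase of the wattage tiers listed here, not authoritative spec text):

```python
# Mandatory fixed voltages implied by a PD 3.0 charger's advertised
# wattage, per the tiers above: >15W adds 9V, >27W adds 15V, >45W adds 20V.
# (The 12V level from older revisions is gone; >100W EPR levels not shown.)
def mandatory_voltages(watts: float) -> list[int]:
    volts = [5]
    if watts > 15:
        volts.append(9)
    if watts > 27:
        volts.append(15)
    if watts > 45:
        volts.append(20)
    return volts

print(mandatory_voltages(65))  # [5, 9, 15, 20]
```

So a 65W brick that skips 9V or 15V, as some cheap ones do, is out of spec even though its headline wattage looks fine.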
It's a 3A supply up to the 100W one, which gets upped to 5A at the higher voltages.
Variable-voltage power supplies are usually capped by current, not power. That's because many of the components set independent maximum-current and maximum-voltage limits that you must obey.
At higher voltages, people start accepting higher losses in things like cables, because fire safety becomes a more important concern than efficiency. So the standard relaxes things a little bit.
You're correct but it's irrelevant. My point was that these requirements are in the standard and if you want to put the USB logo on a power brick you need to meet them. And the consumer is intended to be able to rely on them - which was & still is a pretty good idea considering the USB-C cable carnage.
I wish they did something like this for USB-C cables, but it's probably too late.
This has nothing to do with USB-C, this is the minimum design voltage of your lithium ion battery pack. In this case, you have a 4-cell pack, and if the cells drop below 2.895V that means they're physically f*cked and HP would like to sell you a new battery. (Sometimes that can be fixed by trickle charging, depending on how badly f*cked the battery is.)
If your laptop's USB-C circuitry were built for it, you could charge it from 5V. (Slowly, of course.) It's not even that much of a stretch given laptops are built with "NVDC"¹ power systems, and any charger input goes into a buck-boost voltage regulator anyway.
My laptop refuses to charge from 45W chargers as well, but I can almost understand it.
When plugged into 100W chargers while powered on, it takes ten minutes to gain a single percentage point. Idle in power save may let me charge the thing in a few hours. If I start playing video, the battery slowly drains.
If your laptop is part space heater, like most laptops with Nvidia GPUs in them seem to be, using a low power adapter like that is pretty useless.
Also, 100W chargers are what, 25 euros these days? An OEM charger costs about 120 so the USB-C plan still works out.
Other manufacturers do similar things. Apple accepts lower-wattage chargers (because that's what they sell themselves), but they ignore two power-negotiation standards and only support the very latest, which isn't in many affordable chargers, limiting fast-charge capability for third parties.
Which laptop is that? My Razer with 5070 will take 45W chargers just fine, so do the ThinkPads, my work 16" MacBook and previous Asus Zephyrus with 4070.
I was on a trip a few years ago and had only brought my “compact” 45w usb-c charger since the brick that came with my work ThinkPad (one of the high end 16” screen models, maybe p16?) was enormous. When I plugged it in Windows complained that the charger was insufficient to charge the laptop. I think it at least kept it from draining the battery though. I had to run to Walmart and get a 65w charger which did the job fine.
The idea is that you can use chargers that you have lying around. In an emergency I charged my MacBook Pro with an old 5 or 10W adapter overnight while shut down. I don’t see the reason for flat out refusing a charge. Especially when turned off.
With 802.3bt Type 4 (71W delivered, 90W consumed), absolutely achievable with the proper electronics, but would you trust a no-name, fly-by-night NIC not to fry your expensive devices? That's the biggest hurdle. Possibly a company like Apple, Anker, or a similar megacorp or high-trust startup could pull it off.
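As a back-of-envelope sanity check on the gap between those two numbers: worst-case I²R cable loss roughly accounts for it. The figures below are my assumptions (52V minimum PSE output, 12.5Ω maximum DC loop resistance per pair, halved because Type 4 powers over all four pairs), not quoted spec text:

```python
# Rough worst-case check: why 90W at the switch becomes ~71W at the device.
PSE_WATTS = 90.0       # power sourced by the switch (Type 4)
PSE_VOLTS = 52.0       # assumed minimum PSE output voltage
LOOP_OHMS = 12.5 / 2   # assumed max loop resistance, two pairs in parallel

current = PSE_WATTS / PSE_VOLTS      # ~1.73 A total
loss = current ** 2 * LOOP_OHMS      # ~18.7 W dissipated in the cable
delivered = PSE_WATTS - loss         # ~71.3 W at the powered device
print(f"{delivered:.1f} W at the powered device")
```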
Doing home automation of lamps, sensors, and speakers via PoE would be great too. It should be faster and more stable than Zigbee/WiFi, with no batteries to keep changing.
PoE can be cheap, but usually not cheaper than non-PoE. Still, if you have a PoE switch and spare ports, it's very nice.
The problem comes when you try to design a large network and need random PoE ports on end devices where you can't home-run a cable back.
I have a Unifi Pro XG 48 PoE and I love it, but I still don't use PoE for everything. The cost of a (non unifi) poe device + the cost of using one of those ports always exceeds a simple power adapter on the other side (if possible).
What's wrong with "webapp to configure a keyboard"? It's the same as "app to configure a keyboard", just with another option to run it straight from a website, without installation.
It was made possible when Chromium-based browsers (Chrome, Edge, Opera) implemented the WebHID API.
Now, if I couldn't download it and run it myself, that would be a different story (vendor lock-in). But I can, so I think WebHID is a godsend.
PS: By the way, the most popular RC (quadcopter/airplane/helicopter) flight controller configurator is https://app.betaflight.com/ . It's a pretty complex tool with tons of bells and whistles.
Remember jabber/xmpp? At least they tried to interoperate. Google Talk at the beginning had interoperability as its main feature, but Google quickly scrapped that.
UPDATE: some say that's because XMPP was too encompassing a standard (if a format allows you to do too much, it loses usefulness, like saying a binary file format can store anything). IMO that's not the reason; they could have just supported their own subset. They scrapped interoperability purely for competitive reasons, IMO.