Aceticon

joined 1 month ago
[–] Aceticon@lemmy.dbzer0.com 4 points 1 day ago* (last edited 1 day ago)

Whilst I agree with you on everything but the first 2 words of your post, I think this is yet another "look at this cool gadget" post that overhypes something, and that is a kind of spam we get a bit of around here, even if nowhere near the levels of the Elon crap or even just US politics.

This is especially frustrating for people who, like me, looked at the diagram linked from the article and found out it's pretty much the same as a run-of-the-mill breadboard power adaptor with a USB-C connector and a slightly better design than the cheap Chinese ones, rather than something truly supporting USB-PD (this thing doesn't even support the basic USB 1.0 negotiation needed to get more than 150mA when connecting to a proper USB host).

That the article then mentions a "crowdfunding campaign" for something a junior EE could design with a bit of datasheet digging carries a bit of a stink of a cash grab, so seeing it as spam is understandable.

[–] Aceticon@lemmy.dbzer0.com 4 points 1 day ago* (last edited 1 day ago) (2 children)

If you look at the circuit diagram in the documentation linked from that article, the thing doesn't even support USB-PD, or even just the USB 1.0-era device side of the negotiation that raises the current limit from the default (150mA in USB 3) to the configured maximum (900mA in USB 3). It will look like it works fine if you connect it to a dumb USB power supply (those things don't really do any USB protocol stuff, they just dumbly supply power over USB connectors up to the power source's limit). But if you connect it to, say, a PC USB port (which does implement the host side of the USB protocol), a circuit on the breadboard that worked fine with a dumb USB power supply might not work, because the current it needs exceeds that default 150mA limit for devices that haven't done USB negotiation (worse if it's a USB 2.0 port, as the limit is even lower there).
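
To put some numbers on that, here's a minimal Python sketch of the current budgets involved - my own illustration, nothing taken from the linked design, and the "dumb power brick" figure is just a typical example value:

```python
# Rough numbers from the USB specs: what a port is allowed to supply
# before and after the device has done the (pre-PD) configuration handshake.
# All values in milliamps at 5 V.
USB_CURRENT_LIMITS_MA = {
    "usb2_host_unconfigured": 100,   # USB 2.0 port, no negotiation done
    "usb2_host_configured":   500,   # USB 2.0 port, device configured
    "usb3_host_unconfigured": 150,   # USB 3.x port, no negotiation done
    "usb3_host_configured":   900,   # USB 3.x port, device configured
    "dumb_usb_power_brick":   2000,  # typical wall adaptor: no protocol, just supplies power
}

def will_it_run(circuit_draw_ma: int, source: str) -> bool:
    """True if the breadboard circuit stays within what the source will supply."""
    return circuit_draw_ma <= USB_CURRENT_LIMITS_MA[source]

# Example: a 400 mA circuit works fine on a dumb power brick...
print(will_it_run(400, "dumb_usb_power_brick"))    # True
# ...but is over budget on a PC's USB 3 port if the adaptor never negotiates.
print(will_it_run(400, "usb3_host_unconfigured"))  # False
```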

This thing is basically the same as the Chinese breadboard power adaptors you can get in places like AliExpress, but with a USB-C connector instead of a Type-A, micro-USB or mini-USB one, plus it's better designed (it has a proper buck converter instead of a cheap voltage regulator, better power supply filtering, and a polyfuse to protect it and the host from overcurrent).

The headline and the article seriously exaggerate this "achievement".

[–] Aceticon@lemmy.dbzer0.com 17 points 1 day ago* (last edited 1 day ago)

TL;DR - It's a nice but pretty run-of-the-mill breadboard power adaptor that happens to have a USB-C connector, but the article and its title insanely oversell the thing.

--

This is not exactly as amazing an achievement as the headline implies, since the stuff needed to talk to the upstream USB PD host is already available in integrated form, so you just need to get a chip that does it (and even without it, you'll get 150mA @ 5V by default out of an upstream USB 3 host, and up to 900mA with some pretty basic USB negotiation in a protocol that dates back to USB 1.0 and for which there have long been integrated solutions on both the device and the host sides).

Further, converting those 5V down to 3.3V just requires a buck converter or even just a voltage regulator (though the latter is less efficient), for which there are plenty of integrated solutions available for peanuts, and where the entire circuit block needed to support them is detailed in the converter's own datasheet.
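
As a rough back-of-the-envelope comparison of the two options (my own numbers, assuming a 500mA load and a typical ~90% buck efficiency, not figures from this particular board):

```python
# Why a buck converter beats a plain linear regulator for 5 V -> 3.3 V.
# Illustrative numbers only: the ~90% buck efficiency is a typical figure,
# not anything taken from this board's documentation.
V_IN, V_OUT, I_LOAD = 5.0, 3.3, 0.5           # volts, volts, amps

# Linear regulator: the voltage difference is simply burned off as heat.
linear_loss_w = (V_IN - V_OUT) * I_LOAD       # 0.85 W of heat
linear_efficiency = V_OUT / V_IN              # ~66%

# Buck converter: switches the input, so losses are only the conversion overhead.
buck_efficiency = 0.90                        # typical for a small module
p_out = V_OUT * I_LOAD                        # 1.65 W delivered to the load
buck_loss_w = p_out / buck_efficiency - p_out # ~0.18 W of heat

print(f"linear: {linear_efficiency:.0%} efficient, {linear_loss_w:.2f} W wasted")
print(f"buck:   {buck_efficiency:.0%} efficient, {buck_loss_w:.2f} W wasted")
```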

Looking at the circuit diagram for this (linked from the article), they're not even doing USB PD negotiation, or any kind of USB 1.0-style negotiation, so this thing will be limited to 150mA from a USB 3 host, or to whatever current your traditional USB power source can supply (those power sources really just deliver whatever amperage they support over a cable that happens to have USB connectors, rather than implementing a genuine USB host that limits current based on negotiation with the device, so they don't require any USB negotiation to go above 150mA).

This is really "yet another run-of-the-mill USB breadboard power adaptor", only the USB plug is USB-C rather than mini-USB or micro-USB (so, a different connector plus the handful of minor components the standard requires to properly support it). In other words, it's pretty much the same as the cheap Chinese ones you can get from AliExpress, though this one uses a buck converter rather than the $0.10 voltage regulator found on most of the Chinese boards, and actually does proper filtering of power supply noise and proper overcurrent protection, so it is a quality design for this kind of thing, even if it's not really a major advancement.

Without the USB PD stuff I wouldn't really say that it brings USB-C power to the breadboard (in the sense most people would expect, of being able to draw a proper amount of power from a modern power brick that supports USB-C); it's more something with a USB-C connector that brings power to the breadboard, as that connector is pretty much the sum total of what it supports from the modern USB spec.

What would really be nice would be something that does talk USB-PD to the upstream host AND can convert down from the 20V at which it supplies peak power, so that you could take advantage of the juicy, juicy (oh so juicy!) power delivery capability of USB-PD (up to 100W right now, and up to 240W with the newer Extended Power Range spec). Mind you, if you're actually pulling 100W from your breadboard power adaptor (which at 5V means 20A, a stupidly high current that will melt most components in a typical digital circuit), then I'm pretty sure magic smoke is being released from at least one of the components on that breadboard, and you're probably damaging its power rails too (aah, the sweet smell of burnt plastic when you turn the power on for your half-arsed experimental circuit!!!)
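
For reference, the arithmetic behind that 20A remark is just P = V × I, nothing specific to any product:

```python
# Power = voltage x current, so pulling the full USB-PD budget
# at breadboard voltages means huge currents.
def current_amps(power_w: float, voltage_v: float) -> float:
    return power_w / voltage_v

for volts in (20.0, 5.0, 3.3):
    print(f"100 W at {volts:>4} V -> {current_amps(100, volts):.1f} A")
# 100 W at 20.0 V ->  5.0 A   (how USB-PD actually delivers it)
# 100 W at  5.0 V -> 20.0 A   (far beyond what breadboard rails and jumpers handle)
# 100 W at  3.3 V -> 30.3 A
```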

[–] Aceticon@lemmy.dbzer0.com 20 points 1 day ago* (last edited 1 day ago)

Worry not: in 20 years' time people born in 2028 will all pretty much look like kids to you.

[–] Aceticon@lemmy.dbzer0.com 2 points 1 day ago* (last edited 1 day ago) (1 children)

The way I read it, "well adjusted" in there is about being well adjusted to the society you live in rather than being a well balanced individual.

I think that meme is about understanding that some of the things that directly or indirectly make your life hard, and make you feel bad as somebody living in this Society, aren't really on you: for example, if you're a great artist struggling economically, it really isn't your fault that the thing you're really good at is undervalued by present-day society unless you have the luck, connections and self-selling ability to end up a superstar.

That artist would be considered badly adjusted in today's society for not being monetarily prosperous (think about how frequently people get judged on the luxury of their car, the size of their house and the thickness of their wallet), or in other words for neither becoming one of the 0.00...01% of artists who become superstars nor switching to a different job in a different domain once it turned out that what they do doesn't make much money, even if that person is doing great things with their art which make life a little nicer for lots of people.

I read that meme as: don't judge yourself or your life by the values of a flawed Society, and don't think badly of yourself when persisting in doing what you think is right, in a Society that doesn't reward it, causes you grief and pain you could have avoided by instead doing what that Society rewards more (or in other words, by being a more well adjusted member of that Society).

[–] Aceticon@lemmy.dbzer0.com 8 points 1 day ago (1 children)

I have a cheap N100 mini-PC running Lubuntu and Kodi, plus a wireless remote, as my TV box, and use my TV as a dumb screen.

Mind you, you can do it even more easily with LibreELEC instead of Lubuntu, and more cheaply with one of the cheap SBCs it supports (plus a case) instead of a mini PC.

That said, even the simplest solution is beyond most people's ability to set up, and once you go up to the next level of ease of setup - a dedicated Android TV box - you're hit with enshittification (at the very least preinstalled apps like Netflix with matching buttons on your remote) even if you avoid the big brands.

Things are really bad nowadays unless you're a well-informed tech expert with the patience to dig into these things in your spare time.

[–] Aceticon@lemmy.dbzer0.com 6 points 1 day ago (1 children)

"It's either fully privatised Healthcare or it's Stalinism"

[–] Aceticon@lemmy.dbzer0.com -2 points 1 day ago* (last edited 1 day ago)

Socialism invariably fails and ends up corrupted into some shithole authoritarianism decorated with leftie-sounding slogans. It is, however, meant to do the greatest good for the greatest number; it's just that in practice, in the real world, it's crap at it because of human nature.

Capitalism doesn't even try to do the greatest good for the greatest number - it's quite literally The Sociopath's Credo: "do what's best for yourself and screw what's good for everybody else"

Ultimately they both fail at making most people's lives better, but Capitalism doesn't even try.

The best we've achieved has been Capitalism narrowly applied to just Trade and overseen by some other, separate political theory that actually tries in some way to move towards the greatest good for the greatest number, such as Social Democracy, but as we've been watching in real time, given enough time Capitalism ultimately grinds down such bounds and oversight and corrupts everything.

[–] Aceticon@lemmy.dbzer0.com 1 points 1 day ago* (last edited 1 day ago)

Some current directions in AI, such as LLMs, seem to be dead ends in the sense that those approaches cannot be incrementally improved much further to, for example, eliminate hallucinations, or to pair logic with those probability engines so that, at minimum, the logically impossible gets excluded from the results.

The dot-com stuff, on the other hand, was the very first bubble of the very first wave into a whole new technological direction that had just been unlocked and that gave access to an entire branch of new ways of doing things - it was the result of the very first wave of investment around the technology domain of worldwide digital communications and all the other tech branches that became possible because of it.

Basically the Internet was like opening a door to various new areas of Tech (which, curiously, wasn't even all that amazingly complex as Tech goes, kinda like a basic wheel isn't exactly complicated but look at all that became possible with its invention), whilst the current AI wave (which is mainly the latest wave of work in the branch of Neural Networks, a branch over 3 decades old) is more like a handful of massively complicated solutions that are the product of decades of work in a specific direction, some of which work in such a way that they can't be significantly further improved and hence can't be made to get past certain problems they have (the most obvious example being LLM hallucinations).

So whilst I do think that in 20 years there will be some prevalence of AI tech companies in the domains where this wave's AI solutions do work well enough (say, object detection in images), I don't think it will be anywhere comparable to what happened in the 20 years following the start of the new Tech Age triggered by the Internet.

Mind you, 2 decades is a lot of time in Tech terms, so maybe somebody will come up with a whole different approach to AI in the meantime, one that breaks through the inherent limitations of the current ones - just don't count on it.

Edit: just wanted to add that I was there when ARPANET morphed into the Internet, and for the dot-com bubble that came out of it. At the time everybody and their dog was playing around with making websites, people were trying new stuff on top of those websites, inventing new comms protocols, writing programs that talked to other programs over the network, creating entirely new business models anchored on making a website a storefront - the Internet was Freedom. This AI wave doesn't feel at all like that - sure, plenty of people are deploying models created by others and trying them out, but very few are creating new models, and a lot of that Tech comes pre-monetised and locked down by large companies trying to get money out of anything people do with it - the whole thing is nothing like the "we've opened up this whole new domain, you guys figure out what to do with it" spirit that was the birth of the Internet.

[–] Aceticon@lemmy.dbzer0.com 1 points 1 day ago* (last edited 1 day ago) (1 children)

Having both a Degree (almost 2 Degrees, since I went to Uni for one and then changed to a different one halfway through, so I'm an EE with part of a Physics Degree) and at the same time being massively self-taught because I'm a Generalist (to the point that in my career I went down the route of working in the thing I learned by myself and did for fun as a kid - computer programming - which was not the focus of either of those degrees), it has been my experience that certain things - mainly the fundamentals - are close to impossible to learn by yourself in a hands-on way.

Further, discovering by yourself the best way to do something complex enough to require an actual Process really just means going through the same pains - trying stuff out or limping along doing it in a seriously sub-optimal way - as the countless people in the Past who battled the damned thing until somebody discovered the best ways of doing it. And worse, you're unlikely to figure out the best way by yourself even after years of doing it, especially since discovering new ways of doing things is a different process from actually doing the work - you have to take time out from the work to try new stuff out, with the expectation that you'll do a lot of wrong things as you try new approaches, all the while not producing any usable results.

(No matter what, to learn new ways of doing things you're going to have to take time out of doing work to do the learning - it's pretty hard to figure out or try out new ways of doing something without making mistakes, and mistakes aren't a valid product of your work - and if you're dedicating time to learning anyway, the most efficient way is to learn from somebody else, which means either a mentor or a teacher.)

It's not by chance that even before Formal Education was a thing there was already the whole Master + Apprentices way of people learning complex domains (such as Blacksmithing).

Even with the Internet, it's still immensely hard to learn complex subjects by yourself, because:

  • Plenty of things you don't know, you don't even know that you don't know them - in other words you're not even aware they exist - so you won't go looking for them.
  • Most of what's out there is shit for learning. Formats such as Youtube optimize for Entertainment, not Learning, so the algorithm will feed you countless loud dog and pony shows pretending to explain things to you, all with about as much depth as a puddle, whilst the handful of properly deep explanations get algorithmed away because they're too long and boring.
  • Worse, the most experienced domain specialists seldom have the time or the inclination to write posts explaining things, and even less so to make videos (and from experience I can tell you that making a good Youtube video is a lot more complex than it seems until you try it). What you tend to see instead is countless posts and videos from people who learned just enough about a subject to think they know tons about it (and thus can explain it to others) without actually knowing tons about it - in other words, people at the peak of the Dunning-Kruger curve. Most such "teachers" are just slightly less newbie than you.

Last but not least, you're not going to figure out the Fundamentals by yourself. No matter how brilliant you are IQ-wise, you're not going to, for example, rediscover by yourself the various domains of Advanced Mathematics, because that stuff took centuries for the most intelligent people around - often people whose only job was to discover things - to figure out.

So yeah, some things can only be learned from somebody else, the bulk of what you have to learn is much faster to learn from somebody else than by yourself, and since Formal Education with professional teachers is a far more efficient process than apprenticing under a Master (plus it's far broader in what you end up learning, though less deep than learning from a master/mentor), that's pretty much what's on offer.

Personally I think a mix of formal Education, Mentorship and Self-Learning is the best way to learn complex domains, but it's pretty hard to find yourself in a position where you get a Mentor, and as somebody who often acts as one in my area, I can tell you I wouldn't waste my time mentoring somebody who doesn't even know the basics (for example, because they shunned formal education) when I could be mentoring somebody ready to directly learn the advanced stuff I know, which is what makes it worth my time to teach.

[–] Aceticon@lemmy.dbzer0.com 2 points 1 day ago (1 children)

Ah, right - I misunderstood that point you were making.

So, as it turns out, we've been in agreement all along.

[–] Aceticon@lemmy.dbzer0.com 3 points 2 days ago* (last edited 2 days ago)

Whilst quite a lot of words are pretty much the same in both languages, "wie" in Dutch means "who" whilst in German it means "how".

Having learned Dutch first, I can tell you that when I was first learning German the expression "Wie geht's" tended to give me a serious mental hiccup when I was trying to talk to German people.
