

Allegedly… he hasn’t admitted to anything afaik… and as far as I’ve seen the evidence is flimsy… why would he escape and then hang out in a fast food restaurant with the disposable murder weapon?
Hmm, not so sure. He produced a digital signal whose spectrogram happened to be an image, and then played that digital signal to a bird. Dunno if an analogue spectrogram really even makes sense as a concept. The only analogue part of the chain would be the bird’s vocalisations, right?
That doesn’t sound great. What benefits do you see in mirroring this behavior?
Well, guess I can’t deny such compelling evidence
As much of a prick as this guy is, I don’t think that’s true. The Behind the Bastards episode on him couldn’t substantiate it, at least.
Reverse proxy with mTLS in front might be a simple solution depending on your setup
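Something like this, very roughly (stdlib Python just to show the shape of it; the cert paths, port and upstream address below are placeholders, and in practice you’d more likely configure nginx/Caddy/Traefik than hand-roll it):

```python
# Very rough sketch of "reverse proxy that terminates mTLS" using only the
# Python stdlib. Everything here (cert/key paths, port, upstream address) is a
# placeholder, not a recommendation for production use.
import http.server
import ssl
import urllib.request

UPSTREAM = "http://127.0.0.1:8080"  # the service being protected (assumed)


class ProxyHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Only clients that presented a valid cert get this far; just relay
        # the request to the upstream service and pass the response back.
        with urllib.request.urlopen(UPSTREAM + self.path) as resp:
            body = resp.read()
            self.send_response(resp.status)
            for name, value in resp.getheaders():
                if name.lower() not in ("transfer-encoding", "connection"):
                    self.send_header(name, value)
            self.end_headers()
            self.wfile.write(body)


ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain("server.crt", "server.key")  # the proxy's own certificate
ctx.load_verify_locations("client-ca.crt")       # CA that signs allowed client certs
ctx.verify_mode = ssl.CERT_REQUIRED              # the "m" in mTLS: clients must present a cert

server = http.server.ThreadingHTTPServer(("0.0.0.0", 8443), ProxyHandler)
server.socket = ctx.wrap_socket(server.socket, server_side=True)
server.serve_forever()
```

Clients then connect with their own cert and key, e.g. `curl --cert client.crt --key client.key https://yourhost:8443/`.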
Yup, that’s what I was alluding to. While it may no longer be the case for transistors, they did manage to keep that trend up for 50-odd years, so push the trend line from that figure out 50 years, heh (not saying you should; 5 seems much more conservative).
Take a look at Nvidia’s pace relative to Moore’s law (of FLOPS): https://netrouting.com/nvidia-surpassing-moores-law-gpu-innovation/
Or like looking at the early days of semiconductors and extrapolating that CPU speed will double every 18 months …smh these people
They were invented *by* 9000 BC :)
Can you go into a bit more detail on why you think these papers are such a home run for your point?
Where do you get 95% from? These papers don’t really go into much detail on human performance, and 95% isn’t mentioned in either of them.
These papers are for transformer architectures using next-token loss. There are other architectures (spiking, Tsetlin, graph, etc.) and other losses (contrastive, RL, flow matching) to which these particular curves do not apply.
These papers assume early stopping; have you heard of the grokking phenomenon? (Not to be confused with the Twitter bot.)
These papers only consider finite-size datasets, and relatively small ones at that. E.g. how many “tokens” would a 4-year-old have processed? I imagine that question should be somewhat quantifiable (rough back-of-envelope below).
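Rough back-of-envelope, with numbers that are my own loose assumptions rather than anything from the papers:

```python
# Loose back-of-envelope for the "tokens processed by a 4-year-old" question.
# Every number below is a rough assumption, not something from the papers.
words_heard_per_day = 15_000   # ballpark for speech heard by a young child
days = 4 * 365
tokens_per_word = 1.3          # rough tokens-per-word ratio for BPE-style tokenisers
print(f"{words_heard_per_day * days * tokens_per_word:.1e}")  # ~2.8e7 tokens of language alone
```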
These papers do not consider multimodal systems.
You talked about permanence; does a RAG solution not overcome this problem?
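For anyone who hasn’t come across RAG, the gist of it as a toy sketch (the notes and the word-overlap scoring below are made-up stand-ins for a real embedding model and vector store):

```python
# Toy sketch of the RAG idea: persist notes outside the model, retrieve the
# most relevant ones at query time, and prepend them to the prompt.
import math
from collections import Counter

notes = [
    "User prefers answers in metric units.",
    "The migration to the new database is scheduled for Friday.",
    "Earlier discussion covered scaling-law papers for transformers.",
]

def score(query: str, doc: str) -> float:
    # Crude bag-of-words cosine similarity, standing in for embedding search.
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    overlap = sum((q & d).values())
    return overlap / math.sqrt(sum(q.values()) * sum(d.values()))

def build_prompt(query: str, k: int = 2) -> str:
    top = sorted(notes, key=lambda n: score(query, n), reverse=True)[:k]
    context = "\n".join(f"- {n}" for n in top)
    return f"Context retrieved from long-term store:\n{context}\n\nQuestion: {query}"

print(build_prompt("When is the database migration?"))
```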
I think there is a lot more we don’t know about these things than we do know. To say we solved it all 2-5 years ago is, perhaps, optimistic.
Unfortunately not, but here is a little kitchen-sink-type demo: https://myst-nb.readthedocs.io/en/latest/authoring/jupyter-notebooks.html
MyST-NB is probably the place to start looking, btw - forgot to mention it in the previous post.
I use Sphinx with MyST Markdown for this, and usually Plotly Express to generate the JS visuals. Jupyter Book looks pretty good as well.
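The Plotly Express part of that flow is basically just this (output path and dataset are placeholders; how you embed the resulting fragment in the page depends on your setup):

```python
# Minimal sketch of the Plotly Express half of the flow: build a figure and
# write it out as an HTML fragment the Sphinx/MyST page can embed.
import plotly.express as px

df = px.data.gapminder().query("year == 2007")  # Plotly's bundled demo data
fig = px.scatter(
    df, x="gdpPercap", y="lifeExp", size="pop", color="continent",
    hover_name="country", log_x=True,
)
# full_html=False gives a snippet you can drop into a page rather than a whole
# document; include_plotlyjs="cdn" keeps the fragment small.
fig.write_html("_static/gapminder.html", include_plotlyjs="cdn", full_html=False)
```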
Feeding the troll 🤷‍♂️ “agenda driven”… what does that even mean? 😆
No one said other languages aren’t allowed. Submit a patch and prepare yourself for years of painstaking effort.
Something makes me uneasy about this being a Google Sheet: you need credentials to view it, and someone has a log of who has accessed it… you can probably even see who’s viewing it in real time.
Use an anonymous account! Or someone should host this on a website or something with stronger privacy.