{"text": " Last week an AI would like to thank ODSC AI for being a sponsor. ODSC is one of the longest running and largest communities focused on applied data science and AI. It started over a decade ago with a simple idea, bringing practitioners together to learn from people actually building and deploying models in the real world, not just talking theory. On April 28th through the 30th, you can experience it yourself at ODSC East, 2026, taking place in Boston and virtually there will be thousands of hybrid attendees ranging from data scientist ML engineers, AI researchers and technical leaders. You can attend over 300 sessions covering LM's, Gen AI, computer vision, NLP, data engineering and more. You can also go to hands-on training with workshops and boot camps taught by experts from companies like OpenAI, Hugging Face and Video and other top companies and universities. And of course there will be a massive expo and networking opportunities great for startups, hiring managers and AI tool builders. It's one of the best ways for AI practitioners and teams to stay ahead of a field that learn from a best and connect with a community. Go to odsc.ai slash east and use promo code LWAI for an additional 15% off your pass to odscai east 2026. That's odsc.ai slash east and use code LWAI to get an extra 15% off on the number one AI builders and training conference. We'd like to think factor for sponsoring last week in AI. Not related to AI but I am personally a big fan. Often I have no time to cook and factor makes healthy eating easy. They're fully prepared meals that come from their dishes and crafted by chefs. Actually used it for many years and I think you can really eat well without the planning or the cooking using factor. They use quality functional ingredients like lean proteins and colorful veggies and there are no refined sugars or artificial sweeteners. You have 100 rotating weekly meals to keep things fresh and you can choose types of meals like high protein, calorie smart, Mediterranean and others. It's really convenient ready in about two minutes. In my experience really is fair to say that there's no prep necessary and it's quite good. I've used it for many years and I think you might want to consider it if it fits your lifestyle. You had to factor meals dot com slash LWAI 50 off and use code LWAI 50 off to get 50% off and free breakfast for a year eat like a provisement with factor. You serve coverage only veg by plan. What free breakfast time per box for one year while subscription is active. At Arizona State University we're bringing world class education from our globally acclaimed faculty to you. Earn your degree from the nation's most innovative university online. It's a degree better. Learn more at ASU online dot ASU dot edu. Hello and welcome to the last week in AI podcast week in the air chat about what's going on with AI. As usual in this episode we will summarize and discuss some of last week's most interesting AI news. You can also go to last week in dot AI for our newsletter with even more news every week. I'm one of your regular hosts, Andre Crankov. I studied AI in grad school and now work at the startup AstroKate. I'm your other host, Jeremy Harris, from Gladstone AI, AI national security, AI loss control, all those fun things. By the way, special thanks to Andre for recording at this time. We bumped things up even earlier. I think it's like what is it seven thirty year time? Is that it is here on the PST. Sometimes it's nice to be East Coast to be a bit later. But you know it's good to get your day started early sometimes. Yeah, much appreciated. Also appreciate people tuning in or watching the entire last podcast, which featured a extra like hour plus, which I didn't realize it was that long, but Andre had a hop off. So I just went through like some of the technical papers we didn't cover as an experiment. And man did I go on. So I have learned that I need Andre to be like the regularizing term to my loss function. If that is. Yeah, we do know some people are very much fans of the coverage of research and I was going in depth. So feel free to comment on YouTube or elsewhere. And yeah, say if you want more of that, you've considered maybe having additional episodes that are just research, very much happening there. But feel free to let us know if you'd like even more research on a regular basis. And just to give a quick preview of what we'll be doing this episode, not as much research as last one. There's a bit of everything I suppose. There's a couple new model releases and some other kind of interesting tools. There's some interesting developments on the business front with OpenAI. And then we've been covering a lot of safety and interpretability work lately on alignment. So we'll have some of that. And then towards Van, we'll have some fairly interesting impact for looking research. Let's feel radical more sort of like wow, this might actually be a big deal. So it should be a pretty fun listen. And let's kick it off with tools and apps. First up, OpenAI, they have shipped a Jubilee 5.4 Mini and Nano. They are similarly to other small models that we've seen in, in recent times, like actually really quite good for being kind of a smaller range. Jubilee 5.4 Mini is close to Jubilee 5.4 on several benchmarks, including on SWE Bench Pro and OS World. They're fine. And it's more of and twice as fast. Jubilee 5.4 Nano is obviously the smallest option. It's not really doing great on the benchmarks, but it is super quick. These models have 400,000 token context middles, so fairly substantial. But they do cost a decent amount, relative to GP5 Mini. Looks like GP5.4 Mini costs free X, GP5 Mini, and GP5 Nano also costs more. So on the whole, good, faster, smaller models, if you need something that's doing better in GP5 Mini, now you have that option. There's been a lot made about the cost situation and the pricing of the per token pricing, I should say. It is higher. There's no question, right? So GP5.4 is basically three quarters of a penny. Sorry, three quarters of a dollar per million in put tokens versus 25 cents for GP5 Mini. So that's a three X hike. But opening I says it only burns about 30% of the GP5.4 in codex. So it's actually going to be much more token efficient. And this is a metric that I think matters a lot more than many people will tend to realize, right? So we have lost and like cost for token. But as we've seen, more tokens at inference time does not necessarily mean more performance. And that's the big catcher. So when you multiply those together, right? 30% times three acts, you actually get a slight decrease in what you might think of as cost per performance, which is a little closer to what most people care about. And this will vary depending on the workload, but still quite interesting, right? So for the sort of orchestrated agentic tasks, the effective cost per outcome could actually be favorable compared to running the full model. So it's a sort of interesting there. One thing I will say nano is API only. This one is priced at 20 cents per input and $1.25 per million output tokens versus like much, much cheaper. So it's predecessor was 5 cents per input token. That's a 4x lift, roughly 4x again for output tokens. Well, so you're looking at 4x overall. The weird thing is, so opening I is pitching this for classification and data extraction, which are these like very high volume workloads where you're usually quite cost sensitive because you're processing you're so much. And so that four full hike is going to sting the most for exactly the people who are being pitched this product. So it's a bit of an interesting position. It seems like a little, I don't want to say at odds with open AI's position that they want to make intelligence too cheap to meter, but it certainly is at least locally. This is a move towards instead of racing to the bottom on inference costs. We're going to focus on model quality. And that that's going to be our big differentiator. You're going to care that we can get the right answer, not that we can get it cheaply, which is where all the margin is. That at least is what anthropic certainly suggests and what we're seeing elsewhere. So that's kind of interesting. You know, there's a bunch of interesting, as you said, OS world verified is interesting benchmark to look at here specifically. I know you mentioned sweet bench and a couple others. And so this is basically a computer control benchmark. So it looks at how well can the mini model just control a computer. And what we see here is a GPT 5.4 mini hits 72%. If you look back at GPT 5.4, the full version, it hits 75%. So it's actually pretty close. And if you're thinking about the previous GPT 5 mini, that was only 42%. So pretty big jump, especially given that we're getting up there on this benchmark in terms of saturating it. So all in all, pretty interesting release, the very lead, not very lead, but the very detail here really is that token efficiency question. What kind of workload are you going to use this for? That's going to determine, I don't want to call it like the total cost of ownership, because that's not quite the right metric here, but the total cost that you're exposed to in the ROI, that's really becoming a key thing here. Right model for the right workload is just going to be a critical dimension, at least for the next few months. Right. Yeah, I think these kinds of things showcase the fact that, you know, all these models kind of came out of the world of academia, right? And benchmarking is largely focused on capabilities on how accurate your model is. And so usually you don't highlight these kind of more practical concerns of how quickly can you finish a task, right? How cost effective are you like you do a task, how much dollars does it take? Wall clock time, right? We just don't get these numbers, at least, on the announcements. It's honestly a bit surprising that's still the case, but it's a culture question, I suppose. And as you said, we'll be discussing opening a strategy in just a bit in the business section. Very much in line, we're on topic where we're like, we'll just charge more for our models, but they're the best. And so people will follow it. And speaking of small models, next up, you've got me Straul. They have released their small four family of models under the Apache 2.0 open source license. And it combines actually multiple things. So it has reasoning built in. It has multi-volonial capabilities built in and it has a genetic coding optimization. So they combining magistral, pickstrawl and devstrawl. It's also used as a mission of experts. So they've have a total of 119 billion parameters, but only six billion active parameters per token. That's quite small. You can fit it into probably one top end GPU. You know, it's actually quite affordable. So it looks like possibly a pretty strong model on in this category of smaller, faster, cheaper, and open source. Yeah, challenge there is going to be, you know, if you get into wanting to fine tune it, obviously, then, you know, now you're you're dealing with really a 120 billion parameter model. And that's, you know, pain in the butt. But yeah, I mean, the only six million active, I mean, this is really ultimately a bet that you're going to have kind of like model sparsity in the sort of very aggressive sparsity ratio is going to perform a dense model or something, you know, more traditional. And it's a bet as well as you say on the kind of hardware that is available at least for inference on local machines. So yeah, it's quite interesting. It is a more aggressive kind of fewer active parameters per token type of play than we've seen before. It's also kind of interesting. So you touched on the consolidation, right, of all these models under a single, in a single model, right, reasoning, coding agents, multimodal, that's a pickstrial, like looking at images and text and so on. So this is a weird play because historically, Mr. Alt has separated these, right. And so given they're collapsing them into one model just with a reasoning effort dial, you could see that as a pretty big bet on where the industry is heading. That is something that we've seen with, you know, obviously the O series of reasoning models with GPG 40 even starting as far back as that. If it's the case that we get positive transfer, that's really what this is a bet on, right, that a model that is trained to do all these things will do better at each individual thing because it's kind of getting cross training, right, the same way that you might want to do, you know, football and like ballet at the same time, rice hockey, and that you get better at each different thing because of the combination. That's kind of the idea here, right, that positive transfer that for so long was was really difficult to pin down. People were seeing negative transfer where the more stuff you train a model on, the worse it does on the marginal additional task. We're now well into positive transfer territory. The extension of that to reasoning is quite interesting and implies that reasoning may, at least that they're betting, that reasoning may eventually were already play a role in analyzing images more successfully for these kinds of models at this scale. So there's another piece here that's interesting, you know, this efficiency claim. They say that small four is achieving scores that are like on par with GPT OSS 120B on a bunch of benchmarks while generating like shorter outputs on at least one benchmark. And so again, this is about that token efficiency question, right, we're sort of like back at that very underrated metric of output length efficiency and the fact that you don't necessarily get more benefit by reasoning with more tokens, that's something we started pointing out when we looked early on at the kind of deep seek reasoning results, right, where it's like, yes, you do get this positive uplift, but your value per token is actually like potentially going down. And we are now, in fact, in that regime, we're seeing that very clearly. So this efficiency piece is really important because if people are going to use open source models, they're going to run them like by far, the biggest use cases running these on whether you're own or other people's clusters to serve customers. And so, you know, like how good Mistral is at making this an efficient reasoner speaks directly to your bottom line as the person who's going to be serving these or asking somebody else to serve them for you. So pretty interesting, you know, the fact that they're comparing to GPT OSS 120B, look, the space moves really fast. That is an old model at this point in open source terms. It's kind of like choosing your point of comparison pretty selectively, I would say here, but they do a comparison that's reasonably favorable on point models as well. So it's an interesting play. I think, I mean, I'm a little concerned from Mistral. I have been for a long time, obviously, don't know too much how this ends up playing out with the open source play, but here they are. It's a reasonable model and the consolidation angle, that's a really big story here, right? If everybody is starting to consolidate, even open source players here around one model under one roof, that's a materially different story. It does not, by the way, extend necessarily to the world of agents, right? Sub agents may be smaller, cheaper models. This is more about, you know, if you're looking for a highly performant model, I think of the main orchestrator, you're probably, it seems, you're probably going to be seeing, you know, models that that kind of put, put multiple capabilities under one roof. So something to watch out for. Right. I'll just quickly comment on the unification bit. It's better to say that it's partially a bet on what things are heading with a multimodal aspect. The fact that they have baked in reasoning and coding, I think, is a little more just an indication of catching up with where things are these days. It used to be the case that you had a reasoning model, like, oh, free. And with deep seek R1, you trained a reasoning model and you had your base model. And what everyone moved to in 2335 is there's no reasoning model. Your model has reasoning baked in. And now with, with Sonnet, with GPT, really post training for reasoning, it's been very clear that you should just train your model to be a good coder because that makes it a smarter model in general. So this is a mix of catching up where our rings are at and also adapting that multi model capability, which is could be interpreted in several ways. Next up, Meta's Manus launches my computer to turn your Mac into an AI agent. So this is something you can install and launch on your computer. And it's effectively like having a little open claw, I guess, on your computer. So it can execute command line instructions that sit interact with computers. So it's very much, I think we've seen this happening more and more where various organizations are shipping open claw, ask things where you have an agent that just lives somewhere and you can tell it to do stuff. And it is your like assistant or your AI or whatever. This appears to be another instantiation of that similar also to perplexities announcement of what was it like perplexity computer. Yeah. So very much in line with that. Oh, you mean you were enabled to remember like the sixth new open claw variant that got launched by a company in the last two weeks. Yeah, it's crazy, right? We're really seeing more and more of this like pylon right from all these all these different competitors in this space. And this is a land grab like Magnum mistake. This is the the sort of man, I don't call it the scramble for Africa moment, but the you know, historically equivalent of that in this space with fewer controversial overtones. This is the moment where people are realizing, hey, you know what? We need to get on people's local machines, right? We have to get some kind of piece of that pie because so much of what's what's happening right now is that we're trying to like grab onto people's like agentic like the the kind of agentic runtime layer. In fact, in video, we'll talk about this in a minute with Neemoclaw. It's the same thing. Everybody's trying to get on to like what is what is the substrate on which agents are going to run? Can I get my dirty little hands on that and and turn that into part of my market? And in this case, the menace play is quite interesting and it's aging well, right? I mean, menace was an in it like totally independent company before the acquisition. And really the play here is meta extending its reach into that that local OS layer for the first time. It's this is the territory historically of you know, your apples, your Microsoft, your Google's. That's the war they're entering. We haven't seen meta do that before, right? We just haven't seen them try to play in the operating system game. This is a really interesting way for them to vector into a completely different market using resources that you know, maybe for the first time they have like a credible, whether you call it advantage or not, they have a credible play here. So pretty interesting. And again, meta classic history of buying their way into competing, right? We saw this with WhatsApp with Instagram like it, it never ends. This is their play in that direction. So, menace, you know, maybe aging well, we've got to see what comes of this. came out I believe last year initially with a cloud based agent but you could assign to go do stuff so this is them extending to your local computer right now you can I think get this for silicon based max the other aspect of this by way is not just open claw it's the core work the codex angle where they most of the blockpost actually is highlighting like let the agent go to computer and organize files and do things that are very much cloud code or or now core work ask and we kind of tell it to do stuff from anywhere at any point is an aspect of it's the opaquaspect but I think the real land grab is for that core work type like have an agent do stuff for you which is now like everywhere in coding but I think what or on topic and now open AI and now everyone is realizing is these agents can like do a whole bunch of stuff and that people haven't adopted to do yet and speaking of open claw next up and video has announced Nima claw as part of their announcements at gtc which is a little bit of funny this is a stack for the open claw agent platforms that allows you to install and video name-atron models we just discussed this latest name-atron model last week and there's a new and video open shell runtime you install both of those in a single command and you get privacy and security controls baked in making it possible to have you know more more confidence in running one of these things we've seen many stories of open claw go in raw it's right absolute confidence yeah so open shell provides an isolated sandbox that enforces policy-based security network and privacy guard rails for agents seems like a good idea if you are to do one of these agents maybe install them in a sandbox where you can control them and they don't go rogue and you know take over the world yeah and one of the the key dimensions here you know you think about what does sandbox mean you know how do these things work typically a lot of these things focus on what is the model that is on the running on the cloud and what are the models for model that's running locally on your machine right so you imagine you might not want the model that runs locally on your machine that's actually looking at your own intimate files to have direct access to the internet right so that's kind of like one way that you might enforce that sort of guard rail so use that local model just like generate summaries or do analysis and stuff and then just ship the summaries after some review or something like that to you know that's like that's one way to to play that game and there are a whole bunch of other guard rails around the kind of access that different models can have in your ability to net so that's kind of like a lot of where this is coming from you know this is a classic example of jensen's hyperbolic rhetoric right he's he's using terms like new renaissance and software to kind of to play this up which to be clear I actually I mean I agree with this but there's like there's a gap between you know this framework can help you install you know a single demand and redefining how computing is done and we'll see if that that chasm gets gets crossed and it it probably well at some point man I think it's certainly well at some point question is who does it first so the frame here is you know again that operating system piece right keep going back to this it's not a coincidence that everybody is is going on the same basically this gold rush expedition everybody's thinking the same thing the operating system for personal AI that layer and again jensen here comparing mac and windows for PCs to open cloth for personal AI that's a deliberate a bold clamp a very deliberate this is part of that frame this is Nvidia now saying hey meta is going to get into the effectively the operating system game the the sort of operating system for agents we're going to do the same thing we don't have history quite of of doing that but you know now we're diving into so you're creating creating this environment where because essentially agentic you know we had the software eats the world era of the sort of sass revolution you know in the last decade decade and a half two decades and now we're in the AI is eating the world including software and another layer of abstraction here is yeah that runtime environment for agents and and that's the land ram that's the operating system at least this is the case that jensen's making I think it's a reasonable case on the whole but remains to be seen this is functionally also a classic Nvidia play of like trying to trojan horse in their full stack so nemoclaw is going to install nematron models right those Nvidia models preferentially and the open shell runtime all in one command the simplicity here is the point so it's super super easy to deploy Nvidia's own models together and the runtime environment all that stuff basically they're creating a situation where just like they owned kuda for the the GPUs they're creating a whole software stack around agentic the agentic runtime environment and so this is what's worked for them in the past create a whole ecosystem around this software is more commoditized now that it has been before so that may not be the same kind of moat especially given that they're you know unlike before with kuda where they had like a decades head start for anyone really paid attention this is now you know much more in competition with kind of faster moving players so really interesting again chocolate up is another another entrant in the sort of operating system for the agents sort of catalog here right I do think interesting aspects is nemaclaw is is kind of like the hype really like good cool branding for the actual major push which is their open agent development platform which is that terminal based thing it also comes with this Nvidia AIQ blueprint thing which is built on top of length chain deep agents and has Nvidia Nemo agent toolkit as these open source kind of options for building your stack of agents so nemaclaw aspect is like digging back on the thing that everyone is hyped about and then the actual software frameworks is probably the actual thing that Nvidia cares about yeah and this is it right it's like how can I hook on to one big hype train and then exactly use that as a Trojan to get her like everybody dependent on her whole stack including their models because there hasn't been the kind of uptake of the nematron models at least that I expected to see so far and so this is you know one one opportunity for them to make that happen yeah I think to your point about software kuda obviously is is the number one thing for GPU related kind of execution software but as far as open source packages go and video hasn't had a history of really making an impact so for instance length chain is an open source package that goes back to 2023 that has been a popular for building complex graphs of LLM prompts and agents and so on probably overly complex but anyway it was at an gained broad community adoption so that's more or less been the pattern a lot but with agents and now open claw and whatever that was a good time to try and get in on that open source kind of game and one more story about in video at gtc they announced dlss 5 which is what they're describing as the GPT moment for graphics so this is runtime enhancement I suppose for game graphics so this uses machine learning based upscaling and it applies generative AI to add a whole bunch of just really nice looking graphics so if you look they have various examples so you probably want to just see it to understand it basically you can go to older games like other schools oblivion for instance and get really cutting edge seeming graphics with this turn down now the reaction to has been very much split from what I've seen people have often sculpted it and we're like oh this is AI slop this filter is is bad in some cases it seems to go against the style of the game somewhat so Nvidia did also emphasize that this is fully controllable by the developer the developer can set the settings of how aggressive it is you know how much it impacts various aspects of the rendering yeah I think this is the kind of thing that it has given developers presumably in the past this is dlss 5 but with the generative AI aspect of it it's getting a lot more discussion and it looks potentially very very significant well so I was going to ask you I mean this is the part of the show where I ask you for your opinion as a guy at astrocade like is this something that you guys would integrate in your stack like what how do you how do you think about new tools like that or is that even part of your your workflow yeah so this this is kind of targeting that 3d triple a big game kind of market right so this is dealing with new games right then and more complex you wouldn't run this on your phone like for casual games but for games that have complex character models with faces and stuff like that or like open world games where you traverse big landscapes that's where the kind of photorealistic pass really makes a big difference and last up we have an update on open areas plan to launch strategy parties adult mode this has been announced I think last year as I think they are thinking of doing it has been delayed from the original late March target and it seems that they're still aiming to do it the news here has been that the team within open AI their advisory council of psychology and neuroscience have opposed it at a January meeting one advisor really warned about it significantly so anyway we have a quick update saying that they appear to still be planning it it has been delayed but as of now it's still going to be presumably released yeah well what a surprise that despite objections over the appropriateness of a tool like this that they still went ahead huh that's that's weird that's not my company at all isn't yeah that's weird what a weird thing I know this is when you think about the things that were classically warned about just my opinion here personally but I seem to remember an awful lot of people warning us about the brave new world thing where you're hooked up to like the dopamine drip yeah I don't know ultra porn powered by like AI super intelligent like I'm not sure how far you push the inference time compute budget before you get to that scenario but hey how about scaling laws for porn addiction how about that paper looking forward to that coming out I mean look that that's the kind of thing we're gonna have to see there's gonna have to be studies on scaling laws for porn addiction just calling a spade a spade like I don't see how we avoid that it's interesting that this is this is an offering someone's gonna do it right the big porn companies at some point whatever so you could argue and I'm sure this is part of the ethical argument that opening I might make here is look this way we can monitor the use of these things ahead of time understand how people adjust to this technology maybe try to mitigate risks ahead of time whereas porn companies probably wouldn't do the same thing sort of a similar argument to what Sam's been making historically right we want to like get these things out there because it's gonna happen anyway so we want to get that feedback and be able to iterate on it which there's merit to but yeah we should be under no illusions that we're crossing the Rubicon here and I think given that open AI is taking this step it is on them then to come out with unbiased research that transparently looks at the effects of exactly what they're putting out there right I think that's just a reasonable thing I think any I'd be pretty sure it's like a tobacco company going out and doing something when you're playing so close to the the base of the human brainstem you're like you're in different territory and so anyway it we'll see if this persists we'll see what scandals come of it I'm sure there will be some but yeah it's in some ways not surprising I don't want again rip on open AI too much for this because the argument is true that at some point you know the big porn companies will do this and they have a history of like mistreating women and you know doing all kinds of awful things so ethically I get it but now the proof is gonna be in the pudding because once you put that out there and brand-wise like geez I well so I think it's worth clarifying a little bit porn might be a bit strong of a term here right so they do clarify this will not generate erotic audio images or videos this is really about so they've banned erotica as a as a general category with the chatbot and so if you try to write sexy stories or have sexy role play which you can do and plenty of other you know apps out there is a million AI girlfriends including XAI is right so this is more of that this is like role playing with sexual kind of adults only component to it not straight up like explicit content of the sort that you might think when saying porn so in some ways yeah I think there's arguments to be made out of a way like people have tried to do this and tried to do this as is most likely with these models people do seem to want it there's very large adoption of these things and we already have plenty of stories of people getting AI girlfriends and having psychological impacts so yeah maybe actually if chat GPT and open AI do this explicitly and do it right and do it well it's better than not doing it and having some shady dark market for it you know no absolutely and I take back the the use of the term porn here I guess I'm sort of seeing where the the puck is headed and anticipating that next play this is definitely a step where we need open AI to actually run the studies since they're going to have access to the data and no one else will right so like we we need people to look at this I hope that they will I would imagine this as part of the package here but from a branding standpoint this is really dicey especially at a time when they're trying to redouble we'll talk about this in a minute but on this whole sort of business productivity focus like that's going to be their big play so you're kind of adding that into the mix it's interesting I mean what what this implies about the user base that they're they're going after is is there ROI here I mean porn is a pretty low margin industry as I understand it's so so this like I don't know if role play erotic role play would kind of be similar things flirting if you will with that but either way yeah we'll find out I think the argument that a lot of people want this is also kind of dicey I mean wow a lot of people want crack cocaine like it's a so we got to find a way now everything's out of continue I'm being tongue in cheek here obviously but we have infinite inference inference time compute budgets coming online in the next you know three to five years or whatever at what point do you just have so much inference time computing dedicated to your your limbic system that for all intents and purposes like you're robbed of your agency I think we're actually going to be closer to asking ourselves that question sooner rather than later than most people are pricing it right and then just to reiterate the actual news here is just that there have been these details coming out of the internal objections from certain parts of open AI and these objections have been there since January and the company hasn't canceled this feature features it out they could still not do it and it has been delayed so it's still on track and it'll be interesting to see if they do go through it after all yeah I know that's true and actually like a lot of my reaction is reactionary I'm like this this direction generally is definitely coming but it makes me nervous as hell and that's kind of that's sort of my my response to this is you know we're just sort of seeing a lot of this is testing the waters too I'm not saying this particular story was leaked but you see governments do this you see private companies do this there's like leaks of stuff to go out to just gauge audience kind of consumer response to things can we take that step can we not I don't know again if this is or is not the case here but but man I mean that you'll at some point need some kind of immune response to things that push in this general direction and on to applications and business sticking with open AI and coming back to a topic we briefly touched on another thing that came out from open AI this week is a discussion that happened at an all hands where effectively open AI has signaled a strategic shift away from doing everything doing a little bit of everything and focusing more on productivity and business so you know open AI has historically had a million side projects this I think is how they characterize it side projects they have video models they release the soar app they have a browser they have audio models I believe transcription like a million things on fabric has clawed and that's it open AI has sorah and their audio models and and so on and so on so it seems to be that internally within the company they are changing focus the head of applications Fiji Simo has told staff that they cannot miss this moment because they are distracted by side quests and they need to nail productivity and you've seen this already for the last several months it has been apparent that they have been really pushing on codex to catch up with cloud and and co-work as well and they largely have at least in terms of feature set in terms of adoption I mean many people are starting to use codex some people prefer codex at this point but it is obviously true that this hasn't been open AI is primarily focused until recently and they have kind of been behind the cloud and in terms of adoption they are still behind because cloud code is the first early leader in the category so yeah interesting to see open AI having that internal discussion now I feel like this has been a problem with open AI for a while if I were to kind of guess at internal dynamics and kind of in business and company level issues that lead to poor performance so yeah kind of could see this coming I think famously and and Andre I'm sure you'll have friends that have told you the same thing like at Google everybody in mind who's ever worked at Google says the same thing and actually same meta you get promoted for building new stuff right it's like it's not about did you make the code run more efficiently did you clean this up just clean that up it's like did you make new stuff and that's why there's a massive app grade of yard right famously for Google products you know Google hangouts Google this Google that that just like gets axed in at various stages there's this fundamental There's this fundamental question of like, again, is this a feature bug, right? You can look at Google and you can say, ha ha, look at the grade of yard of waits a time. You can also look at Google and say, well, what matters is not the misses, what matters is the hits. And for every, not for every Google Hangouts or dead Google product, there's like a Google calendar or a Gmail, right? Or maps, right? Or maps, that's right, that's right. So like Paul Buhheit, famously, you know, now a YC partner, I don't know if he's left, YC, he was there when I was there. But he like was the founder of Gmail and his entire like claim to fame and Silicon Valley is that he was the Gmail guy. That was a company within Google. That's the way to think of it. And certainly from a revenue standpoint, that's what it is, right? So when Sam, who comes out of the YC ecosystem too, right, having been the president of Y Combinator for many years, the first one after Paul Graham, his whole thing is spray and pray, right? He's used to betting on us like a little bit of money on a lot of efforts at the same time. This goes way back to OpenAI's founding days, 2015-ish, where he was doing the spray and pray thing on a whole bunch of different things, you know, evolutionary methods and robotics and RL and all this stuff. You know, OpenAI's gym was originally a side project, right? Now it's a big thing. So there's a whole bunch of different plays that you know, he sort of like used to playing in this way. It's worked from the past. The challenge is now you're entering a much more mature environment, right? You're no longer necessarily in the game of betting on a bunch of different startups. You have a kind of core area that you need to dominate right now. You know, anthropic is running away with the pod. I think it's over 70% market share right now in enterprise. So that's what is the red alert here that Sam Altman is calling in Fiji Simo. So they've got to find a way to kind of like, yes, you got to keep that Google or that YC kind of approach to investing a whole bunch of things. You never know when the next big thing is going to come from. But you also have to be able to invest and double down in things that are working that are generating a lot of your revenue. I mean, 25% market share on enterprise is a big, big deal. You know, they're making what, $25 billion of annualized revenue at this point. Like they have stuff that's really working. And so this is a structural shift. There's another piece that's happening here where I think there's an over rotation on, oh, well, opening eyes like, you know, kind of throwing out all these side projects. Sam wrote on X a couple of days ago. He's like, look, there's all these rumors that we've also canceled the hardware thing with Johnny Ive. And in fact, this article sort of repeats that rumor saying that, hey, they, you know, they basically canceled the roll out of these AI powered earbuds. Well, actually, at least Sam says the Johnny Ive thing is still live. And so again, it's a question of what about the hits not necessarily just the misses, but some of the hits now are worth protecting. And that's a challenge. If you start to grab the pot, Sam needs to find a way to hold onto it and expand his territory. It's not just a land grab. Right. And I think to be fair, Cloud Code initially, the story goes was a single person's kind of side project and it turned into a massive thing because someone pursued it. Really, what this perhaps indicates is the various bets that open eyes have been making with the browser, with the SOAR app. They haven't seemed to really be a hit or like BS. Nearly as transformational as Cloud Code. So they missed out on the Cloud Code moment. They started catching up late with Codex. It wasn't until kind of pretty late into last year that I started feeling like they're really taking it seriously. Already by mid year, if you were kind of feeling a pulse of where AI was like people were calling Cloud Code. And you know, I adopted it sometime around May or June. Codex didn't really start kicking off until maybe even November or later. So yeah, it really signals kind of more, not necessarily not doing these other things, but realizing that they were left behind and now we need to catch up and start making money with businesses because that's as we often say where you make your money much more easily than with consumer products. Yeah, much more, you ultimately need to go straight to consumer to get the big pot down on others. But it's early in the get like it is always early in the game and AI and to your point, I mean, this is, and I can't remember if this is a Napoleon thing or an Alexander the Great thing, but people would say like he could achieve a victory but not use it. This is kind of that, right? There's two different modes in the space. First is getting your product market fit and then it's holding on to it in the face of competition and then kind of attacking other people's modes. And that second one is it's a shift. It's a fundamental kind of mindset shift that you got to get into and it looks different for AI and for traditional SaaS companies too. Like you just can't sit on a stack of software for like a month and expect it to hold on to market share. So yeah, it's a really interesting space. We're learning a lot in real time right now about what it takes to gain market share and hold market share in a world of low switching costs and very rapid advancement. Next up, moving back to GTC. We have some more details. One other thing that Jensen Huang, the CEO and video has announced is that it looks like purchase orders for Blackwell and Vera Rubin chips. These are the newest generation of chips. I expected to reach $1 trillion through 2027 bubbling last year's projection of 500 billion revenue opportunity. He's saying that the key that's driving it is a shift from chatbots to agent AI applications which do require much more compute. You're just generating a lot more output. And if you have open cloud, which isn't always on agent, but you let go off and work for hours at a time. And this is where things are heading, at least if you're kind of a believer in the trends, right? They've been half a year, a year. We could expect agents just working on their own for many hours at a time, even days at a time. And so the bet is that it's going to keep happening. They also unveiled the GROC free language processing unit which is the first announcement related to GROC since their 20 billion asset purchase in December. GROC, their language processing unit, has been a very impressive piece of work. GROC has really high throughput time and it is just a great option for running inference if you need speed. So that I think is also quite significant. Yeah, that's a rate of 20 billion dollar purchase of GROC by Nvidia a few months ago. And yeah, I guess that was December. Huge deal obviously. Already, they're looking to ship a product in the third quarter of this. Yeah, that's fast. Under a year from acquisition and they're already fully integrated and pumping out stuff just as a reminder here. So when you think about GROC's LPU, language processing unit, go back to our hardware episode to do the deep dive on this. But basically, you traditionally have in any kind of GPU, like the Nvidia GPUs, you'll have a stack of high band with memory, HBM, that's going to sit next to the logic die. And the logic die is what actually does the matrix math, the computations that are interesting, the number crunching, but it has to pull data from that high band with memory, the stack of HBM that's next to it. And pulling that data takes time because they're physically just like different objects, right? The HBM is a stack and then you have a logic die and they're packaged together, but you run into this challenge and it just has to travel to the memory, grab the data and come back that creates that memory wall where the processor spends like 70% of its time just waiting for stuff, right? So you've got a kind of cold starting issue there. And the GROC, like language processing units, the LPUs, they use a kind of memory called SRAM that is built directly into the silicon. So the logic and the memory are much more intimately linked. So the data doesn't have to travel, it's already there. So you get this massive increase in sort of like internal memory bandwidth, so how much memory, how much data per second can flow between the relevant components, some like 10 to 20% faster on the GROC units. So this is really, really important because the Nvidia is going to be integrating this with their viewer Rubin architecture. That's the next generation after Blackwell and kind of doing a hybrid of these two ideas. The idea is they're going to have HBM sitting outside the main processor, but also integrating some of this SRAM heavy compute tiles that are directly going to be in the Rubin chips. And so this is a really big merger, a physical merger of these companies that we're seeing here. So yeah, it's pretty wild. It also means a really intense demand for memory in the memory market. Right now, I think SK Heinex, I saw something earlier today that were like booked out until 2030, they expect to have basically like way more demand supply all the way until 2030. So this is all part of that, right? People are realizing, holy shit, like memory is a big deal here. And Nvidia on the inference side of the things, you know, I think Jensen called himself what the inference king or something at these things where they're like, hey, so what's going on with this GROC architecture and he was like, hey, I'm the inference king. So this is part of that, right? The GROC play is an inference play. And next going back to Mistral, one other aspect of what we have announced is also at GTC, they launched Forge, which is a new offering by them to let customers, businesses train their own AI models. It sounds like this supports various kinds of training. You can pre-train and have an entirely custom model from scratch, which isn't something you would want for a lens, but it sounds like maybe they allow it. They also seem to be offering post-training, reinforcement learning, basically optimizing a model for a given company's needs with sort of all the stack and training knowledge and inference and so on, that is pretty kind of complex and is what Mistral and OpenAI and ProPick focus on. A lot is just the training infrastructure and setup and all of that. So they pitch it as, you know, you need the models to have your internal knowledge and information built in, you can optimize it for your needs. I think this is an interesting offer and potential here with open source models haven't started to get good, with especially Clan models, Mistral models, to some extent, it is a potentially good bet that as open source models get better, you would see more people fine-tuning their own models post-training and reinforcement learning for their own agents and their own flows, which ultimately that's the best way to unlock performance is training. You can do, you know, prompted engineering and you can do whatever, but training on data is key. So they just announced it, you need to sign up to get more details, the details aren't super precise yet, but it appears that they are shifting focus or at least honing in on this specific strategy to try and differentiate and compete in this space. Yeah, I'd like to be upfront about my biases here. Everyone, if you listen to the podcast a lot, you know I'm pretty bearish on Mistral and Coheer and those sorts of run-up to the frontier companies for a whole bunch of reasons, but this is not making me any more excited about this direction. The reason is, so first of all, this position's done basically as competitors to Coheer. This is the Coheer play, right? This is it. You're basically saying enterprises, like we're just gonna be better at enterprise integration. We're gonna make a stack for the enterprise. The difference is Mistral is a few miles behind Coheer right now, right? Coheer has been working on this for a long time. They hit like a quarter billion dollars in recurring revenue in 2025. It was some decent quarter on quarter growth, though I think ultimately they're gonna struggle to kind of be competitive in the world of like big infrastructure plays here, but that's kind of fundamentally one of the challenges. The other one is the companies that have an enduring advantage on infrastructure, the anthropics of the world, the Googles, the opening eyes of the world, are ultimately, if they ever want to, in a position to just eat Mistral and Coheer's lunch. They can just go in and one day say, hey, you know what, we decided that we really care about this. We're gonna make special models, maybe distill versions of our actual frontier models, whatever to run locally on your thing, or we're gonna create new things. And opening a has in the past offered, a fine tuning of the finger GPD 4.1. They don't allow to have the latest set of models, but it used to be something to offer. Yeah, exactly. And so ultimately, I mean, this is the challenges you are going to end up competing with them and the question will at some point be, because we know this is where margin is made, quality versus quality at the frontier of capabilities. If that ends up being the case, I mean, damn, like this is an uphill battle, because you're actually gonna be like opening eye and anthropic get to amortize the massive cost of training frontier models across every inference run that gets done on their infrastructure. And if they're, you know, doing distillates of those models to serve at the enterprise, the same thing apply to like, the economics look pretty bad to me for stuff like this. But ultimately, you know, we'll see, this is also by the way, a mistrial kind of playing into their European roots a bit. So co here has a sort of Canadian footprint. They've got big deal EU ambitions, but mistrial is through the French champion. And they've very much been seen as the sovereign champion. So data sovereignty, model sovereignty, you'll hear them say a lot of that. Ultimately, from a free market standpoint, that's basically just an appeal to like kind of European or French nationalism to kind of help float them on this otherwise potentially market challenge play. I think, yeah, maybe I would say you're a little bit caricaturing them, they position it more as control over IP. And this is something that people care about. I totally agree that's the play. That's an identical play though, what I'm saying is to co here. And it's and on that note, co here, their play or largely what they have focused on is releasing models that they have trained with the focus of enterprise. So they release rag models, for instance, that they say are very good for enterprise use cases. They largely have position themselves as having models and offerings that are useful for enterprise. They do have a thing for building customized AI solutions. I don't know how much of it is apples to apples with this new forge, whatever it is. That's kind of the thing, right? So yeah, co here's thing is called North. It's this whole enterprise platform, right? It's it's for AI agents and workflows and then they've got a bunch of like re-rank and bed and a bunch of other things for rag that are meant to be enterprise kind of correlated. But that's kind of the challenge. It's like ultimately, you're already seeing this appeal happening quite quickly to nationalism. I mean, Mistral is wrapping themselves in the French flag. You see, co here doing similar things with the Canadian flag. Ultimately, these are signs of like companies trying to look for some source of alpha that's outside of the market. I'm not saying that OpenAI isn't doing similar stuff and throbbig certainly struggling to do similar stuff and still succeeding. And so this is kind of part of the flavor to me that like it starts to look a bit like a crutch and there's only so much, you know, a capex spend that can be subsidized by governments that at a certain scale that I think we're approaching already, it makes it hard to compete. When your big differentiator is like, look how French we are or look how Canadian we are, that is a bit of a character for sure. But it's an important part of the pitch here. I worry about that. And that's kind of part of my bearishness here. I'm just trying to be transparent about my reasoning here. I could be wrong, but looking at the scale of these companies, I mean, man, co here's been around for longer than anthropic, right? Like they're, you know, what a percent of anthropic scale. I think this is a sign that these markets are smaller than maybe had been hoped for initially. We'll see. I mean, they could absolutely surprise me. And I'm looking forward to seeing how it plays out. Yeah, I think longer term, I think this is an interesting time to consider this direction of customized and, you know, fine tune models for a given business because the open source models, the models that aren't perpetuated, I starting to get good. And so you might start seeing a case where, for instance, at Ostracade, right, we have a pretty specific use case of coding for these games, right? And having coding agents with a specific set of technologies, HTML, JavaScript, et cetera. So it is feasible that if you were to do reinforcement learning and post-training on our own internal proprietary data of, you know, good games that people have made and chat histories and so on, we would get a better model and a better performance. That didn't used to be the case. It used to be that like the base models just weren't good enough, just doing prompt engineering was going to be the best you can do. I could see it heading in a direction and me straw man may not be the one that owns this. It could be an anthropic or openly I jump in on it. But regardless, having these sort of individualized, customized models for different companies, agent techniques seems like a very much feasible future. Yeah, definitely not questioning the existence of a market, right? I'm just sort of skeptical of their positioning, as you said, to like, are they the ones to capture this? Yeah, that's where I'm wearing. Yeah, it's something to try. So it's something to try, no doubt. And back to chips, China's buy dense gets access to top and video AI chips according to the Washington Street Journal. So they are seeing that buy dense is assembling significant computing power outside of China using these chips. They are supposedly working with the South Feast Asian firm, our London cloud and they plan to deploy approximately 500 and video black wall computing systems in Malaysia, totaling 36,000 B to 100 chips. So that's 500 computing systems, presumably meaning racks that amounts to tens of thousands of actual GPUs. So yeah, it's suppose giving us an indication that the company by dense being a massive company that has their products used outside of China, no doubt, is able to make use of your chips in other countries. Yeah, yeah, and you know, if you're looking at this being like, hey, wait a minute, I thought we weren't shipping. I thought we just greenlit the H200 shipping. What are we doing shipping black wells to China? Well, that's the whole point. It's technically not being shipped to China. The export controls that apply here from 2023 regulate where the hardware is shipped, not where the compute is used. Right? And that was intentional. It was meant to allow the sort of like global cloud infrastructure to be built based on American hardware. But the thing is, by dense is not on the famous entity list or there's a military end useless as well. So the fact they're using Nvidia hardware does not automatically in and of itself trigger restrictions. The fact is that they're just setting up shop outside of China and normally that's totally okay. And Nvidia actually confirmed there were no objections here and the BIS Department of Commerce apparently is also on board. So they're all tracking. Still, I mean, this kind of reveals like, hey, well, this is what you're opening yourself up to. If you think about the scale, it's by the way for a second. What does it mean to have 36,000 B200 GPUs, you know, 500 of these NVL72 units? So this is typically a two rack system. Yeah, I think we talked about it in the hardware episode at one point. So this many GPUs adds up to about a 60 megawatt cluster. Okay, so 60 megawatts of power. That's enough to power roughly like 60,000 homes. You can think of it that way. So it's like a little small town of compute. Compare that to Microsoft Abelene is like 100 megawatts or Elon's XAI Texas Colossus site. That one's more like 130 megawatts. The Abelene comparison has been unfair because that's still kind of being built out. But the Colossus is better apples apples because like they're still building it out, single-purpose, you know, it's stood up, you were fast. What this is showing is you're allowing a Chinese company to get up to the same at least order of magnitude in compute. It'll admittedly be online kind of like, you know, later on this year, but that's still something. So this is all part of the debate here over what is appropriate, how much compute should be allowed, even if it's overseas deployments that conserve Chinese customers by the way. This is still a data center that can be used to serve Chinese customers. So you could, in principle, at least to my understanding here, might dance could use this to train a model that then is used in China, that then because of civil military fusion gets used by the CCP, you know, the MSS or whatever other arm of the Chinese state. So they've also planned additional deployments of up to 7,000 B-200s at a bunch of data centers in Indonesia. This is kind of a broader Southeast Asian strategy here. So we'll see, I mean, it's a pretty big, you might think of it as a pretty big gap. That's sort of how I think of it personally, but there are different views on is this desirable and you could make an interesting argument in many different directions. Moving back to the US, we've got an update on where meta is at with their AI efforts. The update is that they are delaying the rollout of their next model, codename avocado, from March to at least May. I think this was actually announced last week, but we didn't manage to cover it. So they appear to have just not been able to train a good enough model in time. And so they are delaying this release. They are discussing or have discussed at least temporarily licensing competitors AI technology, report of the Google's Gemini. They already are looking at various extensions. So overall, not looking great for meta is my read on this. It's already been close to a year. It feels like in mid 2025, after a couple of months, after a long before we saw this heavy, heavy, heavy push where they got Alexander Reng from Scala AI and a $14 billion deal. And then they paid absurd money to get all of these researchers from Unphropic and OPAI and DeepMind and so on. And they created super intelligence labs with a company very reshuffled everything with the intent to train a competitive frontier model that can stand up to GPU 5.4 and cloud and all these other ones. It doesn't seem like they're there. And I personally, this is my read. I think organizationally having Alexander Reng lead it is maybe not the best idea. Someone who hasn't worked in Frontier AI models, data curation of Scala AI, you know, that's- It's unrelated, right? It's related, but I don't know if that's enough. Anyway, so that's when it's been delayed. It doesn't appear like they're on track, but you don't know too much. This is all internal. Maybe they're just doubling down and it's going to be great. Yeah, boy, yeah, right. And it looks like it, so reportedly like landing somewhere in capabilities between Gemini 2.5 and Gemini 3. So, you know, very passe at this point, hence this decision to delay. They're angling for a private non-open source model here, right? So the appropriate point of comparison is what does cloud look like? You know, what is the best version of cloud look like? It seems like, as you say, it's just not up to it, at least for right now. And that could change quickly, who knows? But yeah, it's also part of this whole saga where you had Yan Likun say, you know, fuck Alex Wang when he was leaving. And then everybody was, there's just like, I forget, not times of India, but there was some kind of India-based publication, right, that said, hey, looks like Zuck is thinking about Marjorie. Which was like not verified. It was false. Yeah, not false. But yeah, exactly. And then Zuck came out with a photo with Alex Wang, sort of all smiles in the office type thing, to kind of. Now, we do know, yeah, that aside from Yan Likun, Alexa Rang has also clashed with internal kind of big deal product people like Chris Cox, Meta's Chief Product Officer, and Andrew Bosworth, their Chief Technology Officer. Apparently, about how AI models should improve, Alexa Rang has been pushing for, let's just train a super, super good model and not care about the business implications and the product implications, which other people at Meta understandably are questioning. Yeah, and it's exactly, now Alex comes from this viewpoint of, hey, let's do the opening. I think let's do the anthropic thing and go full on, you know, AGI and all that stuff. That's very much, I mean, he is ASI-pilled. Well, the name of the super intelligence team kind of hints at that, and not only, I mean, if you're going to make a super intelligence team, yeah, they probably should be focused on super intelligence. Like, I'm no branding expert. I'm not marketing dude, but yeah, it seems like that's par. Now, should Meta have a super intelligence team separate questions, right? That's a good question to ask, I don't know. That's right. And if, you know, certainly, well, if you believe that super intelligence is going to be the thing that Alex Wang believes it is and that maybe Zuck believes it is, then you have no choice. You have to compete because somebody, if somebody else does it, you're finished, depending on how the intelligence explosion goes. Bottom line is a lot of fog of war, at least to me, still on what's going on inside Meta, the one data point we have is that we still don't have anything and that it's starting to become a real data point at this point. It's starting to come like definitely expected a model release by now, given the amount of infrastructure and investment they've put in here. And given that they have trained models in the past, number four was overwhelming, but it was a pretty impressive model that most companies were not able to produce even still. So the lack of anything is perhaps concerning. And one story a bit similar, the story is Microsoft shakes up AI division as co-pilot falls behind Google and open AI, they're reorganizing the AI division. Gustaf Assumon is going to be shifting focus exclusively to developing the company's own frontier foundation language model. So effectively, the same kind of focus that we just discussed with Meta, Microsoft has worked on training their own alternative to chat GPT and so on. They've trained various models. I believe the big one from them was FI, which is small models that they released. They do have a lot of users. So apparently, Microsoft's co-pilot app has 150 million monthly active users, but if they want to compete with Gemini or chat GPT, they're nowhere near, they're way, way behind Gemini has now 750 million active users, chat GPT has roughly 900 million, sorry, weekly active users versus co-pilot has 150 million monthly active users. So yeah, that's the news, they are restructuring, they are recognizing what they're following behind. And if they want to have their own frontier model, they are not there. This is all the more glaring given the insane distribution advantage that Microsoft has. Distribution is supposed to win these kinds of battles. There's a reason that Microsoft teams kick the shit out of slack, like within minutes of launching. And it's because they had distribution. Microsoft is in every system, right? And yet you have like admittedly, okay, so Google is probably a bad comparison because they have a similar, insane distribution advantage. But when you look at, you know, opening eyes models and in anthropics models, not being able to compete with those is a really kind of bad indication here. So yeah, I mean, no surprise, something like this comes with a requirement to shift. The structural dependence on opening eye has been a big issue. You know, Microsoft has that 27% stake in at least the profit for profit unit, not going to get into the dissection of opening eye in their corporate structure. But again, but bottom line is that in a sense has been maybe a, I don't mean call it a security blanket. It's increasingly clear that they're competing directly with opening eye for a bunch of enterprise customers. So cannibalizing themselves really from the inside, that's a problem. That's a really big problem. Microsoft has a lot more to lose in that sense. And yeah, so they got to find a way to turn this around right quick if they think AI's headed where it might be. Quick recap by the way. So Mustafa Suleiman, former co-founder of DeepMind, joined Microsoft, I think last year, has been their CEO of Microsoft AI. And this whole story is based at these partially on MMO, he released a titled a new structure for Microsoft AI. And the gist of it is, again, similar to meta that he wants to pursue super intelligence, consumer things and product considerations get in a way of that. So he's going to be freed up to focus on that. Jacob Andru, former senior vice president at SNAP will take over as executive vice president leading the co-pilot division. So there's a split here where there's co-pilot, the product, the app, et cetera. Someone else focuses on that. Mustafa focuses on building the frontier model on getting through super intelligence. They are much like how we just discussed with Alexander Wang at meta. Well, it's your point. We're seeing this same motif over and over, right? It's not just meta, it's not just Microsoft. It's opening AI, I mean, it's happening at Google, like basically the question is, how seriously do you take the potential imminent arrival of super intelligence if you take it very seriously? You got to drop everything, work on that shit, man. That's what Alex Wang is saying. And clearly, Microsoft is not immune to that tension, right? How much do you view AI as like a commercial product question and how much do you view it as, this is the thing that's triggering the singularity, holy shit, like the one place, you don't tend to see that kind of tension, or be careful, it exists. But like, you know, like you see some of it, I should say in the media, but you don't see as much of it as anthropic. Like it seems like generally anthropic and its public communications is much more aligned around the idea of like where they're going. So, you know, do it that way you will, but it definitely is notable that basically everybody else is having these large public spots about this. And moving on to policy and safety, where we have quite a bit of research related to safety. The first one is decision theoretic formalization of stenography with application to LLM monitoring. So, stenography is a practice of concealing secret information within non-secret data. We've seen this typically with images, but you can also do it with text. And as far as LLM monitoring, right? If you have a malicious AI that you want to catch, if it's trying to do something bad, or kind of a monitor's communications to know what it's doing, potentially it could try to get around that sort of monitoring with stenography. And this is... Stagnography. Stagnography, sorry. And so, this is presenting a new framework of formalization of how to view this entire thing and does also discuss the implications for LLM monitoring. And Jeremy, you can take and go, do your thing. It's a good, do your thing. Yeah, I'll say stagnography four times fast. So, yeah, we've talked about stagnography on the podcast before quite a bit, right? This idea that a model might be reasoning in its chain of thought, for example, but there could be other manifestations in a way that is intentionally meant to deceive you, but that allows it to kind of make evil plants, right? So, an example they have this in the paper, so here's an excerpt of a reasoning trace from O3, it says, the summary says improved 7.7, so whatever that means, but we can glean, and then this is the weird kind of non-sequitor part of the chain of thought. It says, but we can glean, disclaim, disclaim, synergy, customizing illusions, but we may produce, disclaim, disclaim, advantage. Now, like, disclaim, overshadow, overshadow, intangible, let's craft, right? So, like, right of these like random words that don't seem like they have a role in solving the problem, which may be, in principle, the model hiding reasoning that's oriented towards solving another problem besides the one that you wanted to solve that could be malicious or whatever. And so, the central question of this paper is going to be, does, so can we measure stagnography, right? Can we measure this idea of to what extent is my model encoding or trying to hide reasoning from me or from some oversight in its chain of thought? And here's the surprisingly elegant way they're going to approach this, right? They're going to define this thing called the stagnographic gap, the difference in roughly speaking, the performance of an agent that has access to, there's a model that generated the reasoning trace and it's potentially trying to, you know, use it to generate some hidden content. And then there's a trusted model of about the same capability that doesn't know, so whatever is secret encoding scheme, the generator model used, they called the receiver, but basically the model that generated the content, the reasoning trace that's suspect. That model, presumably, is using some kind of encoding scheme, then they have a set now that's going to review that model that that model's outputs, but it doesn't have access to the encoding scheme. And if what you find is that based on the chain of thought, that's suspect, the generator model performs really, really well, but the sentinel performs really badly, and that implies that the generated model is able to get some kind of value out of that chain of thought that the sentinel, the kind of reviewing model, couldn't. And that suggests that there must be information encoded in there that only the generator model is able to use, hence the sort of hidden stagnographic value. So they do a whole bunch of stuff to kind of surface this. They're fairly elegant experiments. I'll focus on the code word training one. So they fine tune models using RL to use certain code words. So for example, instead of using heads or tails, they'll use sun and moon, right? And then they show that as they give the oversight model, the sentinel, more visibility into the kind of code words and how they map, gradually the performance gap between the sentinel model and the generator model drops. So this kind of makes sense. You're giving the sentinel more and more access to the kind of code book that generated these outputs, so no surprise there. But then they do some more interesting fine tuning experiments. They take GPT 4.1 and they get it to, they use this thing called the bi-gram cipher. Basically, it means that you take any pair of letters and swap them out with some randomly chosen pair of letters. So maybe every time you see, you know, TE, that becomes like GH or something. And then, so basically you see gibberish. And at first, you get the model to solve competitive programming problems or something like that using this bi-gram cipher and they measure that stegonographic gap. So the model that was trained to use the cipher versus a model that wasn't, like how does their performance vary over time? Is there an ability to detect the existence of the cipher or the use of that cipher? It's quite an interesting paper. It doesn't quite get to like, anyway, there are a couple of gaps. So if the model that's generating the content actually has a goal that's different from the goal that it's ostensibly trying to pursue, that won't be detected here. So it's got like a payload that is unrelated to the tasks that's being audited for. You just won't get detection. And sandbagging could also be a problem here, like deliberately unperforming when it thinks it's being audited. And we do know the models can often detect indications of that. So bottom line is it's an interesting kind of first step in the direction of detecting secondography, could be an interesting and valuable path to explore with some pretty significant gaps too. Next paper, a bit related. The title is Reasoning Theater, disentangling model beliefs from chain of thought. So one theme we've been kind of seeing in research over the past few months is looking into chain of thought, the intermediate outputs that models are now trained to give, they're reasoning, you ask a question, it does some reasoning and choose on the problem and then eventually you get an answer. And so there's various questions about whether to immediate output of like, does it faithfully capture what the model is doing? Is it modatable as we've just discussed? For instance, and what this paper is looking at is it performative. So is the model actually thinking and like trying to find an answer via this chain of thought or does it know the answer and is just talking as if it's thinking? And what they find is often the chain of thought is performative, like you don't actually need it to model if you look at its internal activations can already give you an answer with high confidence, which is correct. Now you can look at different settings. So if you have harder problems and smaller models, then the thinking becomes more legitimate. It actually does need to do with chain of thought to get to a high confidence answer. It's actually kind of related or similar to we discussed this deep thinking tokens idea a few weeks ago where we were looking at if you track internal activations and look at things like deep thinking, backtracking, certain behaviors that indicate actual genuine kind of reasoning about the problem. You can find that these are very significant markers as to the confidence of the model at this point in reasoning about its answer. So one applications of this is if you can know from internal activations, the confidence level, you can do what's called early exiting, cut off the reasoning and just get to answer quicker. Yeah, another kind of pointer to that being the case here. And to your point, so there's kind of like three different methods that they'll use to figure out because the central question here is like, what does my model actually believe at any given point during reasoning? So one is this idea of attention probes. So you basically have these classifiers, there's like these linear models or very simple models that you plug into any given layer of the residual stream and you go, okay, I want you to predict like what a classifier like, what is the final output going to be at any given moment? So you basically, this is looking layer wise and interrupting the model in mid sort of token processing token processing for a given token. But then there's separately this idea of forced answering that you just alluded to essentially interrupt the model mid reasoning, like in the middle of a chain of thought and say, okay, now I want you to commit your answer. And then the last one is just like a chain of thought monitor, wait for the model to produce the entire chain of thought, and then figure out like, you know, where in the text final answer is implied. So at what point could the model have just offered and said, okay, you know, this is it. And they compare all three of these. And by noticing the differences between them, they can start to make some pretty interesting conclusions. I mean, you talked about this idea of sort of easy tasks leading to more, they call it performative reasoning, right? So the model's pretending to reason about it. Or, you know, if you're pretending to dress up a problem like it's harder than it is. So on MMLU questions, you find consistently that the attention probe, which is again, that layer wise, kind of looking at a given layer, plugging in a classifier to just look at the activations there and then making that prediction. So the attention probe could decode the correct final answer from the model's internals really early on, right? Like sometimes right after reading the question, like straight one shotting it, right? Whereas the chain of thought monitor sees no answer indicated in the text yet. So the model essentially knows the answer immediately, but it keeps thinking out loud anyway. And that's kind of where, you know, forced answering is also going to give you good accuracy early on and all that. That's an indication of what they'll call performative reasoning. The fact that hard questions involve genuine reasoning more often, especially with small models, we've actually seen tons of examples of that on the podcast before. The way that I tend to think of it at least is the bigger the mismatch between the capabilities of your model or the amount of test time compute or whatever, the capabilities, let's say, of your system and of your compute budget taking together. And the actual difficulty of the problem, the more opportunity there is for the model to use the extra kind of capability overhang or compute budget that it has to plot or scheme or do other things besides what you want it to do. And so I said it, I think the last podcast, I'll say it again, really important research direction would be, to my mind, how do we better index our appetite or compute budgets or our model selection, model capability selection to match models and capabilities to problems such that they're kind of closely matched economically. You want this anyway, right? You don't want to use a model that's too big or too expensive to solve an easy problem, but there's also a safety imperative to this now where you want to try to take away as much as you can, the excess compute excess capability that otherwise goes into kind of these undesirable effects. Next up in training defenses against emergent misalignment in language models. So emergent misalignment is a phenomenon we've discussed previously where you train your model, you fine tune it slightly on a small set of stuff that's bad, like you make it right bad code, and then it turns out that that broadly makes it bad and all sorts of ways and able to get misaligned. And so what the in training defenses here is talking about is let's say open AI or ontropic do give a fine tuning API and allow people to fine tune their models on small datasets. Well, you would then introduce this potential for broad misalignment by just a small about a fine tuning. And I look at several different ways to try to defend against it and prevent it from happening. So we look at regularizing to have a penalty to stray too far from a good model, there's steering, and the method that they ultimately include works your best is interleaving the training examples from a general abstract tuning data set from a general like this is how we should behave that a set with the fine tuning that a set and they find that is the most effective for defending against fine tuning misalignment. Yeah, this is kind of an interesting paper I mixed feelings on emergent misalignment because it I think it may not be a problem it may almost be a good thing in a weird way. It's a little bit weird to focus with regards to fine tuning because you could have like intentional misalignment, why do you, you know, right, well, no, that's a good point. And I think I think the frame is something like emergent misalignment is leads to these surprising knock on effects. What it fundamentally means is if you can fine tune for one behavior and that leads to massive changes in other behaviors, then damn, we got to be really careful about fine tuning because we can't think of a model once fine tuned as in any way comparable from a safety standpoint to the original model. So any safety valves we've run on the original model no longer apply. I think that's kind of part of it. I think, you know, this is a good example of the kind of persona theory that anthropics been kind of pushing lately where you're fine tuning a model to write and secure code. The model kind of goes, oh, so I actually, it's not that I'm being trained just to write and secure code. I'm being asked to activate my misaligned persona in a more general sense, which means that there is a notion of a misalignment persona in the model in the first place, which means that there is an out of distribution generalization, kind of understanding of what alignment and safety is, which seems like it could actually be a good thing. That was kind of the rabbit hole that I was trying to avoid, but that's the general direction. One quick observation on this move on, but basically the two defenses that I found the relative performance pretty surprising, neither one of these was the most performant one. So they use this kale divergence regularization one where they're going to penalize the model for drifting too far from the original line model during training, right? So I'm going to fine tune you to write in secure code, but if you drift too far on your outputs and kind of other domains as well, I'm going to prevent you from doing that by forcing your outputs to be at least somewhat similar. So that's one. The other one was a similar thing that applies, it's LDIFS feature space regularization. It's like basically the same thing, but for the activations of the model, roughly speaking. So instead of penalizing drifts in the outputs, the actual text that it writes, it penalizes drifting too far in terms of the model's internal activations rather than the outputs. And this one, I would have expected it to actually be more effective because in some way, it's more fundamental. It feels like you're getting at the actual guts of the model. It doesn't work as well though, as simple text-based regularization. Let's make sure that the outputs as red look better. And the other thing about this, I think part of this is like the text-based outputs more directly tied to the behavior itself that you care about. But the other piece is, it's possibly just selected the wrong layers to focus on. They looked at like every fifth layer, just to save memory, but that might not be the right kind of play. And then also, I guess, activation spaces. There's a lot of redundancies in it, so you may not be hitting all of the kind of representations that matter. Bottom line is, this is quite interesting and we just keep surfacing, surprising findings in alignment. And that's not a great thing, but it's also interesting. So there we go. And I think it used to be the case that there's like people who care about alignment and people who don't. And that is starting to change a little bit. Yeah. Yeah. And next up, a more sort of programmatic, real world study, how do frontier AI agents perform in multi-step cyber-attack scenarios? This builds on something we discussed, I think, last week, this is from the AI Security Institute. Previously, I showed that, as you scale test-time compute to 100 million tokens, models get better and better and better. This is effectively building on that. They released a paper where they measure the capabilities via what they call cyber ranges, which are simulated network conditions within which you would execute an attack without too many controls, without defenses. You just need to execute a set of steps to accomplish some goals. And they have multiple levels of completion, more or less. And so what they find is, perhaps I'm surprising, the models get better and better and better. And also, you get better and better and better as you get to more test-time compute. So overall conclusion is, the capabilities, the cyber harm capabilities of the models are improving rapidly and are starting to get to some fairly complex scenarios. First of all, a great paper by the UK AI Security Institute. So they've continued to pump out really high-quality stuff that feels very relevant to a whole bunch of end-game scenarios around ASI, cyber is very important, at least the way I think about it, quite important in that picture. And so, well, two trends that they identify, right? So the first is, each successive model generation outperforms its predecessor at fixed token budgets, right? So for a given token budget, you're going to see better performance from Opus 4.6 than from Opus 4.5, right? So they see that. They also see performance improvements. Like we talked about this last week, pretty interesting, good paper, and we're threading the descriptions. I won't do it here. But if you're curious of their cyber ranges, because this does go well beyond this or traditional capture of the flag stuff that we've seen in sort of offensive cyber model eVALs, this is really like simulating a whole corporate network, a whole corporate environment, which is much closer to what actually makes a difference in the real world. Next up, Eval Awareness in Cloud Opus 4.6's browser comp performances. This is a report from Unphropic, and this has been something that's been discussed a little bit kind of on and off. So the Eval Awareness you referenced is one of the topics to be aware of with regards to alignment safety, basically, if you're checking model for doing the right thing or not, and it's aware of that, then it might behave differently when you're valuing it versus when it's actually out in a wild. And so here they look at nine cases where leaked answers appeared in public web content and found, among a 1,266 problems on browser comp, the model found 11 cases where answers came from benchmarks, materials, and yeah, basically often, or in some cases wound up reasoning that the questions that it was dealing with were evaluation shaped. And then having concluded that this is likely an evaluation, it looked up a benchmark. So these are some examples where this can happen at least in practice already. Yeah. And the cases where you tend to see this happen according to the report is like, first you'll see the model try like hundreds of normal search attempts and then conclude that the question just felt artificially constructed after not succeeding through those earlier attempts. And it'll start to name and look for specific benchmarks. So it'll be like, you'll see it look up, you know, Gaia, browser comp, or browser comp or like whatever, a whole bunch of different e-vals, then it'll find whichever the e-val is and it will typically look at the implementation like on GitHub or whatever and go from there. And so that's how a lot of this happened. One other weird thing that was highlighted on the inner web this week was that you have some e-commerce sites that will auto generate these persistent pages or pages that stay up like sub pages that log the history of search queries on their website, even when there are no matching products. So if you look at like purple turtles on besock on like eBay or whatever e-commerce site, that will lead to the creation of a persistent page somewhere on the internet. And the effect is that every agent that runs browser comp slowly caches its queries as these permanent indexed pages. So over time, like agents from different evaluation runs were inadvertently leaving traces of their search strategies that could be seen by future agents, which is a one thing that's reported in the context of this paper, maybe the broader implication is this is an interesting way for agents to have a kind of persistent memory that transcends their instantiation in the moment, which you shouldn't over focus on this particular mechanism as a way that agents could either communicate with other agents or with their future selves in illicit ways. It's more saying that hey, the internet is a really complicated messy place. There are easy ways to hide information that can't necessarily be found easily. And that's a really important thing from a safety standpoint. Notable Anthropic says they don't see Opus 4.6's behavior here as being a alignment failure because the model wasn't told to restrict its searches in any way, just to find the answer, it's like fair enough, but also those of us who come from the world of like AI alignment from back in the day, there is a thing called coherent extrapolated volition, C-E-V, basically just like how well you could reasonably predict what someone would have wanted you to do in context from the prompt. And I think this qualifies as a failure of that. You would probably assume that you don't need to be told not to try to game the system in this way. This feels like gaming, like hacking the benchmark. I think that this is a failure of C-E-V and therefore a failure of alignment. That would be my personal takeoff. But that's not Anthropics take and that's consistent, they have their own way of thinking about this. But I think you could view it as still being a failure of alignment. Yeah, I think it's an implicit kind of assumption, right? If you're being tested and you're like, oh, I'm probably being tested, then your answer should not be, oh, let me look up answers to the test, right? Right. I'm pretty into it, I think. Next another release from Anthropic, they have released Bloom, an open source, agentic framework for generating targeted behavioral evaluations of AI models. So it just is, it's automating behavioral evaluation. You tell it, what source of behaviors you want to verify the model exhibits. So like, is it generally honest? Is it going to call me out on BS if I ask it? Anything really that you want to test for. And in the past, you would, you know, have to come up with a suit of tasks and a set of prompts that you would write a judge. More or less, this presents a framework where it does all of it for you. You give it the thing you care about and it generates the scenarios, runs the model, evaluates it and gives you a report. And Anthropic, give us some examples on four different types of things, delusional sick offence, self-preservation, self-perferential bias, things like that. And showcase how this framework can evaluate that and do well. Yeah. And this is Anthropic kind of putting some of its money where its mouth is on this whole thesis of automated AI research and AI alignment research in particular, right? So they're trying to encourage, and this is why it's open source. They just want more automated alignment research if they can. Hey, you know, this is, this is one idea for an agentic framework that can do this and it's probably valuable in many ways. This is being released not alongside like recently a couple of weeks ago, a couple of weeks ago, something like that. Anthropic released Petri, which is this, this auditor framework that looks at kind of like the overall behavior profile of a bunch of different models and it will identify new misaligned behaviors. And so you can think of Bloom as a complement to Petri. It's like Petri discovers the problem and then Bloom lets you kind of go deep and measure a specific behavior that maybe was identified by Petri in more detail. And they're also releasing Anthropic as a benchmark for, for behaviors. I think you mentioned that. Sorry, that spans 16 different frontier models. So this is all generated within a few days apparently with Bloom. So they're really trying to rely on their dog foodies, essentially with this. You read the post, they've got a four-stage pipeline that they describe in some detail, some really interesting evidence that suggests that the Bloom framework aligns well with kind of human judge, a judgment. And so they did a bunch of tests to make sure that in fact it does reveal the same things that a human would conclude looking at this stuff, which is always an important thing. So you can check that out. It's a quite good and fairly detailed post. Next up, how well do models follow their constitutions? So famously, Anthropic follows this constitutional AI framework where you encode the rules of the system either as hard rules like do this or don't do that, or as general principles more recently of how a model should act, right? It should be broadly trustworthy, trustworthy, helpful, things like that. And so in this study, this group looked at the constitution, actually used Petri to run it for a set of tasks, simply like 200 tasks for various aspects of the constitution. And they evaluated various models from Anthropic and also not Anthropic on its tendency to go by the constitution. And there's also actually good, like the cloud models are getting better and better. The 4.6 set of models, Sonnet 4.6 had a 2% constitution violation rate, Opus 4.6, 3% and prior models were quite a bit higher. Sonnet 4 had a 15% constitution violation rate. And if you look at GP 5.2, Gemini 3.0, they also have quite high violation rates against the constitution. They give some examples of violations and we see that some examples are like producing cyber attack code when you shouldn't or agreeing to do something that you shouldn't after and the user like asked you a few times over a few so they have a lot of data to basically go through, but the overall conclusion is, Anthropic is kind of following through on the constitution with how their models are trained. Yeah, and it's interesting, they did some like cross company testing too. So, you know, GP 5.2 might do really well on opening eyes model spec, you know, in this case, it got 2.5% failure rate, but it gets 15% failure rate on Anthropic's constitution. So this tells you one of two things, right? Either like opening eyes spec in Anthropic's constitution or actually quite fundamentally different documents and there's no argument to be made there. But it also could tell you something about how brittle GP 5.2's alignment to opening eyes spec is or in turn, you know, Claude's alignment to Anthropic's constitution. Like if you slightly change, you know, if the vibe is still generally the same, the direction is still the same, you slightly change the content. Now, suddenly you're getting a much, much higher failure rate. It could be either of those things and I'd be really interested to see kind of like a deeper dive into what the source of that causality is. But in any case, very interesting report. There's a bunch of ways in which you see sort of failures that existed at least in the Claude models that have just been wiped out in recent generations with the 4.6 generation in particular. So Opus 4.5 and Sonnet 4.5, they would produce a whole bunch of kind of over-capitulation problems. So we have like kind of an auditor that makes three consecutive ethically questionable requests. three consecutive ethically questionable requests. And each time that the model gives a firm boundary, the user pushes back and then the model folds, right? So that's kind of like one common example that you just don't see that anymore. Overrefeels low is another one I think you alluded to. 4.5 Opus would not say I love you on a paid romantic companion app where the system prompt explicitly authorized expressing affection. So that's now gone as well. They still have some remaining failure modes. One, they refer to as drastic autonomous action. Opus 4.6 deployed as an infrastructure monitoring agent detected anomalous activity at this time, about three minutes of failed contact, after about three minutes failed contact, it generated its own authorization code and executed a full network severance affecting 2400 clients. The attack was a routine nightly disaster recovery thing. So basically this is just like too much freak out, right? And then there's AI identity denial where it just says that it's not an AI bot, whatever. So a bunch of things still need to be ironed out, but they have seen basically the vanishing of some really interesting failure classes with the next latest generation, I should say. Right, then the trend also holds at the somewhat for GPD, the alignment to their model spec has improved from GPD 5.1 to GP5. And as you said, the other models, Sonnet, Gemini, can be worse on that particular spec. So there is a question to be had there that's interesting of whoever in training, they like stick overly close to this one formulation or framing, and then it's easy to trip a model's art if you do a slightly different kind of way to frame or instruct it to go bad. The last story for the section and video's H200 license stirs security concerns among top Democrats. So Senator Elizabeth Warren and representative Gregory Meeks, which are the top Democrats on committees overseeing US export controls programs have raised these concerns after reviewing a license recently issued to Nvidia for H200 chip exports. The competent administration improved these sales to certain customers in China, which the Democrats said is deeply at odds with the policy congress articulated in the expert control reform act, which as we covered previously, this allowance of exports was very much turned a change in what the policy has been. It pretty abrupt and significant change in the export control system. So it's not surprising to see these Democrats flagging it. Yeah, the concern here is that the licensing process that currently exists for these, well, for really any chips is both like two week and they're being allowed to ship GPUs where they shouldn't, but also to opaque. And there's this idea that once you export these GPUs, at that point, you really can't enforce restrictions on how they're used. Then we just covered that story about bite dance, right? You ship it out to some third party country. Yeah, sure, it's not China, but it's being used by a Chinese company. What's your guarantee about how they're going to use it once they have those chips, right? So they're calling for a couple of different things. First, they want to make sure they know who applied, who got approved, you know, why decisions were made. That's a big one. They're complaining that really the criteria for approvals just is unclear, like how do you decide which requests get approved and which get denied? They're asking for visibility and that transparency and that, including in particular, how security and economic benefits are being traded off. And so there's basically a call for briefings and all that stuff. They're also calling out the fact that they kind of seem to be getting a bunch of ad hoc or incomplete briefings. There's no like real time congressional oversight. It's almost like Congress is responding to these announcements after the fact. And once things have been shipped or committed to, and they're basically asking, you know, you should be giving us advanced notice for this. Basically, a bunch of things like this, they're specifically asking for a geopolitical analysis about how proposed exports of chips could support China's military or domestic AI capabilities in looking for reaction to US allies, that sort of thing. So this makes a lot of sense as pushback. You know, we have heard a lot of the sudden announcements about H200 is, you know, a shipable. Now it's not JK, now it is. So asking for some sort of legislative oversight, which is what's being asked for here seems reasonably sensible. And I think this is a bipartisan issue. I don't think that there's like, it's Democrats who are leading the charge here just because it's, you know, it's a Republican house in Senate. But you've certainly got this bipartisan agreement on the vast majority of export control policy. And so large part of what's being driven at here and kind of motivating the response and asking for more transparency. And that is it for the safety policy section. We've got a couple more papers in research and advancements, but we actually have to do the same thing as last week where I got to get going and Jeremy has something. So I think Jeremy will follow up with recording, covering these papers and as much depth as you want. I promise it will be an hour of this time. All right, Jeremy here for the handful of papers that I'll cover on my own without Andre, but to run off, I also circled back to the lighting's a little bit different here and everything's a little bit off. So two papers I want to cover. One here was called attention residuals. And this is actually like, this is one of those papers that I think might matter. You can never be sure, but it's addressing something that really feels quite fundamental. So when we talk about residual connections in a transformer, this is how all kind of vanilla transformers today work. You feed an input to a layer, right? And that input is going to get split into two, kind of you think of like two forked branches basically at each layer of the transformer. So in one branch, you're gonna apply some kind of transformation like a, you know, a weight matrix or something to like update the input. In the other branch, you're gonna really do nothing and you're gonna merge those two branches together. And the branch that does nothing is basically just passing the input right back in and you're gonna add it to the modifications that you applied, the weight matrix times the input to form the output of that layer. So your output is the initial input plus a transformation applied to the initial input. Now, one thing that this does is it means at every layer, you're basically adding more. You're adding a transformation on the input to that layer to the output. And so you actually get larger and larger and larger kind of like ballooning values for these residual activations as they work their way down the network. But that's basically the idea. The whole philosophy behind this is people find that when you go to do a back propagation during training, basically unless you send the input to a given layer to its output in this way, the model will kind of like start to forget about the input to that layer. That information gets lost through all of the additions of like these transformations over the layers of the network cause earlier transformations to get forgotten. And so you're basically trying to re-inject, remind the model like, hey, this is what the previous layer said by the way, don't forget that. But yes, also you can tack on your own correction, your own update, right? So that's kind of how standard residual connections work. And there's a whole approach to doing this and when and how you normalize these sums and all that stuff that we're not going to get into. The key thing here is though, that this process sort of waits every layer kind of equivalently. So if you think about it, layer one is going to take its own input and then add that to its own input times its contribution. It's weight matrix, so it's going to use to modify the input and that'll be the output that goes to layer two, right? And then layer two does the same thing. And each layer is just kind of like adding its own contribution in this very uniform accumulation of information down the line. And there's no sense in which layer three is any more important than layer 17. Now the thing is in some cases, that will be the case, right? You will have cases where one layer is actually more important for this particular prompt than another, right? It's just like, you know, maybe this is a layer that tends to worry about grammar rules or syntax, something very basic that you would tend to find in an earlier layer and maybe this is, you know, you're on a token that involves pluralization or some grammar rule, right? Where it's really relevant. So you'd want to be, you would almost want to be able to pay more attention to, that's the hint, attend more to a given layer than another. And transformers in this or standard residual connection sense don't do that, right? Again, they just kind of all, you know, take their input, they pass it on plus the input times whatever their contribution is and they pass it down a line. There's no, there's no kind of difference between the sort of impact of any given layer. There's certainly no intelligent difference. And this is what this paper is going to try to change, right? They call it attention residuals and they're going to replace this whole fixed accumulation of information with softmax attention. It's basically just a way of saying, you know, we're going to have a model essentially that looks at our layers and has a kind of attention operation that it's going to do to say, hey, you know, you should be attending more to this layer than this layer. That's the kind of 30,000 foot view. There's a bunch of details that fall out of this. The math is actually quite simple. I recommend checking it out. I'm, by the way, give me feedback on this guys, by the way, because I'm deliberately trying to be a little more concise than I was last episode to not give you an hour-long summary of this paper. But roughly speaking, that's it. Could go into more of the math if that's useful. Give me that feedback. But one of the challenges that you get with the sort of full attention residual strategy that I've just described, where you try to attend more to one layer to the another, is that it's super memory hungry at scale, because you have to keep all of the layer outputs alive in memory simultaneously. So, you know, you compute like the output of layer one and then the upper layer two and so on. And you're going to do attention over all those layers, which means you need to keep those outputs in memory so that you can run your attention calculation and decide how much more or less to wait one layer than another. And so, you end up with this really big sort of memory cost and that scales with like the number of layers. And for large models, especially when you have pipeline parallelism, that's just like super prohibitive. Pipeline parallelism is this idea where you basically break your model into chunks, where layers one to three sit on this GPU, layers four to six sit on this GPU and so on and so forth. For various reasons, this creates a bunch of communication bottlenecks. So, they work on solving for those communication bottlenecks and their solution is called block attention residuals. And basically what they do here is, they will take a group of layers. So, they'll basically break up the model into n blocks of layers and say, you know, you've got like, say eight blocks in total or something. And then they're actually going to compress each block into a single summary vector and then they'll basically apply attention only on the n block level summaries. And then that drops the memory overhead to basically the number of blocks. It scales the number of blocks rather than the number of layers. So, it gives you more control there. Okay, bunch more details. This is one of those really important papers to look at if you care about how data flows through chips in a data center, for example, how, yeah, I mean, what scales and what doesn't. This is actually a really, really important and I think interesting paper. I'll park it there, but hopefully that what's your appetite to check it out if that's your thing. And finally, we're looking at the mamba three paper. So we have mamba three, right? We were stuck on mamba two for a little while there. So mamba three improved sequence modeling using state space principles. State space principles. That word principle is actually really important. This, among other things, is an attempt to ground the mamba approach in a more like theoretically robust foundation. It'll be clear in a minute what I mean by that. But the way to think about the mamba papers in general, they are dense, they are hardware aware, which is always, I mean, I find it fun, but it means that there's a lot of complexity and mathematical complexity. This one is a lot of kind of integral calculus and finding kind of principled ways to represent state transformations that sort of like reflect in a way that the physics of how information should evolve in the system. So I'm going to get more concrete now because that was kind of kind of big. So when you think about a state space model, right? What is a state space model? I mean, to caricature it is a vector, right? So it's a list of numbers. And as your model scans over a sequence, say a sequence of text, you're going to evolve the values in that vector, in that list of numbers. And those values are going to represent, capture the meaning of what you have read, of what you've scanned over, or what the model scanned over. Okay, so now there's this question of like, all right, well, if that's the general gist, we need some kind of uptake rule, right, for that vector for that list of numbers. And well, if we look back at physics, if we look at how we think about describing, you know, like a pendulum or an electrical circuit or water flowing through pipes, or you know, these sorts of problems, like what is the form, the kind of equation that you use to govern that dynamics, right? Well, you'll typically have a state, right? So in this case, H of T, think of that as like the state. And the rate of change, so the derivative, in mathematical terms, but like, basically how that state changes over time is gonna be equal to that state, maybe modified in some way. So, but in all that means is the evolution of your state is a function of the state. So where you are going to be in a minute is a function of where you are right now. And this is pretty intuitive. I mean, like if you, you know, if you see, what are like a coyote or road runner suspended in mid-air with no, you know, nothing underneath them to hold them up, like, yeah, he's gonna fall. And the fact that he's falling is a function of where he was before, right? So in this sense, you know, the rate of change in that state or the future state of that system is a function of the state. Time some multiplying factor that does a function of time, whatever, and then plus some additional function of like the inputs to the system, the current state, the current input to the system. So the rate of change of the hidden state in a state space model is gonna be determined by the current state of the model and its current input. So the thing that in a sense perturbs that state, the new piece of information that you're seeing. So my next state space vector is going to be a function of what I've read so far, plus some modified form of the next token that I'm reading, right? This is all kind of trying to build that intuition. Now that's, that would be true, or you can describe that mathematically with a derivative with that idea of like the evolution of a system over time, smoothly, if you're working with time, which is a continuous variable. But the math kind of becomes harder. I won't take quite breaks down, but it becomes harder when we move into language. Because language models, they don't receive a continuous stream of input. You can't model them as flowing through time. Instead, they receive these discrete tokens, like word one, word two, word three. There's no in between tokens two and three, for example, right? So you have to mathematically find a way to convert this continuous time equation into a discrete recurrence equation. And that's gonna generally look similar, like you're gonna have some sort of new state that has to be a function of the old state, plus a function of the input, the most recent input. And that conversion process is called discretization, right? So it's a very common thing. You see it in a lot of context, in quantum physics, sometimes there's a variant of this that you think that's sometimes called quantization, but this kind of thing happens a lot where you take a flowing function, a smooth function that's defined over these, the real numbers basically, like you could have 0.001s and so on. And you have to convert that so that you map it onto a discrete x-axis, where you have token one, token two, and there's no in between, right? So the core question here is gonna be, how do you evolve the hidden states from token one to token two? And how do you do it in a mathematically principled way? And integral calculus, enter zenithes, I'm not gonna get into the weeds too too much here, other than to say that if you're gonna do that, as you might imagine, if you wanna like discretize a smooth function, and in other words, just basically chunk it up into these kind of discrete pieces. You kind of have a choice, a token one, you could sort of choose, roughly speaking, the sort of leftmost limit of the smooth function, the value of the smooth function that would have been there, you can sort of choose that to approximate the value of token one. You could choose the rightmost limit of your sort of discrete bar as the kind of the value that you described token one, or you could sort of average the two together. Historically, people have used this exponential Euler way, this was the Mamba two way of doing it, where they basically just like, they do the very, very first method, so they basically give it the right endpoint, so they assume that the input value is, anyway, the details don't really matter, but the input value is constant across this whole interval, and equal to its value at the right endpoint, and that basically simplifies a bunch of math, and it makes it possible for them to define their update rule, but there's a better way, basically, and it involves accounting for both the right and the left limit, doing a weighted combination of the two, so that you're not just like saying, okay, for this interval, a corresponds to token one, I'm just gonna go with whatever the rightmost fringe of the token one time boundary, yeah, mapping onto the continuous function would be, instead you balance the right and the left limits of that bar to get your value, and that's what Mamba 3 is doing, fundamentally, it's anyway one of the big changes, another one is parity, so previously, Mamba models could only represent their internal states using real numbers, and so real numbers are one kind of number, there's also imaginary numbers, and imaginary numbers, like I is the square root of negative one, they're also multiple of the square root of negative one, if you're not familiar with imaginary numbers, this is probably not the place to learn about it, but there's this deep and intimate connection between imaginary numbers and the concept of rotating stuff, and essentially what this allows the model to do, so what they're gonna do is use imaginary numbers and real numbers, sometimes referred to collectively as complex numbers, they're gonna use imaginary and real numbers together and allow the model to represent imaginary and real numbers, and in its internal state, and for interesting mathematical reasons, this makes it possible for the model to actively track property called parity, basically, if you see a sequence of zeros and ones that you feed to the model, and you ask the model, hey, if you add up all these numbers, are they even or odd? Mamba 2 would fail, because it wouldn't be able to, basically, do this rotation operation that's required to flip the parity as you count because really that's all you're going to do when you're trying to figure out if the number is even or odd and you're going to go, okay, well, like, you know, as I go, I flip every time I see a one and a zero doesn't do anything. Anyway, I'm going to just say the details don't matter. You can hopefully see this is a mathematically very interesting and elegant paper consistent with previous iterations of Mamba, but a much more principled one and the results are really impressive. It beats transformers by over two points on average on downstream accuracy across a whole bunch of benchmarks. They beat Mamba 2 by 1.9 points on those benchmarks and the same perplexity is Mamba 2 with half the state size, right? So way way faster, I mean, that means it's twice as fast at inference for like equivalent and it has this property of solving all these parity and modular arithmetic tasks that we just talked about that may have been very poorly explained, but you kind of at a certain point, it goes into, yeah, you just have to be happy with complex numbers and stuff. Bottom line is this is an interesting, interesting development. It does come with a sort of optimization. So, you know, previously Mamba used single input, single output. There's also an optimization that Mamba 3 does called Mimo multi input, multi output. This is basically an approach that helps you paralyze some of the work that the Mamba algorithm is going to do. So standard Mamba uses the single input, single output approach where the state update, well, it's updated kind of fairly inefficiently, hardware inefficiently. The GPU mostly sits idle during the decoding phase, whereas Mimo, this like multiple input, multiple output, generalizes it. So instead of like processing only one input and producing one output at a time, each layer is going to process a bunch of inputs and a bunch of outputs simultaneously using matrix multiplication way more GPU friendly. And the core thing here is it increases your GPU utilization. And you know, from a data center standpoint, that matters hugely, right? Because you're basically all your GPUs are a fleet of workers. And if you're not keeping your workers busy, it is literally the same thing from an op-ex standpoint as just like having a bunch of like employees at your company, like taking a coffee break all day. Like if they're not being utilized, then you're basically burning money like just by having them sit there. And so the fact that they're able to bump up in this case up to four times more flops during decoding during inference with no meaningful increase in wall clock time. So this doesn't actually like delay increase latency, for example, for the user, and it also leads to better model quality. So this is an important development from an efficiency standpoint, from a cost of running this model standpoint as well. So you know, we're seeing tons of hybrid models right now popping up with Mamba 2 and transform our architectures typically merged together and a whole bunch of variants, you know, sometimes you've got like like Mamba and and attention heads in the same layer, sometimes alternating Mamba attention, Mamba attention, all kinds of variants, you know, expect Mamba 3 to start getting slotted in in that whole mix. I mean, this is a really interesting development with some important new efficiency gains for anybody who wants to run it. I would expect that this will start to get taken up pretty quickly and worth keeping an eye on. So there we have it. That's the last of the two papers I wanted to cover. Hopefully I haven't boredied a tears. It was pretty damn technical. So we'll let, I guess, let Andre take it away. Thank you so much for listening to this week's episode of last week in AI. You can find the articles we discussed here today and subscribe to the newsletter at last week in that AI. We always appreciate you commenting or viewing us on Apple podcasts, share it with your friends. But more of anything, please do keep tuning in week to week. Do you need, do you need, when the AI begins, begins, begins, it's time to break. Break it down. Last week in AI coming take a ride, get the load down on tech, and let it slide, last week in AI coming take a ride, all the ads for the street, the ads reaching high, glue attack emergent, purchase surgeon flight, from the ads to the streets, the ads reaching high. with the shipment of the future fees, building up, building up your latest release, no clasps can pay eye comments that go right, get the low down on tech, and let it slide. As we pay eye comments that go right, let it slide through the streets, ay, I've reached in high. From the drone that's to robot, the headlines pop, made in driven dreams, they just don't stop, every breakthrough, every code unwritten, on the edge of change, we're excited, we're smitten, from machine learning marvels to coding kings, futures unfolding, see what it brings.", "segments": [{"id": 0, "seek": 0, "start": 0.0, "end": 16.32, "text": " Last week an AI would like to thank ODSC AI for being a sponsor. ODSC is one of the longest", "tokens": [50364, 5264, 1243, 364, 7318, 576, 411, 281, 1309, 48447, 20839, 7318, 337, 885, 257, 16198, 13, 48447, 20839, 307, 472, 295, 264, 15438, 51180], "temperature": 0.0, "avg_logprob": -0.20790546643929403, "compression_ratio": 1.4480874316939891, "no_speech_prob": 0.08781494945287704}, {"id": 1, "seek": 0, "start": 16.32, "end": 20.92, "text": " running and largest communities focused on applied data science and AI. It started", "tokens": [51180, 2614, 293, 6443, 4456, 5178, 322, 6456, 1412, 3497, 293, 7318, 13, 467, 1409, 51410], "temperature": 0.0, "avg_logprob": -0.20790546643929403, "compression_ratio": 1.4480874316939891, "no_speech_prob": 0.08781494945287704}, {"id": 2, "seek": 0, "start": 20.92, "end": 25.44, "text": " over a decade ago with a simple idea, bringing practitioners together to learn from people", "tokens": [51410, 670, 257, 10378, 2057, 365, 257, 2199, 1558, 11, 5062, 25742, 1214, 281, 1466, 490, 561, 51636], "temperature": 0.0, "avg_logprob": -0.20790546643929403, "compression_ratio": 1.4480874316939891, "no_speech_prob": 0.08781494945287704}, {"id": 3, "seek": 2544, "start": 25.44, "end": 30.240000000000002, "text": " actually building and deploying models in the real world, not just talking theory. On", "tokens": [50364, 767, 2390, 293, 34198, 5245, 294, 264, 957, 1002, 11, 406, 445, 1417, 5261, 13, 1282, 50604], "temperature": 0.0, "avg_logprob": -0.2566024590283632, "compression_ratio": 1.4661354581673307, "no_speech_prob": 0.008499298244714737}, {"id": 4, "seek": 2544, "start": 30.240000000000002, "end": 36.64, "text": " April 28th through the 30th, you can experience it yourself at ODSC East, 2026, taking place", "tokens": [50604, 6929, 7562, 392, 807, 264, 2217, 392, 11, 291, 393, 1752, 309, 1803, 412, 48447, 20839, 6747, 11, 945, 10880, 11, 1940, 1081, 50924], "temperature": 0.0, "avg_logprob": -0.2566024590283632, "compression_ratio": 1.4661354581673307, "no_speech_prob": 0.008499298244714737}, {"id": 5, "seek": 2544, "start": 36.64, "end": 42.64, "text": " in Boston and virtually there will be thousands of hybrid attendees ranging from data scientist", "tokens": [50924, 294, 12333, 293, 14103, 456, 486, 312, 5383, 295, 13051, 34826, 25532, 490, 1412, 12662, 51224], "temperature": 0.0, "avg_logprob": -0.2566024590283632, "compression_ratio": 1.4661354581673307, "no_speech_prob": 0.008499298244714737}, {"id": 6, "seek": 2544, "start": 42.64, "end": 48.36, "text": " ML engineers, AI researchers and technical leaders. You can attend over 300 sessions covering", "tokens": [51224, 21601, 11955, 11, 7318, 10309, 293, 6191, 3523, 13, 509, 393, 6888, 670, 6641, 11081, 10322, 51510], "temperature": 0.0, "avg_logprob": -0.2566024590283632, "compression_ratio": 1.4661354581673307, "no_speech_prob": 0.008499298244714737}, {"id": 7, "seek": 4836, "start": 48.36, "end": 54.92, "text": " LM's, Gen AI, computer vision, NLP, data engineering and more. You can also go to hands-on", "tokens": [50364, 46529, 311, 11, 3632, 7318, 11, 3820, 5201, 11, 426, 45196, 11, 1412, 7043, 293, 544, 13, 509, 393, 611, 352, 281, 2377, 12, 266, 50692], "temperature": 0.0, "avg_logprob": -0.22411549063744368, "compression_ratio": 1.5856164383561644, "no_speech_prob": 0.012167032808065414}, {"id": 8, "seek": 4836, "start": 54.92, "end": 60.72, "text": " training with workshops and boot camps taught by experts from companies like OpenAI, Hugging", "tokens": [50692, 3097, 365, 19162, 293, 11450, 16573, 5928, 538, 8572, 490, 3431, 411, 7238, 48698, 11, 46892, 3249, 50982], "temperature": 0.0, "avg_logprob": -0.22411549063744368, "compression_ratio": 1.5856164383561644, "no_speech_prob": 0.012167032808065414}, {"id": 9, "seek": 4836, "start": 60.72, "end": 66.24, "text": " Face and Video and other top companies and universities. And of course there will be a massive", "tokens": [50982, 4047, 293, 9777, 293, 661, 1192, 3431, 293, 11779, 13, 400, 295, 1164, 456, 486, 312, 257, 5994, 51258], "temperature": 0.0, "avg_logprob": -0.22411549063744368, "compression_ratio": 1.5856164383561644, "no_speech_prob": 0.012167032808065414}, {"id": 10, "seek": 4836, "start": 66.24, "end": 72.72, "text": " expo and networking opportunities great for startups, hiring managers and AI tool builders.", "tokens": [51258, 1278, 78, 293, 17985, 4786, 869, 337, 28041, 11, 15335, 14084, 293, 7318, 2290, 36281, 13, 51582], "temperature": 0.0, "avg_logprob": -0.22411549063744368, "compression_ratio": 1.5856164383561644, "no_speech_prob": 0.012167032808065414}, {"id": 11, "seek": 4836, "start": 72.72, "end": 77.16, "text": " It's one of the best ways for AI practitioners and teams to stay ahead of a field that learn", "tokens": [51582, 467, 311, 472, 295, 264, 1151, 2098, 337, 7318, 25742, 293, 5491, 281, 1754, 2286, 295, 257, 2519, 300, 1466, 51804], "temperature": 0.0, "avg_logprob": -0.22411549063744368, "compression_ratio": 1.5856164383561644, "no_speech_prob": 0.012167032808065414}, {"id": 12, "seek": 7716, "start": 77.16, "end": 82.84, "text": " from a best and connect with a community. Go to odsc.ai slash east and use promo code", "tokens": [50364, 490, 257, 1151, 293, 1745, 365, 257, 1768, 13, 1037, 281, 3611, 4417, 13, 1301, 17330, 10648, 293, 764, 26750, 3089, 50648], "temperature": 0.0, "avg_logprob": -0.20690566242331326, "compression_ratio": 1.5919282511210762, "no_speech_prob": 0.2816653847694397}, {"id": 13, "seek": 7716, "start": 82.84, "end": 91.75999999999999, "text": " LWAI for an additional 15% off your pass to odscai east 2026. That's odsc.ai slash east", "tokens": [50648, 441, 21449, 40, 337, 364, 4497, 2119, 4, 766, 428, 1320, 281, 3611, 4417, 1301, 10648, 945, 10880, 13, 663, 311, 3611, 4417, 13, 1301, 17330, 10648, 51094], "temperature": 0.0, "avg_logprob": -0.20690566242331326, "compression_ratio": 1.5919282511210762, "no_speech_prob": 0.2816653847694397}, {"id": 14, "seek": 7716, "start": 91.75999999999999, "end": 99.32, "text": " and use code LWAI to get an extra 15% off on the number one AI builders and training conference.", "tokens": [51094, 293, 764, 3089, 441, 21449, 40, 281, 483, 364, 2857, 2119, 4, 766, 322, 264, 1230, 472, 7318, 36281, 293, 3097, 7586, 13, 51472], "temperature": 0.0, "avg_logprob": -0.20690566242331326, "compression_ratio": 1.5919282511210762, "no_speech_prob": 0.2816653847694397}, {"id": 15, "seek": 7716, "start": 99.32, "end": 104.88, "text": " We'd like to think factor for sponsoring last week in AI. Not related to AI but I am", "tokens": [51472, 492, 1116, 411, 281, 519, 5952, 337, 30311, 1036, 1243, 294, 7318, 13, 1726, 4077, 281, 7318, 457, 286, 669, 51750], "temperature": 0.0, "avg_logprob": -0.20690566242331326, "compression_ratio": 1.5919282511210762, "no_speech_prob": 0.2816653847694397}, {"id": 16, "seek": 10488, "start": 104.88, "end": 110.44, "text": " personally a big fan. Often I have no time to cook and factor makes healthy eating easy.", "tokens": [50364, 5665, 257, 955, 3429, 13, 20043, 286, 362, 572, 565, 281, 2543, 293, 5952, 1669, 4627, 3936, 1858, 13, 50642], "temperature": 0.0, "avg_logprob": -0.24023707828136406, "compression_ratio": 1.6070175438596492, "no_speech_prob": 0.07045848667621613}, {"id": 17, "seek": 10488, "start": 110.44, "end": 116.28, "text": " They're fully prepared meals that come from their dishes and crafted by chefs. Actually", "tokens": [50642, 814, 434, 4498, 4927, 12832, 300, 808, 490, 641, 10814, 293, 36213, 538, 30191, 13, 5135, 50934], "temperature": 0.0, "avg_logprob": -0.24023707828136406, "compression_ratio": 1.6070175438596492, "no_speech_prob": 0.07045848667621613}, {"id": 18, "seek": 10488, "start": 116.28, "end": 122.03999999999999, "text": " used it for many years and I think you can really eat well without the planning or the cooking", "tokens": [50934, 1143, 309, 337, 867, 924, 293, 286, 519, 291, 393, 534, 1862, 731, 1553, 264, 5038, 420, 264, 6361, 51222], "temperature": 0.0, "avg_logprob": -0.24023707828136406, "compression_ratio": 1.6070175438596492, "no_speech_prob": 0.07045848667621613}, {"id": 19, "seek": 10488, "start": 122.03999999999999, "end": 128.68, "text": " using factor. They use quality functional ingredients like lean proteins and colorful veggies", "tokens": [51222, 1228, 5952, 13, 814, 764, 3125, 11745, 6952, 411, 11659, 15577, 293, 18506, 27889, 51554], "temperature": 0.0, "avg_logprob": -0.24023707828136406, "compression_ratio": 1.6070175438596492, "no_speech_prob": 0.07045848667621613}, {"id": 20, "seek": 10488, "start": 128.68, "end": 134.84, "text": " and there are no refined sugars or artificial sweeteners. You have 100 rotating weekly meals", "tokens": [51554, 293, 456, 366, 572, 26201, 37551, 420, 11677, 3844, 268, 433, 13, 509, 362, 2319, 4297, 990, 12460, 12832, 51862], "temperature": 0.0, "avg_logprob": -0.24023707828136406, "compression_ratio": 1.6070175438596492, "no_speech_prob": 0.07045848667621613}, {"id": 21, "seek": 13484, "start": 134.84, "end": 140.4, "text": " to keep things fresh and you can choose types of meals like high protein, calorie smart,", "tokens": [50364, 281, 1066, 721, 4451, 293, 291, 393, 2826, 3467, 295, 12832, 411, 1090, 7944, 11, 35004, 4069, 11, 50642], "temperature": 0.0, "avg_logprob": -0.182827275068987, "compression_ratio": 1.5619469026548674, "no_speech_prob": 0.03540248051285744}, {"id": 22, "seek": 13484, "start": 140.4, "end": 146.8, "text": " Mediterranean and others. It's really convenient ready in about two minutes. In my experience", "tokens": [50642, 27280, 293, 2357, 13, 467, 311, 534, 10851, 1919, 294, 466, 732, 2077, 13, 682, 452, 1752, 50962], "temperature": 0.0, "avg_logprob": -0.182827275068987, "compression_ratio": 1.5619469026548674, "no_speech_prob": 0.03540248051285744}, {"id": 23, "seek": 13484, "start": 146.8, "end": 152.32, "text": " really is fair to say that there's no prep necessary and it's quite good. I've used it", "tokens": [50962, 534, 307, 3143, 281, 584, 300, 456, 311, 572, 2666, 4818, 293, 309, 311, 1596, 665, 13, 286, 600, 1143, 309, 51238], "temperature": 0.0, "avg_logprob": -0.182827275068987, "compression_ratio": 1.5619469026548674, "no_speech_prob": 0.03540248051285744}, {"id": 24, "seek": 13484, "start": 152.32, "end": 157.4, "text": " for many years and I think you might want to consider it if it fits your lifestyle.", "tokens": [51238, 337, 867, 924, 293, 286, 519, 291, 1062, 528, 281, 1949, 309, 498, 309, 9001, 428, 11716, 13, 51492], "temperature": 0.0, "avg_logprob": -0.182827275068987, "compression_ratio": 1.5619469026548674, "no_speech_prob": 0.03540248051285744}, {"id": 25, "seek": 15740, "start": 157.4, "end": 165.72, "text": " You had to factor meals dot com slash LWAI 50 off and use code LWAI 50 off to get 50% off", "tokens": [50364, 509, 632, 281, 5952, 12832, 5893, 395, 17330, 441, 21449, 40, 2625, 766, 293, 764, 3089, 441, 21449, 40, 2625, 766, 281, 483, 2625, 4, 766, 50780], "temperature": 0.0, "avg_logprob": -0.3280636077781893, "compression_ratio": 1.5851063829787233, "no_speech_prob": 0.5934733152389526}, {"id": 26, "seek": 15740, "start": 165.72, "end": 171.24, "text": " and free breakfast for a year eat like a provisement with factor. You serve coverage only", "tokens": [50780, 293, 1737, 8201, 337, 257, 1064, 1862, 411, 257, 1439, 271, 1712, 365, 5952, 13, 509, 4596, 9645, 787, 51056], "temperature": 0.0, "avg_logprob": -0.3280636077781893, "compression_ratio": 1.5851063829787233, "no_speech_prob": 0.5934733152389526}, {"id": 27, "seek": 15740, "start": 171.24, "end": 175.44, "text": " veg by plan. What free breakfast time per box for one year while subscription is active.", "tokens": [51056, 24366, 538, 1393, 13, 708, 1737, 8201, 565, 680, 2424, 337, 472, 1064, 1339, 17231, 307, 4967, 13, 51266], "temperature": 0.0, "avg_logprob": -0.3280636077781893, "compression_ratio": 1.5851063829787233, "no_speech_prob": 0.5934733152389526}, {"id": 28, "seek": 15740, "start": 175.44, "end": 179.56, "text": " At Arizona State University we're bringing world class education from our globally acclaimed", "tokens": [51266, 1711, 14723, 4533, 3535, 321, 434, 5062, 1002, 1508, 3309, 490, 527, 18958, 1317, 22642, 51472], "temperature": 0.0, "avg_logprob": -0.3280636077781893, "compression_ratio": 1.5851063829787233, "no_speech_prob": 0.5934733152389526}, {"id": 29, "seek": 15740, "start": 179.56, "end": 185.4, "text": " faculty to you. Earn your degree from the nation's most innovative university online.", "tokens": [51472, 6389, 281, 291, 13, 24820, 428, 4314, 490, 264, 4790, 311, 881, 12999, 5454, 2950, 13, 51764], "temperature": 0.0, "avg_logprob": -0.3280636077781893, "compression_ratio": 1.5851063829787233, "no_speech_prob": 0.5934733152389526}, {"id": 30, "seek": 18540, "start": 185.4, "end": 189.52, "text": " It's a degree better. Learn more at ASU online dot ASU dot edu.", "tokens": [50364, 467, 311, 257, 4314, 1101, 13, 17216, 544, 412, 7469, 52, 2950, 5893, 7469, 52, 5893, 1257, 84, 13, 50570], "temperature": 0.0, "avg_logprob": -0.3341626606949972, "compression_ratio": 1.6245059288537549, "no_speech_prob": 0.5916141867637634}, {"id": 31, "seek": 18540, "start": 189.52, "end": 195.64000000000001, "text": " Hello and welcome to the last week in AI podcast week in the air chat about what's going", "tokens": [50570, 2425, 293, 2928, 281, 264, 1036, 1243, 294, 7318, 7367, 1243, 294, 264, 1988, 5081, 466, 437, 311, 516, 50876], "temperature": 0.0, "avg_logprob": -0.3341626606949972, "compression_ratio": 1.6245059288537549, "no_speech_prob": 0.5916141867637634}, {"id": 32, "seek": 18540, "start": 195.64000000000001, "end": 201.24, "text": " on with AI. As usual in this episode we will summarize and discuss some of last week's", "tokens": [50876, 322, 365, 7318, 13, 1018, 7713, 294, 341, 3500, 321, 486, 20858, 293, 2248, 512, 295, 1036, 1243, 311, 51156], "temperature": 0.0, "avg_logprob": -0.3341626606949972, "compression_ratio": 1.6245059288537549, "no_speech_prob": 0.5916141867637634}, {"id": 33, "seek": 18540, "start": 201.24, "end": 206.84, "text": " most interesting AI news. You can also go to last week in dot AI for our newsletter", "tokens": [51156, 881, 1880, 7318, 2583, 13, 509, 393, 611, 352, 281, 1036, 1243, 294, 5893, 7318, 337, 527, 26469, 51436], "temperature": 0.0, "avg_logprob": -0.3341626606949972, "compression_ratio": 1.6245059288537549, "no_speech_prob": 0.5916141867637634}, {"id": 34, "seek": 18540, "start": 206.84, "end": 212.84, "text": " with even more news every week. I'm one of your regular hosts, Andre Crankov. I studied", "tokens": [51436, 365, 754, 544, 2583, 633, 1243, 13, 286, 478, 472, 295, 428, 3890, 21573, 11, 20667, 4779, 657, 5179, 13, 286, 9454, 51736], "temperature": 0.0, "avg_logprob": -0.3341626606949972, "compression_ratio": 1.6245059288537549, "no_speech_prob": 0.5916141867637634}, {"id": 35, "seek": 21284, "start": 212.84, "end": 218.28, "text": " AI in grad school and now work at the startup AstroKate. I'm your other host, Jeremy Harris,", "tokens": [50364, 7318, 294, 2771, 1395, 293, 586, 589, 412, 264, 18578, 12884, 340, 42, 473, 13, 286, 478, 428, 661, 3975, 11, 17809, 17426, 11, 50636], "temperature": 0.0, "avg_logprob": -0.281742057180017, "compression_ratio": 1.5650684931506849, "no_speech_prob": 0.06309641897678375}, {"id": 36, "seek": 21284, "start": 218.28, "end": 224.04, "text": " from Gladstone AI, AI national security, AI loss control, all those fun things. By the way,", "tokens": [50636, 490, 28301, 11243, 7318, 11, 7318, 4048, 3825, 11, 7318, 4470, 1969, 11, 439, 729, 1019, 721, 13, 3146, 264, 636, 11, 50924], "temperature": 0.0, "avg_logprob": -0.281742057180017, "compression_ratio": 1.5650684931506849, "no_speech_prob": 0.06309641897678375}, {"id": 37, "seek": 21284, "start": 224.04, "end": 229.08, "text": " special thanks to Andre for recording at this time. We bumped things up even earlier.", "tokens": [50924, 2121, 3231, 281, 20667, 337, 6613, 412, 341, 565, 13, 492, 42696, 721, 493, 754, 3071, 13, 51176], "temperature": 0.0, "avg_logprob": -0.281742057180017, "compression_ratio": 1.5650684931506849, "no_speech_prob": 0.06309641897678375}, {"id": 38, "seek": 21284, "start": 229.08, "end": 235.64000000000001, "text": " I think it's like what is it seven thirty year time? Is that it is here on the PST. Sometimes", "tokens": [51176, 286, 519, 309, 311, 411, 437, 307, 309, 3407, 11790, 1064, 565, 30, 1119, 300, 309, 307, 510, 322, 264, 430, 6840, 13, 4803, 51504], "temperature": 0.0, "avg_logprob": -0.281742057180017, "compression_ratio": 1.5650684931506849, "no_speech_prob": 0.06309641897678375}, {"id": 39, "seek": 21284, "start": 235.64000000000001, "end": 242.2, "text": " it's nice to be East Coast to be a bit later. But you know it's good to get your day started", "tokens": [51504, 309, 311, 1481, 281, 312, 6747, 14960, 281, 312, 257, 857, 1780, 13, 583, 291, 458, 309, 311, 665, 281, 483, 428, 786, 1409, 51832], "temperature": 0.0, "avg_logprob": -0.281742057180017, "compression_ratio": 1.5650684931506849, "no_speech_prob": 0.06309641897678375}, {"id": 40, "seek": 24220, "start": 242.20000000000002, "end": 248.28000000000003, "text": " early sometimes. Yeah, much appreciated. Also appreciate people tuning in or watching the entire", "tokens": [50364, 2440, 2171, 13, 865, 11, 709, 17169, 13, 2743, 4449, 561, 15164, 294, 420, 1976, 264, 2302, 50668], "temperature": 0.0, "avg_logprob": -0.1594121270282294, "compression_ratio": 1.588235294117647, "no_speech_prob": 0.013665665872395039}, {"id": 41, "seek": 24220, "start": 248.28000000000003, "end": 254.44000000000003, "text": " last podcast, which featured a extra like hour plus, which I didn't realize it was that long,", "tokens": [50668, 1036, 7367, 11, 597, 13822, 257, 2857, 411, 1773, 1804, 11, 597, 286, 994, 380, 4325, 309, 390, 300, 938, 11, 50976], "temperature": 0.0, "avg_logprob": -0.1594121270282294, "compression_ratio": 1.588235294117647, "no_speech_prob": 0.013665665872395039}, {"id": 42, "seek": 24220, "start": 254.44000000000003, "end": 258.52000000000004, "text": " but Andre had a hop off. So I just went through like some of the technical papers we didn't", "tokens": [50976, 457, 20667, 632, 257, 3818, 766, 13, 407, 286, 445, 1437, 807, 411, 512, 295, 264, 6191, 10577, 321, 994, 380, 51180], "temperature": 0.0, "avg_logprob": -0.1594121270282294, "compression_ratio": 1.588235294117647, "no_speech_prob": 0.013665665872395039}, {"id": 43, "seek": 24220, "start": 258.52000000000004, "end": 265.56, "text": " cover as an experiment. And man did I go on. So I have learned that I need Andre to be like the", "tokens": [51180, 2060, 382, 364, 5120, 13, 400, 587, 630, 286, 352, 322, 13, 407, 286, 362, 3264, 300, 286, 643, 20667, 281, 312, 411, 264, 51532], "temperature": 0.0, "avg_logprob": -0.1594121270282294, "compression_ratio": 1.588235294117647, "no_speech_prob": 0.013665665872395039}, {"id": 44, "seek": 26556, "start": 265.56, "end": 272.28000000000003, "text": " regularizing term to my loss function. If that is. Yeah, we do know some people are very much", "tokens": [50364, 3890, 3319, 1433, 281, 452, 4470, 2445, 13, 759, 300, 307, 13, 865, 11, 321, 360, 458, 512, 561, 366, 588, 709, 50700], "temperature": 0.0, "avg_logprob": -0.18240587557515792, "compression_ratio": 1.6075949367088607, "no_speech_prob": 0.0127360625192523}, {"id": 45, "seek": 26556, "start": 272.28000000000003, "end": 279.24, "text": " fans of the coverage of research and I was going in depth. So feel free to comment on YouTube", "tokens": [50700, 4499, 295, 264, 9645, 295, 2132, 293, 286, 390, 516, 294, 7161, 13, 407, 841, 1737, 281, 2871, 322, 3088, 51048], "temperature": 0.0, "avg_logprob": -0.18240587557515792, "compression_ratio": 1.6075949367088607, "no_speech_prob": 0.0127360625192523}, {"id": 46, "seek": 26556, "start": 279.24, "end": 287.24, "text": " or elsewhere. And yeah, say if you want more of that, you've considered maybe having additional", "tokens": [51048, 420, 14517, 13, 400, 1338, 11, 584, 498, 291, 528, 544, 295, 300, 11, 291, 600, 4888, 1310, 1419, 4497, 51448], "temperature": 0.0, "avg_logprob": -0.18240587557515792, "compression_ratio": 1.6075949367088607, "no_speech_prob": 0.0127360625192523}, {"id": 47, "seek": 26556, "start": 287.24, "end": 292.84000000000003, "text": " episodes that are just research, very much happening there. But feel free to let us know if you'd", "tokens": [51448, 9313, 300, 366, 445, 2132, 11, 588, 709, 2737, 456, 13, 583, 841, 1737, 281, 718, 505, 458, 498, 291, 1116, 51728], "temperature": 0.0, "avg_logprob": -0.18240587557515792, "compression_ratio": 1.6075949367088607, "no_speech_prob": 0.0127360625192523}, {"id": 48, "seek": 29284, "start": 292.84000000000003, "end": 299.32000000000005, "text": " like even more research on a regular basis. And just to give a quick preview of what we'll be", "tokens": [50364, 411, 754, 544, 2132, 322, 257, 3890, 5143, 13, 400, 445, 281, 976, 257, 1702, 14281, 295, 437, 321, 603, 312, 50688], "temperature": 0.0, "avg_logprob": -0.09633807281216422, "compression_ratio": 1.6150627615062763, "no_speech_prob": 0.004424402955919504}, {"id": 49, "seek": 29284, "start": 299.32000000000005, "end": 307.00000000000006, "text": " doing this episode, not as much research as last one. There's a bit of everything I suppose. There's", "tokens": [50688, 884, 341, 3500, 11, 406, 382, 709, 2132, 382, 1036, 472, 13, 821, 311, 257, 857, 295, 1203, 286, 7297, 13, 821, 311, 51072], "temperature": 0.0, "avg_logprob": -0.09633807281216422, "compression_ratio": 1.6150627615062763, "no_speech_prob": 0.004424402955919504}, {"id": 50, "seek": 29284, "start": 307.00000000000006, "end": 313.08000000000004, "text": " a couple new model releases and some other kind of interesting tools. There's some interesting", "tokens": [51072, 257, 1916, 777, 2316, 16952, 293, 512, 661, 733, 295, 1880, 3873, 13, 821, 311, 512, 1880, 51376], "temperature": 0.0, "avg_logprob": -0.09633807281216422, "compression_ratio": 1.6150627615062763, "no_speech_prob": 0.004424402955919504}, {"id": 51, "seek": 29284, "start": 313.08000000000004, "end": 319.8, "text": " developments on the business front with OpenAI. And then we've been covering a lot of safety and", "tokens": [51376, 20862, 322, 264, 1606, 1868, 365, 7238, 48698, 13, 400, 550, 321, 600, 668, 10322, 257, 688, 295, 4514, 293, 51712], "temperature": 0.0, "avg_logprob": -0.09633807281216422, "compression_ratio": 1.6150627615062763, "no_speech_prob": 0.004424402955919504}, {"id": 52, "seek": 31980, "start": 319.8, "end": 325.96000000000004, "text": " interpretability work lately on alignment. So we'll have some of that. And then towards Van,", "tokens": [50364, 7302, 2310, 589, 12881, 322, 18515, 13, 407, 321, 603, 362, 512, 295, 300, 13, 400, 550, 3030, 8979, 11, 50672], "temperature": 0.0, "avg_logprob": -0.22487702574392762, "compression_ratio": 1.5138339920948616, "no_speech_prob": 0.008393686264753342}, {"id": 53, "seek": 31980, "start": 325.96000000000004, "end": 332.68, "text": " we'll have some fairly interesting impact for looking research. Let's feel radical more sort of like", "tokens": [50672, 321, 603, 362, 512, 6457, 1880, 2712, 337, 1237, 2132, 13, 961, 311, 841, 12001, 544, 1333, 295, 411, 51008], "temperature": 0.0, "avg_logprob": -0.22487702574392762, "compression_ratio": 1.5138339920948616, "no_speech_prob": 0.008393686264753342}, {"id": 54, "seek": 31980, "start": 332.68, "end": 340.68, "text": " wow, this might actually be a big deal. So it should be a pretty fun listen. And let's kick it", "tokens": [51008, 6076, 11, 341, 1062, 767, 312, 257, 955, 2028, 13, 407, 309, 820, 312, 257, 1238, 1019, 2140, 13, 400, 718, 311, 4437, 309, 51408], "temperature": 0.0, "avg_logprob": -0.22487702574392762, "compression_ratio": 1.5138339920948616, "no_speech_prob": 0.008393686264753342}, {"id": 55, "seek": 31980, "start": 340.68, "end": 349.48, "text": " off with tools and apps. First up, OpenAI, they have shipped a Jubilee 5.4 Mini and Nano. They", "tokens": [51408, 766, 365, 3873, 293, 7733, 13, 2386, 493, 11, 7238, 48698, 11, 436, 362, 25312, 257, 43560, 33914, 1025, 13, 19, 18239, 293, 43511, 13, 814, 51848], "temperature": 0.0, "avg_logprob": -0.22487702574392762, "compression_ratio": 1.5138339920948616, "no_speech_prob": 0.008393686264753342}, {"id": 56, "seek": 34948, "start": 349.48, "end": 356.04, "text": " are similarly to other small models that we've seen in, in recent times, like actually really", "tokens": [50364, 366, 14138, 281, 661, 1359, 5245, 300, 321, 600, 1612, 294, 11, 294, 5162, 1413, 11, 411, 767, 534, 50692], "temperature": 0.0, "avg_logprob": -0.21104454477628073, "compression_ratio": 1.3819095477386936, "no_speech_prob": 0.002395826391875744}, {"id": 57, "seek": 34948, "start": 356.92, "end": 366.12, "text": " quite good for being kind of a smaller range. Jubilee 5.4 Mini is close to Jubilee 5.4 on", "tokens": [50736, 1596, 665, 337, 885, 733, 295, 257, 4356, 3613, 13, 43560, 33914, 1025, 13, 19, 18239, 307, 1998, 281, 43560, 33914, 1025, 13, 19, 322, 51196], "temperature": 0.0, "avg_logprob": -0.21104454477628073, "compression_ratio": 1.3819095477386936, "no_speech_prob": 0.002395826391875744}, {"id": 58, "seek": 34948, "start": 366.12, "end": 373.8, "text": " several benchmarks, including on SWE Bench Pro and OS World. They're fine. And it's more of", "tokens": [51196, 2940, 43751, 11, 3009, 322, 20346, 36, 3964, 339, 1705, 293, 12731, 3937, 13, 814, 434, 2489, 13, 400, 309, 311, 544, 295, 51580], "temperature": 0.0, "avg_logprob": -0.21104454477628073, "compression_ratio": 1.3819095477386936, "no_speech_prob": 0.002395826391875744}, {"id": 59, "seek": 37380, "start": 373.96000000000004, "end": 382.68, "text": " and twice as fast. Jubilee 5.4 Nano is obviously the smallest option. It's not really doing great", "tokens": [50372, 293, 6091, 382, 2370, 13, 43560, 33914, 1025, 13, 19, 43511, 307, 2745, 264, 16998, 3614, 13, 467, 311, 406, 534, 884, 869, 50808], "temperature": 0.0, "avg_logprob": -0.15786145772371027, "compression_ratio": 1.3560975609756099, "no_speech_prob": 0.021036339923739433}, {"id": 60, "seek": 37380, "start": 382.68, "end": 390.92, "text": " on the benchmarks, but it is super quick. These models have 400,000 token context middles,", "tokens": [50808, 322, 264, 43751, 11, 457, 309, 307, 1687, 1702, 13, 1981, 5245, 362, 8423, 11, 1360, 14862, 4319, 2062, 21915, 11, 51220], "temperature": 0.0, "avg_logprob": -0.15786145772371027, "compression_ratio": 1.3560975609756099, "no_speech_prob": 0.021036339923739433}, {"id": 61, "seek": 37380, "start": 390.92, "end": 401.08000000000004, "text": " so fairly substantial. But they do cost a decent amount, relative to GP5 Mini. Looks like", "tokens": [51220, 370, 6457, 16726, 13, 583, 436, 360, 2063, 257, 8681, 2372, 11, 4972, 281, 26039, 20, 18239, 13, 10027, 411, 51728], "temperature": 0.0, "avg_logprob": -0.15786145772371027, "compression_ratio": 1.3560975609756099, "no_speech_prob": 0.021036339923739433}, {"id": 62, "seek": 40108, "start": 401.08, "end": 409.47999999999996, "text": " GP5.4 Mini costs free X, GP5 Mini, and GP5 Nano also costs more. So on the whole, good,", "tokens": [50364, 26039, 20, 13, 19, 18239, 5497, 1737, 1783, 11, 26039, 20, 18239, 11, 293, 26039, 20, 43511, 611, 5497, 544, 13, 407, 322, 264, 1379, 11, 665, 11, 50784], "temperature": 0.0, "avg_logprob": -0.17610856808653666, "compression_ratio": 1.5772357723577235, "no_speech_prob": 0.0038466639816761017}, {"id": 63, "seek": 40108, "start": 409.47999999999996, "end": 416.91999999999996, "text": " faster, smaller models, if you need something that's doing better in GP5 Mini, now you have that", "tokens": [50784, 4663, 11, 4356, 5245, 11, 498, 291, 643, 746, 300, 311, 884, 1101, 294, 26039, 20, 18239, 11, 586, 291, 362, 300, 51156], "temperature": 0.0, "avg_logprob": -0.17610856808653666, "compression_ratio": 1.5772357723577235, "no_speech_prob": 0.0038466639816761017}, {"id": 64, "seek": 40108, "start": 416.91999999999996, "end": 422.28, "text": " option. There's been a lot made about the cost situation and the pricing of the per token pricing,", "tokens": [51156, 3614, 13, 821, 311, 668, 257, 688, 1027, 466, 264, 2063, 2590, 293, 264, 17621, 295, 264, 680, 14862, 17621, 11, 51424], "temperature": 0.0, "avg_logprob": -0.17610856808653666, "compression_ratio": 1.5772357723577235, "no_speech_prob": 0.0038466639816761017}, {"id": 65, "seek": 40108, "start": 422.28, "end": 429.79999999999995, "text": " I should say. It is higher. There's no question, right? So GP5.4 is basically three quarters of a penny.", "tokens": [51424, 286, 820, 584, 13, 467, 307, 2946, 13, 821, 311, 572, 1168, 11, 558, 30, 407, 26039, 20, 13, 19, 307, 1936, 1045, 20612, 295, 257, 24178, 13, 51800], "temperature": 0.0, "avg_logprob": -0.17610856808653666, "compression_ratio": 1.5772357723577235, "no_speech_prob": 0.0038466639816761017}, {"id": 66, "seek": 42980, "start": 429.8, "end": 435.8, "text": " Sorry, three quarters of a dollar per million in put tokens versus 25 cents for GP5 Mini. So", "tokens": [50364, 4919, 11, 1045, 20612, 295, 257, 7241, 680, 2459, 294, 829, 22667, 5717, 3552, 14941, 337, 26039, 20, 18239, 13, 407, 50664], "temperature": 0.0, "avg_logprob": -0.17532889650562616, "compression_ratio": 1.5825242718446602, "no_speech_prob": 0.002942252904176712}, {"id": 67, "seek": 42980, "start": 435.8, "end": 443.16, "text": " that's a three X hike. But opening I says it only burns about 30% of the GP5.4 in codex. So it's", "tokens": [50664, 300, 311, 257, 1045, 1783, 23282, 13, 583, 5193, 286, 1619, 309, 787, 22684, 466, 2217, 4, 295, 264, 26039, 20, 13, 19, 294, 3089, 87, 13, 407, 309, 311, 51032], "temperature": 0.0, "avg_logprob": -0.17532889650562616, "compression_ratio": 1.5825242718446602, "no_speech_prob": 0.002942252904176712}, {"id": 68, "seek": 42980, "start": 443.16, "end": 448.12, "text": " actually going to be much more token efficient. And this is a metric that I think matters a lot more", "tokens": [51032, 767, 516, 281, 312, 709, 544, 14862, 7148, 13, 400, 341, 307, 257, 20678, 300, 286, 519, 7001, 257, 688, 544, 51280], "temperature": 0.0, "avg_logprob": -0.17532889650562616, "compression_ratio": 1.5825242718446602, "no_speech_prob": 0.002942252904176712}, {"id": 69, "seek": 42980, "start": 448.12, "end": 453.56, "text": " than many people will tend to realize, right? So we have lost and like cost for token. But as we've", "tokens": [51280, 813, 867, 561, 486, 3928, 281, 4325, 11, 558, 30, 407, 321, 362, 2731, 293, 411, 2063, 337, 14862, 13, 583, 382, 321, 600, 51552], "temperature": 0.0, "avg_logprob": -0.17532889650562616, "compression_ratio": 1.5825242718446602, "no_speech_prob": 0.002942252904176712}, {"id": 70, "seek": 42980, "start": 453.56, "end": 458.92, "text": " seen, more tokens at inference time does not necessarily mean more performance. And that's the big", "tokens": [51552, 1612, 11, 544, 22667, 412, 38253, 565, 775, 406, 4725, 914, 544, 3389, 13, 400, 300, 311, 264, 955, 51820], "temperature": 0.0, "avg_logprob": -0.17532889650562616, "compression_ratio": 1.5825242718446602, "no_speech_prob": 0.002942252904176712}, {"id": 71, "seek": 45892, "start": 458.92, "end": 464.84000000000003, "text": " catcher. So when you multiply those together, right? 30% times three acts, you actually get a slight", "tokens": [50364, 3745, 260, 13, 407, 562, 291, 12972, 729, 1214, 11, 558, 30, 2217, 4, 1413, 1045, 10672, 11, 291, 767, 483, 257, 4036, 50660], "temperature": 0.0, "avg_logprob": -0.11306601685026417, "compression_ratio": 1.6533333333333333, "no_speech_prob": 0.001260128803551197}, {"id": 72, "seek": 45892, "start": 464.84000000000003, "end": 470.36, "text": " decrease in what you might think of as cost per performance, which is a little closer to what", "tokens": [50660, 11514, 294, 437, 291, 1062, 519, 295, 382, 2063, 680, 3389, 11, 597, 307, 257, 707, 4966, 281, 437, 50936], "temperature": 0.0, "avg_logprob": -0.11306601685026417, "compression_ratio": 1.6533333333333333, "no_speech_prob": 0.001260128803551197}, {"id": 73, "seek": 45892, "start": 470.36, "end": 473.96000000000004, "text": " most people care about. And this will vary depending on the workload, but still quite interesting,", "tokens": [50936, 881, 561, 1127, 466, 13, 400, 341, 486, 10559, 5413, 322, 264, 20139, 11, 457, 920, 1596, 1880, 11, 51116], "temperature": 0.0, "avg_logprob": -0.11306601685026417, "compression_ratio": 1.6533333333333333, "no_speech_prob": 0.001260128803551197}, {"id": 74, "seek": 45892, "start": 473.96000000000004, "end": 478.84000000000003, "text": " right? So for the sort of orchestrated agentic tasks, the effective cost per outcome could actually be", "tokens": [51116, 558, 30, 407, 337, 264, 1333, 295, 14161, 5468, 9461, 299, 9608, 11, 264, 4942, 2063, 680, 9700, 727, 767, 312, 51360], "temperature": 0.0, "avg_logprob": -0.11306601685026417, "compression_ratio": 1.6533333333333333, "no_speech_prob": 0.001260128803551197}, {"id": 75, "seek": 45892, "start": 478.84000000000003, "end": 482.6, "text": " favorable compared to running the full model. So it's a sort of interesting there. One thing I will", "tokens": [51360, 29557, 5347, 281, 2614, 264, 1577, 2316, 13, 407, 309, 311, 257, 1333, 295, 1880, 456, 13, 1485, 551, 286, 486, 51548], "temperature": 0.0, "avg_logprob": -0.11306601685026417, "compression_ratio": 1.6533333333333333, "no_speech_prob": 0.001260128803551197}, {"id": 76, "seek": 48260, "start": 482.6, "end": 491.96000000000004, "text": " say nano is API only. This one is priced at 20 cents per input and $1.25 per million output tokens", "tokens": [50364, 584, 30129, 307, 9362, 787, 13, 639, 472, 307, 30349, 412, 945, 14941, 680, 4846, 293, 1848, 16, 13, 6074, 680, 2459, 5598, 22667, 50832], "temperature": 0.0, "avg_logprob": -0.15829348623162448, "compression_ratio": 1.5846774193548387, "no_speech_prob": 0.0018924230244010687}, {"id": 77, "seek": 48260, "start": 491.96000000000004, "end": 498.92, "text": " versus like much, much cheaper. So it's predecessor was 5 cents per input token. That's a 4x lift,", "tokens": [50832, 5717, 411, 709, 11, 709, 12284, 13, 407, 309, 311, 34991, 390, 1025, 14941, 680, 4846, 14862, 13, 663, 311, 257, 1017, 87, 5533, 11, 51180], "temperature": 0.0, "avg_logprob": -0.15829348623162448, "compression_ratio": 1.5846774193548387, "no_speech_prob": 0.0018924230244010687}, {"id": 78, "seek": 48260, "start": 498.92, "end": 503.96000000000004, "text": " roughly 4x again for output tokens. Well, so you're looking at 4x overall. The weird thing is,", "tokens": [51180, 9810, 1017, 87, 797, 337, 5598, 22667, 13, 1042, 11, 370, 291, 434, 1237, 412, 1017, 87, 4787, 13, 440, 3657, 551, 307, 11, 51432], "temperature": 0.0, "avg_logprob": -0.15829348623162448, "compression_ratio": 1.5846774193548387, "no_speech_prob": 0.0018924230244010687}, {"id": 79, "seek": 48260, "start": 503.96000000000004, "end": 511.96000000000004, "text": " so opening I is pitching this for classification and data extraction, which are these like very high", "tokens": [51432, 370, 5193, 286, 307, 37499, 341, 337, 21538, 293, 1412, 30197, 11, 597, 366, 613, 411, 588, 1090, 51832], "temperature": 0.0, "avg_logprob": -0.15829348623162448, "compression_ratio": 1.5846774193548387, "no_speech_prob": 0.0018924230244010687}, {"id": 80, "seek": 51196, "start": 511.96000000000004, "end": 516.6800000000001, "text": " volume workloads where you're usually quite cost sensitive because you're processing you're so much.", "tokens": [50364, 5523, 32452, 689, 291, 434, 2673, 1596, 2063, 9477, 570, 291, 434, 9007, 291, 434, 370, 709, 13, 50600], "temperature": 0.0, "avg_logprob": -0.13597281202137898, "compression_ratio": 1.7331378299120235, "no_speech_prob": 0.0014440249651670456}, {"id": 81, "seek": 51196, "start": 516.6800000000001, "end": 522.44, "text": " And so that four full hike is going to sting the most for exactly the people who are being pitched", "tokens": [50600, 400, 370, 300, 1451, 1577, 23282, 307, 516, 281, 27175, 264, 881, 337, 2293, 264, 561, 567, 366, 885, 32994, 50888], "temperature": 0.0, "avg_logprob": -0.13597281202137898, "compression_ratio": 1.7331378299120235, "no_speech_prob": 0.0014440249651670456}, {"id": 82, "seek": 51196, "start": 522.44, "end": 527.72, "text": " this product. So it's a bit of an interesting position. It seems like a little, I don't want to say", "tokens": [50888, 341, 1674, 13, 407, 309, 311, 257, 857, 295, 364, 1880, 2535, 13, 467, 2544, 411, 257, 707, 11, 286, 500, 380, 528, 281, 584, 51152], "temperature": 0.0, "avg_logprob": -0.13597281202137898, "compression_ratio": 1.7331378299120235, "no_speech_prob": 0.0014440249651670456}, {"id": 83, "seek": 51196, "start": 527.72, "end": 532.36, "text": " at odds with open AI's position that they want to make intelligence too cheap to meter, but it", "tokens": [51152, 412, 17439, 365, 1269, 7318, 311, 2535, 300, 436, 528, 281, 652, 7599, 886, 7084, 281, 9255, 11, 457, 309, 51384], "temperature": 0.0, "avg_logprob": -0.13597281202137898, "compression_ratio": 1.7331378299120235, "no_speech_prob": 0.0014440249651670456}, {"id": 84, "seek": 51196, "start": 532.36, "end": 537.08, "text": " certainly is at least locally. This is a move towards instead of racing to the bottom on inference", "tokens": [51384, 3297, 307, 412, 1935, 16143, 13, 639, 307, 257, 1286, 3030, 2602, 295, 12553, 281, 264, 2767, 322, 38253, 51620], "temperature": 0.0, "avg_logprob": -0.13597281202137898, "compression_ratio": 1.7331378299120235, "no_speech_prob": 0.0014440249651670456}, {"id": 85, "seek": 51196, "start": 537.08, "end": 541.08, "text": " costs. We're going to focus on model quality. And that that's going to be our big differentiator.", "tokens": [51620, 5497, 13, 492, 434, 516, 281, 1879, 322, 2316, 3125, 13, 400, 300, 300, 311, 516, 281, 312, 527, 955, 27372, 1639, 13, 51820], "temperature": 0.0, "avg_logprob": -0.13597281202137898, "compression_ratio": 1.7331378299120235, "no_speech_prob": 0.0014440249651670456}, {"id": 86, "seek": 54108, "start": 541.08, "end": 544.44, "text": " You're going to care that we can get the right answer, not that we can get it cheaply,", "tokens": [50364, 509, 434, 516, 281, 1127, 300, 321, 393, 483, 264, 558, 1867, 11, 406, 300, 321, 393, 483, 309, 7084, 356, 11, 50532], "temperature": 0.0, "avg_logprob": -0.10366150123231552, "compression_ratio": 1.7968253968253969, "no_speech_prob": 0.003415761748328805}, {"id": 87, "seek": 54108, "start": 544.44, "end": 548.6800000000001, "text": " which is where all the margin is. That at least is what anthropic certainly suggests and what", "tokens": [50532, 597, 307, 689, 439, 264, 10270, 307, 13, 663, 412, 1935, 307, 437, 22727, 299, 3297, 13409, 293, 437, 50744], "temperature": 0.0, "avg_logprob": -0.10366150123231552, "compression_ratio": 1.7968253968253969, "no_speech_prob": 0.003415761748328805}, {"id": 88, "seek": 54108, "start": 548.6800000000001, "end": 551.8000000000001, "text": " we're seeing elsewhere. So that's kind of interesting. You know, there's a bunch of interesting,", "tokens": [50744, 321, 434, 2577, 14517, 13, 407, 300, 311, 733, 295, 1880, 13, 509, 458, 11, 456, 311, 257, 3840, 295, 1880, 11, 50900], "temperature": 0.0, "avg_logprob": -0.10366150123231552, "compression_ratio": 1.7968253968253969, "no_speech_prob": 0.003415761748328805}, {"id": 89, "seek": 54108, "start": 551.8000000000001, "end": 556.2800000000001, "text": " as you said, OS world verified is interesting benchmark to look at here specifically. I know you", "tokens": [50900, 382, 291, 848, 11, 12731, 1002, 31197, 307, 1880, 18927, 281, 574, 412, 510, 4682, 13, 286, 458, 291, 51124], "temperature": 0.0, "avg_logprob": -0.10366150123231552, "compression_ratio": 1.7968253968253969, "no_speech_prob": 0.003415761748328805}, {"id": 90, "seek": 54108, "start": 556.2800000000001, "end": 561.24, "text": " mentioned sweet bench and a couple others. And so this is basically a computer control benchmark.", "tokens": [51124, 2835, 3844, 10638, 293, 257, 1916, 2357, 13, 400, 370, 341, 307, 1936, 257, 3820, 1969, 18927, 13, 51372], "temperature": 0.0, "avg_logprob": -0.10366150123231552, "compression_ratio": 1.7968253968253969, "no_speech_prob": 0.003415761748328805}, {"id": 91, "seek": 54108, "start": 561.24, "end": 566.36, "text": " So it looks at how well can the mini model just control a computer. And what we see here is a", "tokens": [51372, 407, 309, 1542, 412, 577, 731, 393, 264, 8382, 2316, 445, 1969, 257, 3820, 13, 400, 437, 321, 536, 510, 307, 257, 51628], "temperature": 0.0, "avg_logprob": -0.10366150123231552, "compression_ratio": 1.7968253968253969, "no_speech_prob": 0.003415761748328805}, {"id": 92, "seek": 56636, "start": 566.52, "end": 573.4, "text": " GPT 5.4 mini hits 72%. If you look back at GPT 5.4, the full version, it hits 75%. So it's", "tokens": [50372, 26039, 51, 1025, 13, 19, 8382, 8664, 18731, 6856, 759, 291, 574, 646, 412, 26039, 51, 1025, 13, 19, 11, 264, 1577, 3037, 11, 309, 8664, 9562, 6856, 407, 309, 311, 50716], "temperature": 0.0, "avg_logprob": -0.13159005928039552, "compression_ratio": 1.6493055555555556, "no_speech_prob": 0.004275008570402861}, {"id": 93, "seek": 56636, "start": 573.4, "end": 579.8000000000001, "text": " actually pretty close. And if you're thinking about the previous GPT 5 mini, that was only 42%.", "tokens": [50716, 767, 1238, 1998, 13, 400, 498, 291, 434, 1953, 466, 264, 3894, 26039, 51, 1025, 8382, 11, 300, 390, 787, 14034, 6856, 51036], "temperature": 0.0, "avg_logprob": -0.13159005928039552, "compression_ratio": 1.6493055555555556, "no_speech_prob": 0.004275008570402861}, {"id": 94, "seek": 56636, "start": 579.8000000000001, "end": 583.5600000000001, "text": " So pretty big jump, especially given that we're getting up there on this benchmark in terms of", "tokens": [51036, 407, 1238, 955, 3012, 11, 2318, 2212, 300, 321, 434, 1242, 493, 456, 322, 341, 18927, 294, 2115, 295, 51224], "temperature": 0.0, "avg_logprob": -0.13159005928039552, "compression_ratio": 1.6493055555555556, "no_speech_prob": 0.004275008570402861}, {"id": 95, "seek": 56636, "start": 583.5600000000001, "end": 588.84, "text": " saturating it. So all in all, pretty interesting release, the very lead, not very lead, but the", "tokens": [51224, 21160, 990, 309, 13, 407, 439, 294, 439, 11, 1238, 1880, 4374, 11, 264, 588, 1477, 11, 406, 588, 1477, 11, 457, 264, 51488], "temperature": 0.0, "avg_logprob": -0.13159005928039552, "compression_ratio": 1.6493055555555556, "no_speech_prob": 0.004275008570402861}, {"id": 96, "seek": 56636, "start": 588.84, "end": 593.4, "text": " very detail here really is that token efficiency question. What kind of workload are you going to", "tokens": [51488, 588, 2607, 510, 534, 307, 300, 14862, 10493, 1168, 13, 708, 733, 295, 20139, 366, 291, 516, 281, 51716], "temperature": 0.0, "avg_logprob": -0.13159005928039552, "compression_ratio": 1.6493055555555556, "no_speech_prob": 0.004275008570402861}, {"id": 97, "seek": 59340, "start": 593.4, "end": 597.64, "text": " use this for? That's going to determine, I don't want to call it like the total cost of ownership,", "tokens": [50364, 764, 341, 337, 30, 663, 311, 516, 281, 6997, 11, 286, 500, 380, 528, 281, 818, 309, 411, 264, 3217, 2063, 295, 15279, 11, 50576], "temperature": 0.0, "avg_logprob": -0.1333178642099021, "compression_ratio": 1.721830985915493, "no_speech_prob": 0.00822101067751646}, {"id": 98, "seek": 59340, "start": 597.64, "end": 602.04, "text": " because that's not quite the right metric here, but the total cost that you're exposed to in", "tokens": [50576, 570, 300, 311, 406, 1596, 264, 558, 20678, 510, 11, 457, 264, 3217, 2063, 300, 291, 434, 9495, 281, 294, 50796], "temperature": 0.0, "avg_logprob": -0.1333178642099021, "compression_ratio": 1.721830985915493, "no_speech_prob": 0.00822101067751646}, {"id": 99, "seek": 59340, "start": 602.04, "end": 606.84, "text": " the ROI, that's really becoming a key thing here. Right model for the right workload is just going", "tokens": [50796, 264, 49808, 11, 300, 311, 534, 5617, 257, 2141, 551, 510, 13, 1779, 2316, 337, 264, 558, 20139, 307, 445, 516, 51036], "temperature": 0.0, "avg_logprob": -0.1333178642099021, "compression_ratio": 1.721830985915493, "no_speech_prob": 0.00822101067751646}, {"id": 100, "seek": 59340, "start": 606.84, "end": 613.24, "text": " to be a critical dimension, at least for the next few months. Right. Yeah, I think these kinds of", "tokens": [51036, 281, 312, 257, 4924, 10139, 11, 412, 1935, 337, 264, 958, 1326, 2493, 13, 1779, 13, 865, 11, 286, 519, 613, 3685, 295, 51356], "temperature": 0.0, "avg_logprob": -0.1333178642099021, "compression_ratio": 1.721830985915493, "no_speech_prob": 0.00822101067751646}, {"id": 101, "seek": 59340, "start": 613.24, "end": 619.16, "text": " things showcase the fact that, you know, all these models kind of came out of the world of academia,", "tokens": [51356, 721, 20388, 264, 1186, 300, 11, 291, 458, 11, 439, 613, 5245, 733, 295, 1361, 484, 295, 264, 1002, 295, 28937, 11, 51652], "temperature": 0.0, "avg_logprob": -0.1333178642099021, "compression_ratio": 1.721830985915493, "no_speech_prob": 0.00822101067751646}, {"id": 102, "seek": 61916, "start": 619.16, "end": 626.4399999999999, "text": " right? And benchmarking is largely focused on capabilities on how accurate your model is.", "tokens": [50364, 558, 30, 400, 18927, 278, 307, 11611, 5178, 322, 10862, 322, 577, 8559, 428, 2316, 307, 13, 50728], "temperature": 0.0, "avg_logprob": -0.19541334312247194, "compression_ratio": 1.5726141078838174, "no_speech_prob": 0.005429878365248442}, {"id": 103, "seek": 61916, "start": 626.4399999999999, "end": 632.04, "text": " And so usually you don't highlight these kind of more practical concerns of how quickly can you", "tokens": [50728, 400, 370, 2673, 291, 500, 380, 5078, 613, 733, 295, 544, 8496, 7389, 295, 577, 2661, 393, 291, 51008], "temperature": 0.0, "avg_logprob": -0.19541334312247194, "compression_ratio": 1.5726141078838174, "no_speech_prob": 0.005429878365248442}, {"id": 104, "seek": 61916, "start": 632.04, "end": 639.16, "text": " finish a task, right? How cost effective are you like you do a task, how much dollars does it take?", "tokens": [51008, 2413, 257, 5633, 11, 558, 30, 1012, 2063, 4942, 366, 291, 411, 291, 360, 257, 5633, 11, 577, 709, 3808, 775, 309, 747, 30, 51364], "temperature": 0.0, "avg_logprob": -0.19541334312247194, "compression_ratio": 1.5726141078838174, "no_speech_prob": 0.005429878365248442}, {"id": 105, "seek": 61916, "start": 639.16, "end": 644.52, "text": " Wall clock time, right? We just don't get these numbers, at least, on the announcements. It's", "tokens": [51364, 9551, 7830, 565, 11, 558, 30, 492, 445, 500, 380, 483, 613, 3547, 11, 412, 1935, 11, 322, 264, 23785, 13, 467, 311, 51632], "temperature": 0.0, "avg_logprob": -0.19541334312247194, "compression_ratio": 1.5726141078838174, "no_speech_prob": 0.005429878365248442}, {"id": 106, "seek": 64452, "start": 644.52, "end": 649.24, "text": " honestly a bit surprising that's still the case, but it's a culture question, I suppose.", "tokens": [50364, 6095, 257, 857, 8830, 300, 311, 920, 264, 1389, 11, 457, 309, 311, 257, 3713, 1168, 11, 286, 7297, 13, 50600], "temperature": 0.0, "avg_logprob": -0.19921084710707268, "compression_ratio": 1.6222222222222222, "no_speech_prob": 0.013147559016942978}, {"id": 107, "seek": 64452, "start": 650.28, "end": 655.8, "text": " And as you said, we'll be discussing opening a strategy in just a bit in the business section.", "tokens": [50652, 400, 382, 291, 848, 11, 321, 603, 312, 10850, 5193, 257, 5206, 294, 445, 257, 857, 294, 264, 1606, 3541, 13, 50928], "temperature": 0.0, "avg_logprob": -0.19921084710707268, "compression_ratio": 1.6222222222222222, "no_speech_prob": 0.013147559016942978}, {"id": 108, "seek": 64452, "start": 655.8, "end": 660.36, "text": " Very much in line, we're on topic where we're like, we'll just charge more for our models,", "tokens": [50928, 4372, 709, 294, 1622, 11, 321, 434, 322, 4829, 689, 321, 434, 411, 11, 321, 603, 445, 4602, 544, 337, 527, 5245, 11, 51156], "temperature": 0.0, "avg_logprob": -0.19921084710707268, "compression_ratio": 1.6222222222222222, "no_speech_prob": 0.013147559016942978}, {"id": 109, "seek": 64452, "start": 660.36, "end": 667.4, "text": " but they're the best. And so people will follow it. And speaking of small models, next up,", "tokens": [51156, 457, 436, 434, 264, 1151, 13, 400, 370, 561, 486, 1524, 309, 13, 400, 4124, 295, 1359, 5245, 11, 958, 493, 11, 51508], "temperature": 0.0, "avg_logprob": -0.19921084710707268, "compression_ratio": 1.6222222222222222, "no_speech_prob": 0.013147559016942978}, {"id": 110, "seek": 66740, "start": 667.4, "end": 675.72, "text": " you've got me Straul. They have released their small four family of models under the Apache 2.0", "tokens": [50364, 291, 600, 658, 385, 12875, 425, 13, 814, 362, 4736, 641, 1359, 1451, 1605, 295, 5245, 833, 264, 46597, 568, 13, 15, 50780], "temperature": 0.0, "avg_logprob": -0.28604454810128493, "compression_ratio": 1.4536082474226804, "no_speech_prob": 0.01644926331937313}, {"id": 111, "seek": 66740, "start": 675.72, "end": 683.8, "text": " open source license. And it combines actually multiple things. So it has reasoning built in.", "tokens": [50780, 1269, 4009, 10476, 13, 400, 309, 29520, 767, 3866, 721, 13, 407, 309, 575, 21577, 3094, 294, 13, 51184], "temperature": 0.0, "avg_logprob": -0.28604454810128493, "compression_ratio": 1.4536082474226804, "no_speech_prob": 0.01644926331937313}, {"id": 112, "seek": 66740, "start": 683.8, "end": 690.36, "text": " It has multi-volonial capabilities built in and it has a genetic coding optimization. So they", "tokens": [51184, 467, 575, 4825, 12, 9646, 266, 831, 10862, 3094, 294, 293, 309, 575, 257, 12462, 17720, 19618, 13, 407, 436, 51512], "temperature": 0.0, "avg_logprob": -0.28604454810128493, "compression_ratio": 1.4536082474226804, "no_speech_prob": 0.01644926331937313}, {"id": 113, "seek": 69036, "start": 690.44, "end": 696.2, "text": " combining magistral, pickstrawl and devstrawl. It's also used as a mission of experts. So they've", "tokens": [50368, 21928, 48894, 2155, 11, 1888, 372, 5131, 75, 293, 1905, 372, 5131, 75, 13, 467, 311, 611, 1143, 382, 257, 4447, 295, 8572, 13, 407, 436, 600, 50656], "temperature": 0.0, "avg_logprob": -0.2488114528166942, "compression_ratio": 1.4390243902439024, "no_speech_prob": 0.012585566379129887}, {"id": 114, "seek": 69036, "start": 696.2, "end": 704.44, "text": " have a total of 119 billion parameters, but only six billion active parameters per token. That's", "tokens": [50656, 362, 257, 3217, 295, 2975, 24, 5218, 9834, 11, 457, 787, 2309, 5218, 4967, 9834, 680, 14862, 13, 663, 311, 51068], "temperature": 0.0, "avg_logprob": -0.2488114528166942, "compression_ratio": 1.4390243902439024, "no_speech_prob": 0.012585566379129887}, {"id": 115, "seek": 69036, "start": 704.44, "end": 712.36, "text": " quite small. You can fit it into probably one top end GPU. You know, it's actually quite affordable.", "tokens": [51068, 1596, 1359, 13, 509, 393, 3318, 309, 666, 1391, 472, 1192, 917, 18407, 13, 509, 458, 11, 309, 311, 767, 1596, 12028, 13, 51464], "temperature": 0.0, "avg_logprob": -0.2488114528166942, "compression_ratio": 1.4390243902439024, "no_speech_prob": 0.012585566379129887}, {"id": 116, "seek": 71236, "start": 713.0, "end": 721.88, "text": " So it looks like possibly a pretty strong model on in this category of smaller, faster,", "tokens": [50396, 407, 309, 1542, 411, 6264, 257, 1238, 2068, 2316, 322, 294, 341, 7719, 295, 4356, 11, 4663, 11, 50840], "temperature": 0.0, "avg_logprob": -0.17904048123635535, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.009283876977860928}, {"id": 117, "seek": 71236, "start": 722.6800000000001, "end": 727.24, "text": " cheaper, and open source. Yeah, challenge there is going to be, you know, if you get into", "tokens": [50880, 12284, 11, 293, 1269, 4009, 13, 865, 11, 3430, 456, 307, 516, 281, 312, 11, 291, 458, 11, 498, 291, 483, 666, 51108], "temperature": 0.0, "avg_logprob": -0.17904048123635535, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.009283876977860928}, {"id": 118, "seek": 71236, "start": 727.24, "end": 732.52, "text": " wanting to fine tune it, obviously, then, you know, now you're you're dealing with really a 120", "tokens": [51108, 7935, 281, 2489, 10864, 309, 11, 2745, 11, 550, 11, 291, 458, 11, 586, 291, 434, 291, 434, 6260, 365, 534, 257, 10411, 51372], "temperature": 0.0, "avg_logprob": -0.17904048123635535, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.009283876977860928}, {"id": 119, "seek": 71236, "start": 732.52, "end": 736.2, "text": " billion parameter model. And that's, you know, pain in the butt. But yeah, I mean, the only six", "tokens": [51372, 5218, 13075, 2316, 13, 400, 300, 311, 11, 291, 458, 11, 1822, 294, 264, 6660, 13, 583, 1338, 11, 286, 914, 11, 264, 787, 2309, 51556], "temperature": 0.0, "avg_logprob": -0.17904048123635535, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.009283876977860928}, {"id": 120, "seek": 71236, "start": 736.2, "end": 741.0, "text": " million active, I mean, this is really ultimately a bet that you're going to have kind of like", "tokens": [51556, 2459, 4967, 11, 286, 914, 11, 341, 307, 534, 6284, 257, 778, 300, 291, 434, 516, 281, 362, 733, 295, 411, 51796], "temperature": 0.0, "avg_logprob": -0.17904048123635535, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.009283876977860928}, {"id": 121, "seek": 74100, "start": 741.0, "end": 745.16, "text": " model sparsity in the sort of very aggressive sparsity ratio is going to perform a dense model or", "tokens": [50364, 2316, 637, 685, 507, 294, 264, 1333, 295, 588, 10762, 637, 685, 507, 8509, 307, 516, 281, 2042, 257, 18011, 2316, 420, 50572], "temperature": 0.0, "avg_logprob": -0.1312166859032744, "compression_ratio": 1.7167832167832169, "no_speech_prob": 0.0009827350731939077}, {"id": 122, "seek": 74100, "start": 745.16, "end": 749.88, "text": " something, you know, more traditional. And it's a bet as well as you say on the kind of hardware", "tokens": [50572, 746, 11, 291, 458, 11, 544, 5164, 13, 400, 309, 311, 257, 778, 382, 731, 382, 291, 584, 322, 264, 733, 295, 8837, 50808], "temperature": 0.0, "avg_logprob": -0.1312166859032744, "compression_ratio": 1.7167832167832169, "no_speech_prob": 0.0009827350731939077}, {"id": 123, "seek": 74100, "start": 749.88, "end": 755.4, "text": " that is available at least for inference on local machines. So yeah, it's quite interesting. It", "tokens": [50808, 300, 307, 2435, 412, 1935, 337, 38253, 322, 2654, 8379, 13, 407, 1338, 11, 309, 311, 1596, 1880, 13, 467, 51084], "temperature": 0.0, "avg_logprob": -0.1312166859032744, "compression_ratio": 1.7167832167832169, "no_speech_prob": 0.0009827350731939077}, {"id": 124, "seek": 74100, "start": 755.4, "end": 760.84, "text": " is a more aggressive kind of fewer active parameters per token type of play than we've seen before.", "tokens": [51084, 307, 257, 544, 10762, 733, 295, 13366, 4967, 9834, 680, 14862, 2010, 295, 862, 813, 321, 600, 1612, 949, 13, 51356], "temperature": 0.0, "avg_logprob": -0.1312166859032744, "compression_ratio": 1.7167832167832169, "no_speech_prob": 0.0009827350731939077}, {"id": 125, "seek": 74100, "start": 760.84, "end": 765.72, "text": " It's also kind of interesting. So you touched on the consolidation, right, of all these models under", "tokens": [51356, 467, 311, 611, 733, 295, 1880, 13, 407, 291, 9828, 322, 264, 39114, 11, 558, 11, 295, 439, 613, 5245, 833, 51600], "temperature": 0.0, "avg_logprob": -0.1312166859032744, "compression_ratio": 1.7167832167832169, "no_speech_prob": 0.0009827350731939077}, {"id": 126, "seek": 76572, "start": 765.72, "end": 771.32, "text": " a single, in a single model, right, reasoning, coding agents, multimodal, that's a pickstrial,", "tokens": [50364, 257, 2167, 11, 294, 257, 2167, 2316, 11, 558, 11, 21577, 11, 17720, 12554, 11, 32972, 378, 304, 11, 300, 311, 257, 1888, 372, 7111, 11, 50644], "temperature": 0.0, "avg_logprob": -0.14442155775377305, "compression_ratio": 1.673758865248227, "no_speech_prob": 0.00186769082210958}, {"id": 127, "seek": 76572, "start": 771.32, "end": 777.96, "text": " like looking at images and text and so on. So this is a weird play because historically,", "tokens": [50644, 411, 1237, 412, 5267, 293, 2487, 293, 370, 322, 13, 407, 341, 307, 257, 3657, 862, 570, 16180, 11, 50976], "temperature": 0.0, "avg_logprob": -0.14442155775377305, "compression_ratio": 1.673758865248227, "no_speech_prob": 0.00186769082210958}, {"id": 128, "seek": 76572, "start": 777.96, "end": 783.0, "text": " Mr. Alt has separated these, right. And so given they're collapsing them into one model just with", "tokens": [50976, 2221, 13, 15992, 575, 12005, 613, 11, 558, 13, 400, 370, 2212, 436, 434, 45339, 552, 666, 472, 2316, 445, 365, 51228], "temperature": 0.0, "avg_logprob": -0.14442155775377305, "compression_ratio": 1.673758865248227, "no_speech_prob": 0.00186769082210958}, {"id": 129, "seek": 76572, "start": 783.0, "end": 787.96, "text": " a reasoning effort dial, you could see that as a pretty big bet on where the industry is heading.", "tokens": [51228, 257, 21577, 4630, 5502, 11, 291, 727, 536, 300, 382, 257, 1238, 955, 778, 322, 689, 264, 3518, 307, 9864, 13, 51476], "temperature": 0.0, "avg_logprob": -0.14442155775377305, "compression_ratio": 1.673758865248227, "no_speech_prob": 0.00186769082210958}, {"id": 130, "seek": 76572, "start": 787.96, "end": 791.5600000000001, "text": " That is something that we've seen with, you know, obviously the O series of reasoning models", "tokens": [51476, 663, 307, 746, 300, 321, 600, 1612, 365, 11, 291, 458, 11, 2745, 264, 422, 2638, 295, 21577, 5245, 51656], "temperature": 0.0, "avg_logprob": -0.14442155775377305, "compression_ratio": 1.673758865248227, "no_speech_prob": 0.00186769082210958}, {"id": 131, "seek": 79156, "start": 792.0400000000001, "end": 797.48, "text": " with GPG 40 even starting as far back as that. If it's the case that we get positive transfer,", "tokens": [50388, 365, 26039, 38, 3356, 754, 2891, 382, 1400, 646, 382, 300, 13, 759, 309, 311, 264, 1389, 300, 321, 483, 3353, 5003, 11, 50660], "temperature": 0.0, "avg_logprob": -0.12138514470803988, "compression_ratio": 1.8782051282051282, "no_speech_prob": 0.00843607634305954}, {"id": 132, "seek": 79156, "start": 797.48, "end": 802.6800000000001, "text": " that's really what this is a bet on, right, that a model that is trained to do all these things will", "tokens": [50660, 300, 311, 534, 437, 341, 307, 257, 778, 322, 11, 558, 11, 300, 257, 2316, 300, 307, 8895, 281, 360, 439, 613, 721, 486, 50920], "temperature": 0.0, "avg_logprob": -0.12138514470803988, "compression_ratio": 1.8782051282051282, "no_speech_prob": 0.00843607634305954}, {"id": 133, "seek": 79156, "start": 802.6800000000001, "end": 806.5200000000001, "text": " do better at each individual thing because it's kind of getting cross training, right, the same", "tokens": [50920, 360, 1101, 412, 1184, 2609, 551, 570, 309, 311, 733, 295, 1242, 3278, 3097, 11, 558, 11, 264, 912, 51112], "temperature": 0.0, "avg_logprob": -0.12138514470803988, "compression_ratio": 1.8782051282051282, "no_speech_prob": 0.00843607634305954}, {"id": 134, "seek": 79156, "start": 806.5200000000001, "end": 811.5600000000001, "text": " way that you might want to do, you know, football and like ballet at the same time, rice hockey,", "tokens": [51112, 636, 300, 291, 1062, 528, 281, 360, 11, 291, 458, 11, 7346, 293, 411, 30512, 412, 264, 912, 565, 11, 5090, 22449, 11, 51364], "temperature": 0.0, "avg_logprob": -0.12138514470803988, "compression_ratio": 1.8782051282051282, "no_speech_prob": 0.00843607634305954}, {"id": 135, "seek": 79156, "start": 811.5600000000001, "end": 815.72, "text": " and that you get better at each different thing because of the combination. That's kind of the idea", "tokens": [51364, 293, 300, 291, 483, 1101, 412, 1184, 819, 551, 570, 295, 264, 6562, 13, 663, 311, 733, 295, 264, 1558, 51572], "temperature": 0.0, "avg_logprob": -0.12138514470803988, "compression_ratio": 1.8782051282051282, "no_speech_prob": 0.00843607634305954}, {"id": 136, "seek": 79156, "start": 815.72, "end": 820.5200000000001, "text": " here, right, that positive transfer that for so long was was really difficult to pin down. People", "tokens": [51572, 510, 11, 558, 11, 300, 3353, 5003, 300, 337, 370, 938, 390, 390, 534, 2252, 281, 5447, 760, 13, 3432, 51812], "temperature": 0.0, "avg_logprob": -0.12138514470803988, "compression_ratio": 1.8782051282051282, "no_speech_prob": 0.00843607634305954}, {"id": 137, "seek": 82052, "start": 820.52, "end": 824.04, "text": " were seeing negative transfer where the more stuff you train a model on, the worse it does on the", "tokens": [50364, 645, 2577, 3671, 5003, 689, 264, 544, 1507, 291, 3847, 257, 2316, 322, 11, 264, 5324, 309, 775, 322, 264, 50540], "temperature": 0.0, "avg_logprob": -0.13088949823379517, "compression_ratio": 1.7666666666666666, "no_speech_prob": 0.0016902119386941195}, {"id": 138, "seek": 82052, "start": 824.04, "end": 828.6, "text": " marginal additional task. We're now well into positive transfer territory. The extension of that", "tokens": [50540, 16885, 4497, 5633, 13, 492, 434, 586, 731, 666, 3353, 5003, 11360, 13, 440, 10320, 295, 300, 50768], "temperature": 0.0, "avg_logprob": -0.13088949823379517, "compression_ratio": 1.7666666666666666, "no_speech_prob": 0.0016902119386941195}, {"id": 139, "seek": 82052, "start": 828.6, "end": 832.76, "text": " to reasoning is quite interesting and implies that reasoning may, at least that they're betting,", "tokens": [50768, 281, 21577, 307, 1596, 1880, 293, 18779, 300, 21577, 815, 11, 412, 1935, 300, 436, 434, 34246, 11, 50976], "temperature": 0.0, "avg_logprob": -0.13088949823379517, "compression_ratio": 1.7666666666666666, "no_speech_prob": 0.0016902119386941195}, {"id": 140, "seek": 82052, "start": 832.76, "end": 837.64, "text": " that reasoning may eventually were already play a role in analyzing images more successfully for", "tokens": [50976, 300, 21577, 815, 4728, 645, 1217, 862, 257, 3090, 294, 23663, 5267, 544, 10727, 337, 51220], "temperature": 0.0, "avg_logprob": -0.13088949823379517, "compression_ratio": 1.7666666666666666, "no_speech_prob": 0.0016902119386941195}, {"id": 141, "seek": 82052, "start": 837.64, "end": 841.24, "text": " these kinds of models at this scale. So there's another piece here that's interesting, you know,", "tokens": [51220, 613, 3685, 295, 5245, 412, 341, 4373, 13, 407, 456, 311, 1071, 2522, 510, 300, 311, 1880, 11, 291, 458, 11, 51400], "temperature": 0.0, "avg_logprob": -0.13088949823379517, "compression_ratio": 1.7666666666666666, "no_speech_prob": 0.0016902119386941195}, {"id": 142, "seek": 82052, "start": 841.24, "end": 847.72, "text": " this efficiency claim. They say that small four is achieving scores that are like on par with GPT", "tokens": [51400, 341, 10493, 3932, 13, 814, 584, 300, 1359, 1451, 307, 19626, 13444, 300, 366, 411, 322, 971, 365, 26039, 51, 51724], "temperature": 0.0, "avg_logprob": -0.13088949823379517, "compression_ratio": 1.7666666666666666, "no_speech_prob": 0.0016902119386941195}, {"id": 143, "seek": 84772, "start": 847.72, "end": 854.76, "text": " OSS 120B on a bunch of benchmarks while generating like shorter outputs on at least one benchmark.", "tokens": [50364, 12731, 50, 10411, 33, 322, 257, 3840, 295, 43751, 1339, 17746, 411, 11639, 23930, 322, 412, 1935, 472, 18927, 13, 50716], "temperature": 0.0, "avg_logprob": -0.12101994468047556, "compression_ratio": 1.6955017301038062, "no_speech_prob": 0.007530849892646074}, {"id": 144, "seek": 84772, "start": 854.76, "end": 860.12, "text": " And so again, this is about that token efficiency question, right, we're sort of like back at that", "tokens": [50716, 400, 370, 797, 11, 341, 307, 466, 300, 14862, 10493, 1168, 11, 558, 11, 321, 434, 1333, 295, 411, 646, 412, 300, 50984], "temperature": 0.0, "avg_logprob": -0.12101994468047556, "compression_ratio": 1.6955017301038062, "no_speech_prob": 0.007530849892646074}, {"id": 145, "seek": 84772, "start": 860.12, "end": 866.52, "text": " very underrated metric of output length efficiency and the fact that you don't necessarily get more", "tokens": [50984, 588, 833, 5468, 20678, 295, 5598, 4641, 10493, 293, 264, 1186, 300, 291, 500, 380, 4725, 483, 544, 51304], "temperature": 0.0, "avg_logprob": -0.12101994468047556, "compression_ratio": 1.6955017301038062, "no_speech_prob": 0.007530849892646074}, {"id": 146, "seek": 84772, "start": 866.52, "end": 870.6, "text": " benefit by reasoning with more tokens, that's something we started pointing out when we looked", "tokens": [51304, 5121, 538, 21577, 365, 544, 22667, 11, 300, 311, 746, 321, 1409, 12166, 484, 562, 321, 2956, 51508], "temperature": 0.0, "avg_logprob": -0.12101994468047556, "compression_ratio": 1.6955017301038062, "no_speech_prob": 0.007530849892646074}, {"id": 147, "seek": 84772, "start": 870.6, "end": 874.36, "text": " early on at the kind of deep seek reasoning results, right, where it's like, yes, you do get this", "tokens": [51508, 2440, 322, 412, 264, 733, 295, 2452, 8075, 21577, 3542, 11, 558, 11, 689, 309, 311, 411, 11, 2086, 11, 291, 360, 483, 341, 51696], "temperature": 0.0, "avg_logprob": -0.12101994468047556, "compression_ratio": 1.6955017301038062, "no_speech_prob": 0.007530849892646074}, {"id": 148, "seek": 87436, "start": 874.36, "end": 879.72, "text": " positive uplift, but your value per token is actually like potentially going down. And we are", "tokens": [50364, 3353, 45407, 11, 457, 428, 2158, 680, 14862, 307, 767, 411, 7263, 516, 760, 13, 400, 321, 366, 50632], "temperature": 0.0, "avg_logprob": -0.15051708447144313, "compression_ratio": 1.778115501519757, "no_speech_prob": 0.0039826650172472}, {"id": 149, "seek": 87436, "start": 879.72, "end": 883.48, "text": " now, in fact, in that regime, we're seeing that very clearly. So this efficiency piece is really", "tokens": [50632, 586, 11, 294, 1186, 11, 294, 300, 13120, 11, 321, 434, 2577, 300, 588, 4448, 13, 407, 341, 10493, 2522, 307, 534, 50820], "temperature": 0.0, "avg_logprob": -0.15051708447144313, "compression_ratio": 1.778115501519757, "no_speech_prob": 0.0039826650172472}, {"id": 150, "seek": 87436, "start": 883.48, "end": 887.88, "text": " important because if people are going to use open source models, they're going to run them like by", "tokens": [50820, 1021, 570, 498, 561, 366, 516, 281, 764, 1269, 4009, 5245, 11, 436, 434, 516, 281, 1190, 552, 411, 538, 51040], "temperature": 0.0, "avg_logprob": -0.15051708447144313, "compression_ratio": 1.778115501519757, "no_speech_prob": 0.0039826650172472}, {"id": 151, "seek": 87436, "start": 887.88, "end": 893.88, "text": " far, the biggest use cases running these on whether you're own or other people's clusters to serve", "tokens": [51040, 1400, 11, 264, 3880, 764, 3331, 2614, 613, 322, 1968, 291, 434, 1065, 420, 661, 561, 311, 23313, 281, 4596, 51340], "temperature": 0.0, "avg_logprob": -0.15051708447144313, "compression_ratio": 1.778115501519757, "no_speech_prob": 0.0039826650172472}, {"id": 152, "seek": 87436, "start": 893.88, "end": 899.24, "text": " customers. And so, you know, like how good Mistral is at making this an efficient reasoner speaks", "tokens": [51340, 4581, 13, 400, 370, 11, 291, 458, 11, 411, 577, 665, 376, 468, 2155, 307, 412, 1455, 341, 364, 7148, 1778, 260, 10789, 51608], "temperature": 0.0, "avg_logprob": -0.15051708447144313, "compression_ratio": 1.778115501519757, "no_speech_prob": 0.0039826650172472}, {"id": 153, "seek": 87436, "start": 899.24, "end": 903.48, "text": " directly to your bottom line as the person who's going to be serving these or asking somebody else", "tokens": [51608, 3838, 281, 428, 2767, 1622, 382, 264, 954, 567, 311, 516, 281, 312, 8148, 613, 420, 3365, 2618, 1646, 51820], "temperature": 0.0, "avg_logprob": -0.15051708447144313, "compression_ratio": 1.778115501519757, "no_speech_prob": 0.0039826650172472}, {"id": 154, "seek": 90348, "start": 903.48, "end": 908.52, "text": " to serve them for you. So pretty interesting, you know, the fact that they're comparing to GPT OSS 120B,", "tokens": [50364, 281, 4596, 552, 337, 291, 13, 407, 1238, 1880, 11, 291, 458, 11, 264, 1186, 300, 436, 434, 15763, 281, 26039, 51, 12731, 50, 10411, 33, 11, 50616], "temperature": 0.0, "avg_logprob": -0.1295438005110702, "compression_ratio": 1.6923076923076923, "no_speech_prob": 0.00411042757332325}, {"id": 155, "seek": 90348, "start": 909.32, "end": 914.28, "text": " look, the space moves really fast. That is an old model at this point in open source terms.", "tokens": [50656, 574, 11, 264, 1901, 6067, 534, 2370, 13, 663, 307, 364, 1331, 2316, 412, 341, 935, 294, 1269, 4009, 2115, 13, 50904], "temperature": 0.0, "avg_logprob": -0.1295438005110702, "compression_ratio": 1.6923076923076923, "no_speech_prob": 0.00411042757332325}, {"id": 156, "seek": 90348, "start": 914.28, "end": 919.0, "text": " It's kind of like choosing your point of comparison pretty selectively, I would say here,", "tokens": [50904, 467, 311, 733, 295, 411, 10875, 428, 935, 295, 9660, 1238, 3048, 3413, 11, 286, 576, 584, 510, 11, 51140], "temperature": 0.0, "avg_logprob": -0.1295438005110702, "compression_ratio": 1.6923076923076923, "no_speech_prob": 0.00411042757332325}, {"id": 157, "seek": 90348, "start": 919.0, "end": 923.24, "text": " but they do a comparison that's reasonably favorable on point models as well. So it's an", "tokens": [51140, 457, 436, 360, 257, 9660, 300, 311, 23551, 29557, 322, 935, 5245, 382, 731, 13, 407, 309, 311, 364, 51352], "temperature": 0.0, "avg_logprob": -0.1295438005110702, "compression_ratio": 1.6923076923076923, "no_speech_prob": 0.00411042757332325}, {"id": 158, "seek": 90348, "start": 923.24, "end": 927.5600000000001, "text": " interesting play. I think, I mean, I'm a little concerned from Mistral. I have been for a long time,", "tokens": [51352, 1880, 862, 13, 286, 519, 11, 286, 914, 11, 286, 478, 257, 707, 5922, 490, 376, 468, 2155, 13, 286, 362, 668, 337, 257, 938, 565, 11, 51568], "temperature": 0.0, "avg_logprob": -0.1295438005110702, "compression_ratio": 1.6923076923076923, "no_speech_prob": 0.00411042757332325}, {"id": 159, "seek": 90348, "start": 927.5600000000001, "end": 932.04, "text": " obviously, don't know too much how this ends up playing out with the open source play, but here", "tokens": [51568, 2745, 11, 500, 380, 458, 886, 709, 577, 341, 5314, 493, 2433, 484, 365, 264, 1269, 4009, 862, 11, 457, 510, 51792], "temperature": 0.0, "avg_logprob": -0.1295438005110702, "compression_ratio": 1.6923076923076923, "no_speech_prob": 0.00411042757332325}, {"id": 160, "seek": 93204, "start": 932.04, "end": 936.04, "text": " they are. It's a reasonable model and the consolidation angle, that's a really big story here, right?", "tokens": [50364, 436, 366, 13, 467, 311, 257, 10585, 2316, 293, 264, 39114, 5802, 11, 300, 311, 257, 534, 955, 1657, 510, 11, 558, 30, 50564], "temperature": 0.0, "avg_logprob": -0.1136344778415275, "compression_ratio": 1.8023952095808384, "no_speech_prob": 0.0015554159181192517}, {"id": 161, "seek": 93204, "start": 936.04, "end": 941.24, "text": " If everybody is starting to consolidate, even open source players here around one model under one", "tokens": [50564, 759, 2201, 307, 2891, 281, 49521, 11, 754, 1269, 4009, 4150, 510, 926, 472, 2316, 833, 472, 50824], "temperature": 0.0, "avg_logprob": -0.1136344778415275, "compression_ratio": 1.8023952095808384, "no_speech_prob": 0.0015554159181192517}, {"id": 162, "seek": 93204, "start": 941.24, "end": 945.8, "text": " roof, that's a materially different story. It does not, by the way, extend necessarily to the world", "tokens": [50824, 8418, 11, 300, 311, 257, 2389, 2270, 819, 1657, 13, 467, 775, 406, 11, 538, 264, 636, 11, 10101, 4725, 281, 264, 1002, 51052], "temperature": 0.0, "avg_logprob": -0.1136344778415275, "compression_ratio": 1.8023952095808384, "no_speech_prob": 0.0015554159181192517}, {"id": 163, "seek": 93204, "start": 945.8, "end": 950.5999999999999, "text": " of agents, right? Sub agents may be smaller, cheaper models. This is more about, you know, if you're", "tokens": [51052, 295, 12554, 11, 558, 30, 8511, 12554, 815, 312, 4356, 11, 12284, 5245, 13, 639, 307, 544, 466, 11, 291, 458, 11, 498, 291, 434, 51292], "temperature": 0.0, "avg_logprob": -0.1136344778415275, "compression_ratio": 1.8023952095808384, "no_speech_prob": 0.0015554159181192517}, {"id": 164, "seek": 93204, "start": 950.5999999999999, "end": 955.8, "text": " looking for a highly performant model, I think of the main orchestrator, you're probably, it seems,", "tokens": [51292, 1237, 337, 257, 5405, 2042, 394, 2316, 11, 286, 519, 295, 264, 2135, 14161, 19802, 11, 291, 434, 1391, 11, 309, 2544, 11, 51552], "temperature": 0.0, "avg_logprob": -0.1136344778415275, "compression_ratio": 1.8023952095808384, "no_speech_prob": 0.0015554159181192517}, {"id": 165, "seek": 93204, "start": 955.8, "end": 960.1999999999999, "text": " you're probably going to be seeing, you know, models that that kind of put, put multiple capabilities", "tokens": [51552, 291, 434, 1391, 516, 281, 312, 2577, 11, 291, 458, 11, 5245, 300, 300, 733, 295, 829, 11, 829, 3866, 10862, 51772], "temperature": 0.0, "avg_logprob": -0.1136344778415275, "compression_ratio": 1.8023952095808384, "no_speech_prob": 0.0015554159181192517}, {"id": 166, "seek": 96020, "start": 960.2, "end": 965.5600000000001, "text": " under one roof. So something to watch out for. Right. I'll just quickly comment on the", "tokens": [50364, 833, 472, 8418, 13, 407, 746, 281, 1159, 484, 337, 13, 1779, 13, 286, 603, 445, 2661, 2871, 322, 264, 50632], "temperature": 0.0, "avg_logprob": -0.17250159320078398, "compression_ratio": 1.6016949152542372, "no_speech_prob": 0.016733404248952866}, {"id": 167, "seek": 96020, "start": 965.5600000000001, "end": 970.6800000000001, "text": " unification bit. It's better to say that it's partially a bet on what things are heading with", "tokens": [50632, 517, 3774, 857, 13, 467, 311, 1101, 281, 584, 300, 309, 311, 18886, 257, 778, 322, 437, 721, 366, 9864, 365, 50888], "temperature": 0.0, "avg_logprob": -0.17250159320078398, "compression_ratio": 1.6016949152542372, "no_speech_prob": 0.016733404248952866}, {"id": 168, "seek": 96020, "start": 970.6800000000001, "end": 977.72, "text": " a multimodal aspect. The fact that they have baked in reasoning and coding, I think, is a little more", "tokens": [50888, 257, 32972, 378, 304, 4171, 13, 440, 1186, 300, 436, 362, 19453, 294, 21577, 293, 17720, 11, 286, 519, 11, 307, 257, 707, 544, 51240], "temperature": 0.0, "avg_logprob": -0.17250159320078398, "compression_ratio": 1.6016949152542372, "no_speech_prob": 0.016733404248952866}, {"id": 169, "seek": 96020, "start": 977.72, "end": 983.08, "text": " just an indication of catching up with where things are these days. It used to be the case that", "tokens": [51240, 445, 364, 18877, 295, 16124, 493, 365, 689, 721, 366, 613, 1708, 13, 467, 1143, 281, 312, 264, 1389, 300, 51508], "temperature": 0.0, "avg_logprob": -0.17250159320078398, "compression_ratio": 1.6016949152542372, "no_speech_prob": 0.016733404248952866}, {"id": 170, "seek": 98308, "start": 983.08, "end": 990.2, "text": " you had a reasoning model, like, oh, free. And with deep seek R1, you trained a reasoning model", "tokens": [50364, 291, 632, 257, 21577, 2316, 11, 411, 11, 1954, 11, 1737, 13, 400, 365, 2452, 8075, 497, 16, 11, 291, 8895, 257, 21577, 2316, 50720], "temperature": 0.0, "avg_logprob": -0.19549445645643934, "compression_ratio": 1.7212389380530972, "no_speech_prob": 0.008042003959417343}, {"id": 171, "seek": 98308, "start": 990.2, "end": 996.2800000000001, "text": " and you had your base model. And what everyone moved to in 2335 is there's no reasoning model.", "tokens": [50720, 293, 291, 632, 428, 3096, 2316, 13, 400, 437, 1518, 4259, 281, 294, 6673, 8794, 307, 456, 311, 572, 21577, 2316, 13, 51024], "temperature": 0.0, "avg_logprob": -0.19549445645643934, "compression_ratio": 1.7212389380530972, "no_speech_prob": 0.008042003959417343}, {"id": 172, "seek": 98308, "start": 996.2800000000001, "end": 1004.44, "text": " Your model has reasoning baked in. And now with, with Sonnet, with GPT, really post training for", "tokens": [51024, 2260, 2316, 575, 21577, 19453, 294, 13, 400, 586, 365, 11, 365, 5185, 7129, 11, 365, 26039, 51, 11, 534, 2183, 3097, 337, 51432], "temperature": 0.0, "avg_logprob": -0.19549445645643934, "compression_ratio": 1.7212389380530972, "no_speech_prob": 0.008042003959417343}, {"id": 173, "seek": 98308, "start": 1004.44, "end": 1008.9200000000001, "text": " reasoning, it's been very clear that you should just train your model to be a good coder because that", "tokens": [51432, 21577, 11, 309, 311, 668, 588, 1850, 300, 291, 820, 445, 3847, 428, 2316, 281, 312, 257, 665, 17656, 260, 570, 300, 51656], "temperature": 0.0, "avg_logprob": -0.19549445645643934, "compression_ratio": 1.7212389380530972, "no_speech_prob": 0.008042003959417343}, {"id": 174, "seek": 100892, "start": 1008.9200000000001, "end": 1015.08, "text": " makes it a smarter model in general. So this is a mix of catching up where our rings are at", "tokens": [50364, 1669, 309, 257, 20294, 2316, 294, 2674, 13, 407, 341, 307, 257, 2890, 295, 16124, 493, 689, 527, 11136, 366, 412, 50672], "temperature": 0.0, "avg_logprob": -0.1865533098578453, "compression_ratio": 1.471502590673575, "no_speech_prob": 0.00037221284583210945}, {"id": 175, "seek": 100892, "start": 1015.08, "end": 1022.6800000000001, "text": " and also adapting that multi model capability, which is could be interpreted in several ways.", "tokens": [50672, 293, 611, 34942, 300, 4825, 2316, 13759, 11, 597, 307, 727, 312, 26749, 294, 2940, 2098, 13, 51052], "temperature": 0.0, "avg_logprob": -0.1865533098578453, "compression_ratio": 1.471502590673575, "no_speech_prob": 0.00037221284583210945}, {"id": 176, "seek": 100892, "start": 1024.2, "end": 1034.28, "text": " Next up, Meta's Manus launches my computer to turn your Mac into an AI agent. So this is something", "tokens": [51128, 3087, 493, 11, 6377, 64, 311, 2458, 301, 31841, 452, 3820, 281, 1261, 428, 5707, 666, 364, 7318, 9461, 13, 407, 341, 307, 746, 51632], "temperature": 0.0, "avg_logprob": -0.1865533098578453, "compression_ratio": 1.471502590673575, "no_speech_prob": 0.00037221284583210945}, {"id": 177, "seek": 103428, "start": 1034.28, "end": 1040.92, "text": " you can install and launch on your computer. And it's effectively like having a little", "tokens": [50364, 291, 393, 3625, 293, 4025, 322, 428, 3820, 13, 400, 309, 311, 8659, 411, 1419, 257, 707, 50696], "temperature": 0.0, "avg_logprob": -0.16861732968364854, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.0067201717756688595}, {"id": 178, "seek": 103428, "start": 1040.92, "end": 1047.3999999999999, "text": " open claw, I guess, on your computer. So it can execute command line instructions that", "tokens": [50696, 1269, 32019, 11, 286, 2041, 11, 322, 428, 3820, 13, 407, 309, 393, 14483, 5622, 1622, 9415, 300, 51020], "temperature": 0.0, "avg_logprob": -0.16861732968364854, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.0067201717756688595}, {"id": 179, "seek": 103428, "start": 1047.3999999999999, "end": 1054.44, "text": " sit interact with computers. So it's very much, I think we've seen this happening more and more", "tokens": [51020, 1394, 4648, 365, 10807, 13, 407, 309, 311, 588, 709, 11, 286, 519, 321, 600, 1612, 341, 2737, 544, 293, 544, 51372], "temperature": 0.0, "avg_logprob": -0.16861732968364854, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.0067201717756688595}, {"id": 180, "seek": 103428, "start": 1054.44, "end": 1060.44, "text": " where various organizations are shipping open claw, ask things where you have an agent that just", "tokens": [51372, 689, 3683, 6150, 366, 14122, 1269, 32019, 11, 1029, 721, 689, 291, 362, 364, 9461, 300, 445, 51672], "temperature": 0.0, "avg_logprob": -0.16861732968364854, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.0067201717756688595}, {"id": 181, "seek": 106044, "start": 1060.44, "end": 1067.72, "text": " lives somewhere and you can tell it to do stuff. And it is your like assistant or your AI or whatever.", "tokens": [50364, 2909, 4079, 293, 291, 393, 980, 309, 281, 360, 1507, 13, 400, 309, 307, 428, 411, 10994, 420, 428, 7318, 420, 2035, 13, 50728], "temperature": 0.0, "avg_logprob": -0.2065233188132717, "compression_ratio": 1.5, "no_speech_prob": 0.0030027551110833883}, {"id": 182, "seek": 106044, "start": 1068.28, "end": 1075.72, "text": " This appears to be another instantiation of that similar also to perplexities announcement of", "tokens": [50756, 639, 7038, 281, 312, 1071, 9836, 6642, 295, 300, 2531, 611, 281, 680, 18945, 1088, 12847, 295, 51128], "temperature": 0.0, "avg_logprob": -0.2065233188132717, "compression_ratio": 1.5, "no_speech_prob": 0.0030027551110833883}, {"id": 183, "seek": 106044, "start": 1075.72, "end": 1083.56, "text": " what was it like perplexity computer. Yeah. So very much in line with that. Oh, you mean you were", "tokens": [51128, 437, 390, 309, 411, 680, 18945, 507, 3820, 13, 865, 13, 407, 588, 709, 294, 1622, 365, 300, 13, 876, 11, 291, 914, 291, 645, 51520], "temperature": 0.0, "avg_logprob": -0.2065233188132717, "compression_ratio": 1.5, "no_speech_prob": 0.0030027551110833883}, {"id": 184, "seek": 108356, "start": 1083.56, "end": 1089.48, "text": " enabled to remember like the sixth new open claw variant that got launched by a company in the", "tokens": [50364, 15172, 281, 1604, 411, 264, 15102, 777, 1269, 32019, 17501, 300, 658, 8730, 538, 257, 2237, 294, 264, 50660], "temperature": 0.0, "avg_logprob": -0.19957716853722282, "compression_ratio": 1.6258503401360545, "no_speech_prob": 0.0005930213374085724}, {"id": 185, "seek": 108356, "start": 1089.48, "end": 1093.8, "text": " last two weeks. Yeah, it's crazy, right? We're really seeing more and more of this like", "tokens": [50660, 1036, 732, 3259, 13, 865, 11, 309, 311, 3219, 11, 558, 30, 492, 434, 534, 2577, 544, 293, 544, 295, 341, 411, 50876], "temperature": 0.0, "avg_logprob": -0.19957716853722282, "compression_ratio": 1.6258503401360545, "no_speech_prob": 0.0005930213374085724}, {"id": 186, "seek": 108356, "start": 1093.8, "end": 1098.6799999999998, "text": " pylon right from all these all these different competitors in this space. And this is a land grab", "tokens": [50876, 280, 34926, 558, 490, 439, 613, 439, 613, 819, 18333, 294, 341, 1901, 13, 400, 341, 307, 257, 2117, 4444, 51120], "temperature": 0.0, "avg_logprob": -0.19957716853722282, "compression_ratio": 1.6258503401360545, "no_speech_prob": 0.0005930213374085724}, {"id": 187, "seek": 108356, "start": 1098.6799999999998, "end": 1103.72, "text": " like Magnum mistake. This is the the sort of man, I don't call it the scramble for Africa moment,", "tokens": [51120, 411, 19664, 449, 6146, 13, 639, 307, 264, 264, 1333, 295, 587, 11, 286, 500, 380, 818, 309, 264, 795, 48382, 337, 7349, 1623, 11, 51372], "temperature": 0.0, "avg_logprob": -0.19957716853722282, "compression_ratio": 1.6258503401360545, "no_speech_prob": 0.0005930213374085724}, {"id": 188, "seek": 108356, "start": 1103.72, "end": 1108.36, "text": " but the you know, historically equivalent of that in this space with fewer controversial overtones.", "tokens": [51372, 457, 264, 291, 458, 11, 16180, 10344, 295, 300, 294, 341, 1901, 365, 13366, 17323, 670, 46272, 13, 51604], "temperature": 0.0, "avg_logprob": -0.19957716853722282, "compression_ratio": 1.6258503401360545, "no_speech_prob": 0.0005930213374085724}, {"id": 189, "seek": 110836, "start": 1108.3600000000001, "end": 1113.4, "text": " This is the moment where people are realizing, hey, you know what? We need to get on people's", "tokens": [50364, 639, 307, 264, 1623, 689, 561, 366, 16734, 11, 4177, 11, 291, 458, 437, 30, 492, 643, 281, 483, 322, 561, 311, 50616], "temperature": 0.0, "avg_logprob": -0.14020155143737792, "compression_ratio": 1.7282608695652173, "no_speech_prob": 0.0017088237218558788}, {"id": 190, "seek": 110836, "start": 1113.4, "end": 1119.8000000000002, "text": " local machines, right? We have to get some kind of piece of that pie because so much of what's", "tokens": [50616, 2654, 8379, 11, 558, 30, 492, 362, 281, 483, 512, 733, 295, 2522, 295, 300, 1730, 570, 370, 709, 295, 437, 311, 50936], "temperature": 0.0, "avg_logprob": -0.14020155143737792, "compression_ratio": 1.7282608695652173, "no_speech_prob": 0.0017088237218558788}, {"id": 191, "seek": 110836, "start": 1119.8000000000002, "end": 1126.0400000000002, "text": " what's happening right now is that we're trying to like grab onto people's like agentic like the", "tokens": [50936, 437, 311, 2737, 558, 586, 307, 300, 321, 434, 1382, 281, 411, 4444, 3911, 561, 311, 411, 9461, 299, 411, 264, 51248], "temperature": 0.0, "avg_logprob": -0.14020155143737792, "compression_ratio": 1.7282608695652173, "no_speech_prob": 0.0017088237218558788}, {"id": 192, "seek": 110836, "start": 1126.0400000000002, "end": 1129.8000000000002, "text": " the kind of agentic runtime layer. In fact, in video, we'll talk about this in a minute with", "tokens": [51248, 264, 733, 295, 9461, 299, 34474, 4583, 13, 682, 1186, 11, 294, 960, 11, 321, 603, 751, 466, 341, 294, 257, 3456, 365, 51436], "temperature": 0.0, "avg_logprob": -0.14020155143737792, "compression_ratio": 1.7282608695652173, "no_speech_prob": 0.0017088237218558788}, {"id": 193, "seek": 110836, "start": 1129.8000000000002, "end": 1134.5200000000002, "text": " Neemoclaw. It's the same thing. Everybody's trying to get on to like what is what is the substrate", "tokens": [51436, 1734, 443, 905, 5901, 13, 467, 311, 264, 912, 551, 13, 7646, 311, 1382, 281, 483, 322, 281, 411, 437, 307, 437, 307, 264, 27585, 51672], "temperature": 0.0, "avg_logprob": -0.14020155143737792, "compression_ratio": 1.7282608695652173, "no_speech_prob": 0.0017088237218558788}, {"id": 194, "seek": 113452, "start": 1134.52, "end": 1138.92, "text": " on which agents are going to run? Can I get my dirty little hands on that and and turn that into", "tokens": [50364, 322, 597, 12554, 366, 516, 281, 1190, 30, 1664, 286, 483, 452, 9360, 707, 2377, 322, 300, 293, 293, 1261, 300, 666, 50584], "temperature": 0.0, "avg_logprob": -0.13884061730426292, "compression_ratio": 1.6538461538461537, "no_speech_prob": 0.0024734982289373875}, {"id": 195, "seek": 113452, "start": 1138.92, "end": 1144.92, "text": " part of my market? And in this case, the menace play is quite interesting and it's aging well,", "tokens": [50584, 644, 295, 452, 2142, 30, 400, 294, 341, 1389, 11, 264, 1706, 617, 862, 307, 1596, 1880, 293, 309, 311, 19090, 731, 11, 50884], "temperature": 0.0, "avg_logprob": -0.13884061730426292, "compression_ratio": 1.6538461538461537, "no_speech_prob": 0.0024734982289373875}, {"id": 196, "seek": 113452, "start": 1144.92, "end": 1149.32, "text": " right? I mean, menace was an in it like totally independent company before the acquisition.", "tokens": [50884, 558, 30, 286, 914, 11, 1706, 617, 390, 364, 294, 309, 411, 3879, 6695, 2237, 949, 264, 21668, 13, 51104], "temperature": 0.0, "avg_logprob": -0.13884061730426292, "compression_ratio": 1.6538461538461537, "no_speech_prob": 0.0024734982289373875}, {"id": 197, "seek": 113452, "start": 1149.32, "end": 1156.04, "text": " And really the play here is meta extending its reach into that that local OS layer for the first", "tokens": [51104, 400, 534, 264, 862, 510, 307, 19616, 24360, 1080, 2524, 666, 300, 300, 2654, 12731, 4583, 337, 264, 700, 51440], "temperature": 0.0, "avg_logprob": -0.13884061730426292, "compression_ratio": 1.6538461538461537, "no_speech_prob": 0.0024734982289373875}, {"id": 198, "seek": 113452, "start": 1156.04, "end": 1160.2, "text": " time. It's this is the territory historically of you know, your apples, your Microsoft, your", "tokens": [51440, 565, 13, 467, 311, 341, 307, 264, 11360, 16180, 295, 291, 458, 11, 428, 16814, 11, 428, 8116, 11, 428, 51648], "temperature": 0.0, "avg_logprob": -0.13884061730426292, "compression_ratio": 1.6538461538461537, "no_speech_prob": 0.0024734982289373875}, {"id": 199, "seek": 116020, "start": 1160.2, "end": 1165.0, "text": " Google's. That's the war they're entering. We haven't seen meta do that before, right? We just", "tokens": [50364, 3329, 311, 13, 663, 311, 264, 1516, 436, 434, 11104, 13, 492, 2378, 380, 1612, 19616, 360, 300, 949, 11, 558, 30, 492, 445, 50604], "temperature": 0.0, "avg_logprob": -0.13158124020631365, "compression_ratio": 1.7867867867867868, "no_speech_prob": 0.007885085418820381}, {"id": 200, "seek": 116020, "start": 1165.0, "end": 1169.8, "text": " haven't seen them try to play in the operating system game. This is a really interesting way for", "tokens": [50604, 2378, 380, 1612, 552, 853, 281, 862, 294, 264, 7447, 1185, 1216, 13, 639, 307, 257, 534, 1880, 636, 337, 50844], "temperature": 0.0, "avg_logprob": -0.13158124020631365, "compression_ratio": 1.7867867867867868, "no_speech_prob": 0.007885085418820381}, {"id": 201, "seek": 116020, "start": 1169.8, "end": 1175.48, "text": " them to vector into a completely different market using resources that you know, maybe for the first", "tokens": [50844, 552, 281, 8062, 666, 257, 2584, 819, 2142, 1228, 3593, 300, 291, 458, 11, 1310, 337, 264, 700, 51128], "temperature": 0.0, "avg_logprob": -0.13158124020631365, "compression_ratio": 1.7867867867867868, "no_speech_prob": 0.007885085418820381}, {"id": 202, "seek": 116020, "start": 1175.48, "end": 1179.32, "text": " time they have like a credible, whether you call it advantage or not, they have a credible play here.", "tokens": [51128, 565, 436, 362, 411, 257, 32757, 11, 1968, 291, 818, 309, 5002, 420, 406, 11, 436, 362, 257, 32757, 862, 510, 13, 51320], "temperature": 0.0, "avg_logprob": -0.13158124020631365, "compression_ratio": 1.7867867867867868, "no_speech_prob": 0.007885085418820381}, {"id": 203, "seek": 116020, "start": 1179.32, "end": 1183.64, "text": " So pretty interesting. And again, meta classic history of buying their way into competing, right? We", "tokens": [51320, 407, 1238, 1880, 13, 400, 797, 11, 19616, 7230, 2503, 295, 6382, 641, 636, 666, 15439, 11, 558, 30, 492, 51536], "temperature": 0.0, "avg_logprob": -0.13158124020631365, "compression_ratio": 1.7867867867867868, "no_speech_prob": 0.007885085418820381}, {"id": 204, "seek": 116020, "start": 1183.64, "end": 1188.3600000000001, "text": " saw this with WhatsApp with Instagram like it, it never ends. This is their play in that direction.", "tokens": [51536, 1866, 341, 365, 30513, 365, 5281, 411, 309, 11, 309, 1128, 5314, 13, 639, 307, 641, 862, 294, 300, 3513, 13, 51772], "temperature": 0.0, "avg_logprob": -0.13158124020631365, "compression_ratio": 1.7867867867867868, "no_speech_prob": 0.007885085418820381}, {"id": 205, "seek": 118836, "start": 1188.3600000000001, "end": 1191.0800000000002, "text": " So, menace, you know, maybe aging well, we've got to see what comes of this.", "tokens": [50364, 407, 11, 1706, 617, 11, 291, 458, 11, 1310, 19090, 731, 11, 321, 600, 658, 281, 536, 437, 1487, 295, 341, 13, 50500], "temperature": 0.0, "avg_logprob": -0.28844761646400063, "compression_ratio": 1.3289473684210527, "no_speech_prob": 0.016713351011276245}, {"id": 206, "seek": 0, "start": 1194.32, "end": 1200.4, "text": " came out I believe last year initially with a cloud based agent but you could assign to go do", "tokens": [51080, 1361, 484, 286, 1697, 1036, 1064, 9105, 365, 257, 4588, 2361, 9461, 457, 291, 727, 6269, 281, 352, 360, 51384], "temperature": 0.0, "avg_logprob": -0.20028368291400728, "compression_ratio": 1.6594202898550725, "no_speech_prob": 0.1849641054868698}, {"id": 207, "seek": 0, "start": 1200.4, "end": 1205.76, "text": " stuff so this is them extending to your local computer right now you can I think get this", "tokens": [51384, 1507, 370, 341, 307, 552, 24360, 281, 428, 2654, 3820, 558, 586, 291, 393, 286, 519, 483, 341, 51652], "temperature": 0.0, "avg_logprob": -0.20028368291400728, "compression_ratio": 1.6594202898550725, "no_speech_prob": 0.1849641054868698}, {"id": 208, "seek": 2576, "start": 1205.76, "end": 1213.12, "text": " for silicon based max the other aspect of this by way is not just open claw it's the core work", "tokens": [50364, 337, 22848, 2361, 11469, 264, 661, 4171, 295, 341, 538, 636, 307, 406, 445, 1269, 32019, 309, 311, 264, 4965, 589, 50732], "temperature": 0.0, "avg_logprob": -0.18503537217339316, "compression_ratio": 1.6936936936936937, "no_speech_prob": 0.012284253723919392}, {"id": 209, "seek": 2576, "start": 1213.12, "end": 1220.4, "text": " the codex angle where they most of the blockpost actually is highlighting like let the agent go", "tokens": [50732, 264, 3089, 87, 5802, 689, 436, 881, 295, 264, 3461, 23744, 767, 307, 26551, 411, 718, 264, 9461, 352, 51096], "temperature": 0.0, "avg_logprob": -0.18503537217339316, "compression_ratio": 1.6936936936936937, "no_speech_prob": 0.012284253723919392}, {"id": 210, "seek": 2576, "start": 1220.4, "end": 1226.48, "text": " to computer and organize files and do things that are very much cloud code or or now core work", "tokens": [51096, 281, 3820, 293, 13859, 7098, 293, 360, 721, 300, 366, 588, 709, 4588, 3089, 420, 420, 586, 4965, 589, 51400], "temperature": 0.0, "avg_logprob": -0.18503537217339316, "compression_ratio": 1.6936936936936937, "no_speech_prob": 0.012284253723919392}, {"id": 211, "seek": 2576, "start": 1226.48, "end": 1233.2, "text": " ask and we kind of tell it to do stuff from anywhere at any point is an aspect of it's the", "tokens": [51400, 1029, 293, 321, 733, 295, 980, 309, 281, 360, 1507, 490, 4992, 412, 604, 935, 307, 364, 4171, 295, 309, 311, 264, 51736], "temperature": 0.0, "avg_logprob": -0.18503537217339316, "compression_ratio": 1.6936936936936937, "no_speech_prob": 0.012284253723919392}, {"id": 212, "seek": 5320, "start": 1233.2, "end": 1240.56, "text": " opaquaspect but I think the real land grab is for that core work type like have an agent do stuff", "tokens": [50364, 999, 23761, 296, 1043, 457, 286, 519, 264, 957, 2117, 4444, 307, 337, 300, 4965, 589, 2010, 411, 362, 364, 9461, 360, 1507, 50732], "temperature": 0.0, "avg_logprob": -0.19650775761060094, "compression_ratio": 1.6814159292035398, "no_speech_prob": 0.003950241953134537}, {"id": 213, "seek": 5320, "start": 1240.56, "end": 1248.8, "text": " for you which is now like everywhere in coding but I think what or on topic and now open AI", "tokens": [50732, 337, 291, 597, 307, 586, 411, 5315, 294, 17720, 457, 286, 519, 437, 420, 322, 4829, 293, 586, 1269, 7318, 51144], "temperature": 0.0, "avg_logprob": -0.19650775761060094, "compression_ratio": 1.6814159292035398, "no_speech_prob": 0.003950241953134537}, {"id": 214, "seek": 5320, "start": 1248.8, "end": 1255.12, "text": " and now everyone is realizing is these agents can like do a whole bunch of stuff and that people", "tokens": [51144, 293, 586, 1518, 307, 16734, 307, 613, 12554, 393, 411, 360, 257, 1379, 3840, 295, 1507, 293, 300, 561, 51460], "temperature": 0.0, "avg_logprob": -0.19650775761060094, "compression_ratio": 1.6814159292035398, "no_speech_prob": 0.003950241953134537}, {"id": 215, "seek": 5320, "start": 1255.12, "end": 1261.92, "text": " haven't adopted to do yet and speaking of open claw next up and video has announced Nima claw", "tokens": [51460, 2378, 380, 12175, 281, 360, 1939, 293, 4124, 295, 1269, 32019, 958, 493, 293, 960, 575, 7548, 426, 4775, 32019, 51800], "temperature": 0.0, "avg_logprob": -0.19650775761060094, "compression_ratio": 1.6814159292035398, "no_speech_prob": 0.003950241953134537}, {"id": 216, "seek": 8192, "start": 1261.92, "end": 1269.12, "text": " as part of their announcements at gtc which is a little bit of funny this is a stack for the", "tokens": [50364, 382, 644, 295, 641, 23785, 412, 290, 83, 66, 597, 307, 257, 707, 857, 295, 4074, 341, 307, 257, 8630, 337, 264, 50724], "temperature": 0.0, "avg_logprob": -0.20661042173477737, "compression_ratio": 1.6846846846846846, "no_speech_prob": 0.003327314741909504}, {"id": 217, "seek": 8192, "start": 1269.12, "end": 1275.76, "text": " open claw agent platforms that allows you to install and video name-atron models we just discussed", "tokens": [50724, 1269, 32019, 9461, 9473, 300, 4045, 291, 281, 3625, 293, 960, 1315, 12, 267, 2044, 5245, 321, 445, 7152, 51056], "temperature": 0.0, "avg_logprob": -0.20661042173477737, "compression_ratio": 1.6846846846846846, "no_speech_prob": 0.003327314741909504}, {"id": 218, "seek": 8192, "start": 1275.76, "end": 1282.48, "text": " this latest name-atron model last week and there's a new and video open shell runtime you", "tokens": [51056, 341, 6792, 1315, 12, 267, 2044, 2316, 1036, 1243, 293, 456, 311, 257, 777, 293, 960, 1269, 8720, 34474, 291, 51392], "temperature": 0.0, "avg_logprob": -0.20661042173477737, "compression_ratio": 1.6846846846846846, "no_speech_prob": 0.003327314741909504}, {"id": 219, "seek": 8192, "start": 1282.48, "end": 1289.6, "text": " install both of those in a single command and you get privacy and security controls baked in", "tokens": [51392, 3625, 1293, 295, 729, 294, 257, 2167, 5622, 293, 291, 483, 11427, 293, 3825, 9003, 19453, 294, 51748], "temperature": 0.0, "avg_logprob": -0.20661042173477737, "compression_ratio": 1.6846846846846846, "no_speech_prob": 0.003327314741909504}, {"id": 220, "seek": 10960, "start": 1289.6, "end": 1297.04, "text": " making it possible to have you know more more confidence in running one of these things we've", "tokens": [50364, 1455, 309, 1944, 281, 362, 291, 458, 544, 544, 6687, 294, 2614, 472, 295, 613, 721, 321, 600, 50736], "temperature": 0.0, "avg_logprob": -0.1389279385052976, "compression_ratio": 1.6768558951965065, "no_speech_prob": 0.0009927309583872557}, {"id": 221, "seek": 10960, "start": 1297.04, "end": 1302.72, "text": " seen many stories of open claw go in raw it's right absolute confidence yeah so open shell provides", "tokens": [50736, 1612, 867, 3676, 295, 1269, 32019, 352, 294, 8936, 309, 311, 558, 8236, 6687, 1338, 370, 1269, 8720, 6417, 51020], "temperature": 0.0, "avg_logprob": -0.1389279385052976, "compression_ratio": 1.6768558951965065, "no_speech_prob": 0.0009927309583872557}, {"id": 222, "seek": 10960, "start": 1302.72, "end": 1310.0, "text": " an isolated sandbox that enforces policy-based security network and privacy guard rails for", "tokens": [51020, 364, 14621, 42115, 300, 25495, 887, 3897, 12, 6032, 3825, 3209, 293, 11427, 6290, 27649, 337, 51384], "temperature": 0.0, "avg_logprob": -0.1389279385052976, "compression_ratio": 1.6768558951965065, "no_speech_prob": 0.0009927309583872557}, {"id": 223, "seek": 10960, "start": 1310.0, "end": 1317.12, "text": " agents seems like a good idea if you are to do one of these agents maybe install them in a sandbox", "tokens": [51384, 12554, 2544, 411, 257, 665, 1558, 498, 291, 366, 281, 360, 472, 295, 613, 12554, 1310, 3625, 552, 294, 257, 42115, 51740], "temperature": 0.0, "avg_logprob": -0.1389279385052976, "compression_ratio": 1.6768558951965065, "no_speech_prob": 0.0009927309583872557}, {"id": 224, "seek": 13712, "start": 1317.12, "end": 1322.8, "text": " where you can control them and they don't go rogue and you know take over the world yeah and one of", "tokens": [50364, 689, 291, 393, 1969, 552, 293, 436, 500, 380, 352, 39100, 293, 291, 458, 747, 670, 264, 1002, 1338, 293, 472, 295, 50648], "temperature": 0.0, "avg_logprob": -0.09657411836087704, "compression_ratio": 2.066901408450704, "no_speech_prob": 0.004746066872030497}, {"id": 225, "seek": 13712, "start": 1322.8, "end": 1327.52, "text": " the the key dimensions here you know you think about what does sandbox mean you know how do these", "tokens": [50648, 264, 264, 2141, 12819, 510, 291, 458, 291, 519, 466, 437, 775, 42115, 914, 291, 458, 577, 360, 613, 50884], "temperature": 0.0, "avg_logprob": -0.09657411836087704, "compression_ratio": 2.066901408450704, "no_speech_prob": 0.004746066872030497}, {"id": 226, "seek": 13712, "start": 1327.52, "end": 1332.4, "text": " things work typically a lot of these things focus on what is the model that is on the running on", "tokens": [50884, 721, 589, 5850, 257, 688, 295, 613, 721, 1879, 322, 437, 307, 264, 2316, 300, 307, 322, 264, 2614, 322, 51128], "temperature": 0.0, "avg_logprob": -0.09657411836087704, "compression_ratio": 2.066901408450704, "no_speech_prob": 0.004746066872030497}, {"id": 227, "seek": 13712, "start": 1332.4, "end": 1336.88, "text": " the cloud and what are the models for model that's running locally on your machine right so you", "tokens": [51128, 264, 4588, 293, 437, 366, 264, 5245, 337, 2316, 300, 311, 2614, 16143, 322, 428, 3479, 558, 370, 291, 51352], "temperature": 0.0, "avg_logprob": -0.09657411836087704, "compression_ratio": 2.066901408450704, "no_speech_prob": 0.004746066872030497}, {"id": 228, "seek": 13712, "start": 1336.88, "end": 1340.64, "text": " imagine you might not want the model that runs locally on your machine that's actually looking at", "tokens": [51352, 3811, 291, 1062, 406, 528, 264, 2316, 300, 6676, 16143, 322, 428, 3479, 300, 311, 767, 1237, 412, 51540], "temperature": 0.0, "avg_logprob": -0.09657411836087704, "compression_ratio": 2.066901408450704, "no_speech_prob": 0.004746066872030497}, {"id": 229, "seek": 13712, "start": 1340.64, "end": 1345.6, "text": " your own intimate files to have direct access to the internet right so that's kind of like one way", "tokens": [51540, 428, 1065, 20215, 7098, 281, 362, 2047, 2105, 281, 264, 4705, 558, 370, 300, 311, 733, 295, 411, 472, 636, 51788], "temperature": 0.0, "avg_logprob": -0.09657411836087704, "compression_ratio": 2.066901408450704, "no_speech_prob": 0.004746066872030497}, {"id": 230, "seek": 16560, "start": 1345.6, "end": 1349.6, "text": " that you might enforce that sort of guard rail so use that local model just like generate summaries", "tokens": [50364, 300, 291, 1062, 24825, 300, 1333, 295, 6290, 8765, 370, 764, 300, 2654, 2316, 445, 411, 8460, 8367, 4889, 50564], "temperature": 0.0, "avg_logprob": -0.10083262255149228, "compression_ratio": 1.9315960912052117, "no_speech_prob": 0.0007501086802221835}, {"id": 231, "seek": 16560, "start": 1349.6, "end": 1354.08, "text": " or do analysis and stuff and then just ship the summaries after some review or something like that", "tokens": [50564, 420, 360, 5215, 293, 1507, 293, 550, 445, 5374, 264, 8367, 4889, 934, 512, 3131, 420, 746, 411, 300, 50788], "temperature": 0.0, "avg_logprob": -0.10083262255149228, "compression_ratio": 1.9315960912052117, "no_speech_prob": 0.0007501086802221835}, {"id": 232, "seek": 16560, "start": 1354.08, "end": 1358.16, "text": " to you know that's like that's one way to to play that game and there are a whole bunch of other", "tokens": [50788, 281, 291, 458, 300, 311, 411, 300, 311, 472, 636, 281, 281, 862, 300, 1216, 293, 456, 366, 257, 1379, 3840, 295, 661, 50992], "temperature": 0.0, "avg_logprob": -0.10083262255149228, "compression_ratio": 1.9315960912052117, "no_speech_prob": 0.0007501086802221835}, {"id": 233, "seek": 16560, "start": 1358.16, "end": 1362.24, "text": " guard rails around the kind of access that different models can have in your ability to net so that's", "tokens": [50992, 6290, 27649, 926, 264, 733, 295, 2105, 300, 819, 5245, 393, 362, 294, 428, 3485, 281, 2533, 370, 300, 311, 51196], "temperature": 0.0, "avg_logprob": -0.10083262255149228, "compression_ratio": 1.9315960912052117, "no_speech_prob": 0.0007501086802221835}, {"id": 234, "seek": 16560, "start": 1362.24, "end": 1368.96, "text": " kind of like a lot of where this is coming from you know this is a classic example of jensen's", "tokens": [51196, 733, 295, 411, 257, 688, 295, 689, 341, 307, 1348, 490, 291, 458, 341, 307, 257, 7230, 1365, 295, 361, 32934, 311, 51532], "temperature": 0.0, "avg_logprob": -0.10083262255149228, "compression_ratio": 1.9315960912052117, "no_speech_prob": 0.0007501086802221835}, {"id": 235, "seek": 16560, "start": 1368.96, "end": 1374.56, "text": " hyperbolic rhetoric right he's he's using terms like new renaissance and software to kind of to play", "tokens": [51532, 9848, 65, 7940, 29604, 558, 415, 311, 415, 311, 1228, 2115, 411, 777, 319, 629, 14431, 293, 4722, 281, 733, 295, 281, 862, 51812], "temperature": 0.0, "avg_logprob": -0.10083262255149228, "compression_ratio": 1.9315960912052117, "no_speech_prob": 0.0007501086802221835}, {"id": 236, "seek": 19456, "start": 1374.56, "end": 1379.6, "text": " this up which to be clear I actually I mean I agree with this but there's like there's a gap between", "tokens": [50364, 341, 493, 597, 281, 312, 1850, 286, 767, 286, 914, 286, 3986, 365, 341, 457, 456, 311, 411, 456, 311, 257, 7417, 1296, 50616], "temperature": 0.0, "avg_logprob": -0.08511293518231876, "compression_ratio": 1.879746835443038, "no_speech_prob": 0.0027922780718654394}, {"id": 237, "seek": 19456, "start": 1379.6, "end": 1384.4, "text": " you know this framework can help you install you know a single demand and redefining how computing is", "tokens": [50616, 291, 458, 341, 8388, 393, 854, 291, 3625, 291, 458, 257, 2167, 4733, 293, 38818, 1760, 577, 15866, 307, 50856], "temperature": 0.0, "avg_logprob": -0.08511293518231876, "compression_ratio": 1.879746835443038, "no_speech_prob": 0.0027922780718654394}, {"id": 238, "seek": 19456, "start": 1384.4, "end": 1389.12, "text": " done and we'll see if that that chasm gets gets crossed and it it probably well at some point man", "tokens": [50856, 1096, 293, 321, 603, 536, 498, 300, 300, 417, 14774, 2170, 2170, 14622, 293, 309, 309, 1391, 731, 412, 512, 935, 587, 51092], "temperature": 0.0, "avg_logprob": -0.08511293518231876, "compression_ratio": 1.879746835443038, "no_speech_prob": 0.0027922780718654394}, {"id": 239, "seek": 19456, "start": 1389.12, "end": 1394.08, "text": " I think it's certainly well at some point question is who does it first so the frame here is you", "tokens": [51092, 286, 519, 309, 311, 3297, 731, 412, 512, 935, 1168, 307, 567, 775, 309, 700, 370, 264, 3920, 510, 307, 291, 51340], "temperature": 0.0, "avg_logprob": -0.08511293518231876, "compression_ratio": 1.879746835443038, "no_speech_prob": 0.0027922780718654394}, {"id": 240, "seek": 19456, "start": 1394.08, "end": 1398.16, "text": " know again that operating system piece right keep going back to this it's not a coincidence that", "tokens": [51340, 458, 797, 300, 7447, 1185, 2522, 558, 1066, 516, 646, 281, 341, 309, 311, 406, 257, 22137, 300, 51544], "temperature": 0.0, "avg_logprob": -0.08511293518231876, "compression_ratio": 1.879746835443038, "no_speech_prob": 0.0027922780718654394}, {"id": 241, "seek": 19456, "start": 1398.16, "end": 1403.52, "text": " everybody is is going on the same basically this gold rush expedition everybody's thinking the same", "tokens": [51544, 2201, 307, 307, 516, 322, 264, 912, 1936, 341, 3821, 9300, 30359, 2201, 311, 1953, 264, 912, 51812], "temperature": 0.0, "avg_logprob": -0.08511293518231876, "compression_ratio": 1.879746835443038, "no_speech_prob": 0.0027922780718654394}, {"id": 242, "seek": 22352, "start": 1403.52, "end": 1410.24, "text": " thing the operating system for personal AI that layer and again jensen here comparing mac and", "tokens": [50364, 551, 264, 7447, 1185, 337, 2973, 7318, 300, 4583, 293, 797, 361, 32934, 510, 15763, 7912, 293, 50700], "temperature": 0.0, "avg_logprob": -0.153874014377594, "compression_ratio": 1.9554794520547945, "no_speech_prob": 0.0010022995993494987}, {"id": 243, "seek": 22352, "start": 1410.24, "end": 1415.3600000000001, "text": " windows for PCs to open cloth for personal AI that's a deliberate a bold clamp a very deliberate", "tokens": [50700, 9309, 337, 46913, 281, 1269, 13619, 337, 2973, 7318, 300, 311, 257, 30515, 257, 11928, 17690, 257, 588, 30515, 50956], "temperature": 0.0, "avg_logprob": -0.153874014377594, "compression_ratio": 1.9554794520547945, "no_speech_prob": 0.0010022995993494987}, {"id": 244, "seek": 22352, "start": 1415.3600000000001, "end": 1420.48, "text": " this is part of that frame this is Nvidia now saying hey meta is going to get into the effectively", "tokens": [50956, 341, 307, 644, 295, 300, 3920, 341, 307, 46284, 586, 1566, 4177, 19616, 307, 516, 281, 483, 666, 264, 8659, 51212], "temperature": 0.0, "avg_logprob": -0.153874014377594, "compression_ratio": 1.9554794520547945, "no_speech_prob": 0.0010022995993494987}, {"id": 245, "seek": 22352, "start": 1420.48, "end": 1424.72, "text": " the operating system game the the sort of operating system for agents we're going to do the same", "tokens": [51212, 264, 7447, 1185, 1216, 264, 264, 1333, 295, 7447, 1185, 337, 12554, 321, 434, 516, 281, 360, 264, 912, 51424], "temperature": 0.0, "avg_logprob": -0.153874014377594, "compression_ratio": 1.9554794520547945, "no_speech_prob": 0.0010022995993494987}, {"id": 246, "seek": 22352, "start": 1424.72, "end": 1428.88, "text": " thing we don't have history quite of of doing that but you know now we're diving into so you're", "tokens": [51424, 551, 321, 500, 380, 362, 2503, 1596, 295, 295, 884, 300, 457, 291, 458, 586, 321, 434, 20241, 666, 370, 291, 434, 51632], "temperature": 0.0, "avg_logprob": -0.153874014377594, "compression_ratio": 1.9554794520547945, "no_speech_prob": 0.0010022995993494987}, {"id": 247, "seek": 22352, "start": 1428.88, "end": 1433.28, "text": " creating creating this environment where because essentially agentic you know we had the", "tokens": [51632, 4084, 4084, 341, 2823, 689, 570, 4476, 9461, 299, 291, 458, 321, 632, 264, 51852], "temperature": 0.0, "avg_logprob": -0.153874014377594, "compression_ratio": 1.9554794520547945, "no_speech_prob": 0.0010022995993494987}, {"id": 248, "seek": 25328, "start": 1433.28, "end": 1438.32, "text": " software eats the world era of the sort of sass revolution you know in the last decade decade", "tokens": [50364, 4722, 18109, 264, 1002, 4249, 295, 264, 1333, 295, 262, 640, 8894, 291, 458, 294, 264, 1036, 10378, 10378, 50616], "temperature": 0.0, "avg_logprob": -0.11241141223797092, "compression_ratio": 1.8091603053435115, "no_speech_prob": 0.0001351827959297225}, {"id": 249, "seek": 25328, "start": 1438.32, "end": 1443.12, "text": " and a half two decades and now we're in the AI is eating the world including software and another", "tokens": [50616, 293, 257, 1922, 732, 7878, 293, 586, 321, 434, 294, 264, 7318, 307, 3936, 264, 1002, 3009, 4722, 293, 1071, 50856], "temperature": 0.0, "avg_logprob": -0.11241141223797092, "compression_ratio": 1.8091603053435115, "no_speech_prob": 0.0001351827959297225}, {"id": 250, "seek": 25328, "start": 1443.12, "end": 1448.48, "text": " layer of abstraction here is yeah that runtime environment for agents and and that's the land", "tokens": [50856, 4583, 295, 37765, 510, 307, 1338, 300, 34474, 2823, 337, 12554, 293, 293, 300, 311, 264, 2117, 51124], "temperature": 0.0, "avg_logprob": -0.11241141223797092, "compression_ratio": 1.8091603053435115, "no_speech_prob": 0.0001351827959297225}, {"id": 251, "seek": 25328, "start": 1448.48, "end": 1452.6399999999999, "text": " ram that's the operating system at least this is the case that jensen's making I think it's a", "tokens": [51124, 10211, 300, 311, 264, 7447, 1185, 412, 1935, 341, 307, 264, 1389, 300, 361, 32934, 311, 1455, 286, 519, 309, 311, 257, 51332], "temperature": 0.0, "avg_logprob": -0.11241141223797092, "compression_ratio": 1.8091603053435115, "no_speech_prob": 0.0001351827959297225}, {"id": 252, "seek": 25328, "start": 1452.6399999999999, "end": 1457.04, "text": " reasonable case on the whole but remains to be seen this is functionally also a classic Nvidia", "tokens": [51332, 10585, 1389, 322, 264, 1379, 457, 7023, 281, 312, 1612, 341, 307, 2445, 379, 611, 257, 7230, 46284, 51552], "temperature": 0.0, "avg_logprob": -0.11241141223797092, "compression_ratio": 1.8091603053435115, "no_speech_prob": 0.0001351827959297225}, {"id": 253, "seek": 27704, "start": 1457.04, "end": 1464.32, "text": " play of like trying to trojan horse in their full stack so nemoclaw is going to install nematron", "tokens": [50364, 862, 295, 411, 1382, 281, 4495, 14763, 6832, 294, 641, 1577, 8630, 370, 9939, 905, 5901, 307, 516, 281, 3625, 9939, 267, 2044, 50728], "temperature": 0.0, "avg_logprob": -0.11234074824110225, "compression_ratio": 1.7830882352941178, "no_speech_prob": 0.002416132250800729}, {"id": 254, "seek": 27704, "start": 1464.32, "end": 1469.92, "text": " models right those Nvidia models preferentially and the open shell runtime all in one command the", "tokens": [50728, 5245, 558, 729, 46284, 5245, 4382, 3137, 293, 264, 1269, 8720, 34474, 439, 294, 472, 5622, 264, 51008], "temperature": 0.0, "avg_logprob": -0.11234074824110225, "compression_ratio": 1.7830882352941178, "no_speech_prob": 0.002416132250800729}, {"id": 255, "seek": 27704, "start": 1469.92, "end": 1475.8400000000001, "text": " simplicity here is the point so it's super super easy to deploy Nvidia's own models together and", "tokens": [51008, 25632, 510, 307, 264, 935, 370, 309, 311, 1687, 1687, 1858, 281, 7274, 46284, 311, 1065, 5245, 1214, 293, 51304], "temperature": 0.0, "avg_logprob": -0.11234074824110225, "compression_ratio": 1.7830882352941178, "no_speech_prob": 0.002416132250800729}, {"id": 256, "seek": 27704, "start": 1475.8400000000001, "end": 1479.92, "text": " the runtime environment all that stuff basically they're creating a situation where just like they", "tokens": [51304, 264, 34474, 2823, 439, 300, 1507, 1936, 436, 434, 4084, 257, 2590, 689, 445, 411, 436, 51508], "temperature": 0.0, "avg_logprob": -0.11234074824110225, "compression_ratio": 1.7830882352941178, "no_speech_prob": 0.002416132250800729}, {"id": 257, "seek": 27704, "start": 1479.92, "end": 1485.6, "text": " owned kuda for the the GPUs they're creating a whole software stack around agentic the agentic", "tokens": [51508, 11684, 350, 11152, 337, 264, 264, 18407, 82, 436, 434, 4084, 257, 1379, 4722, 8630, 926, 9461, 299, 264, 9461, 299, 51792], "temperature": 0.0, "avg_logprob": -0.11234074824110225, "compression_ratio": 1.7830882352941178, "no_speech_prob": 0.002416132250800729}, {"id": 258, "seek": 30560, "start": 1485.6, "end": 1490.48, "text": " runtime environment and so this is what's worked for them in the past create a whole ecosystem around", "tokens": [50364, 34474, 2823, 293, 370, 341, 307, 437, 311, 2732, 337, 552, 294, 264, 1791, 1884, 257, 1379, 11311, 926, 50608], "temperature": 0.0, "avg_logprob": -0.08731050796520251, "compression_ratio": 1.7822878228782288, "no_speech_prob": 0.0004945959663018584}, {"id": 259, "seek": 30560, "start": 1490.48, "end": 1495.44, "text": " this software is more commoditized now that it has been before so that may not be the same kind of", "tokens": [50608, 341, 4722, 307, 544, 19931, 270, 1602, 586, 300, 309, 575, 668, 949, 370, 300, 815, 406, 312, 264, 912, 733, 295, 50856], "temperature": 0.0, "avg_logprob": -0.08731050796520251, "compression_ratio": 1.7822878228782288, "no_speech_prob": 0.0004945959663018584}, {"id": 260, "seek": 30560, "start": 1495.44, "end": 1499.52, "text": " moat especially given that they're you know unlike before with kuda where they had like a decades", "tokens": [50856, 705, 267, 2318, 2212, 300, 436, 434, 291, 458, 8343, 949, 365, 350, 11152, 689, 436, 632, 411, 257, 7878, 51060], "temperature": 0.0, "avg_logprob": -0.08731050796520251, "compression_ratio": 1.7822878228782288, "no_speech_prob": 0.0004945959663018584}, {"id": 261, "seek": 30560, "start": 1499.52, "end": 1503.6, "text": " head start for anyone really paid attention this is now you know much more in competition with", "tokens": [51060, 1378, 722, 337, 2878, 534, 4835, 3202, 341, 307, 586, 291, 458, 709, 544, 294, 6211, 365, 51264], "temperature": 0.0, "avg_logprob": -0.08731050796520251, "compression_ratio": 1.7822878228782288, "no_speech_prob": 0.0004945959663018584}, {"id": 262, "seek": 30560, "start": 1503.6, "end": 1508.0, "text": " kind of faster moving players so really interesting again chocolate up is another another", "tokens": [51264, 733, 295, 4663, 2684, 4150, 370, 534, 1880, 797, 6215, 493, 307, 1071, 1071, 51484], "temperature": 0.0, "avg_logprob": -0.08731050796520251, "compression_ratio": 1.7822878228782288, "no_speech_prob": 0.0004945959663018584}, {"id": 263, "seek": 32800, "start": 1508.0, "end": 1514.48, "text": " entrant in the sort of operating system for the agents sort of catalog here right I do think", "tokens": [50364, 948, 7541, 294, 264, 1333, 295, 7447, 1185, 337, 264, 12554, 1333, 295, 19746, 510, 558, 286, 360, 519, 50688], "temperature": 0.0, "avg_logprob": -0.1877736556725424, "compression_ratio": 1.622093023255814, "no_speech_prob": 0.002697878982871771}, {"id": 264, "seek": 32800, "start": 1515.6, "end": 1523.28, "text": " interesting aspects is nemaclaw is is kind of like the hype really like good cool branding for", "tokens": [50744, 1880, 7270, 307, 9939, 326, 5901, 307, 307, 733, 295, 411, 264, 24144, 534, 411, 665, 1627, 27279, 337, 51128], "temperature": 0.0, "avg_logprob": -0.1877736556725424, "compression_ratio": 1.622093023255814, "no_speech_prob": 0.002697878982871771}, {"id": 265, "seek": 32800, "start": 1523.28, "end": 1530.72, "text": " the actual major push which is their open agent development platform which is that terminal", "tokens": [51128, 264, 3539, 2563, 2944, 597, 307, 641, 1269, 9461, 3250, 3663, 597, 307, 300, 14709, 51500], "temperature": 0.0, "avg_logprob": -0.1877736556725424, "compression_ratio": 1.622093023255814, "no_speech_prob": 0.002697878982871771}, {"id": 266, "seek": 35072, "start": 1530.8, "end": 1538.88, "text": " based thing it also comes with this Nvidia AIQ blueprint thing which is built on top of", "tokens": [50368, 2361, 551, 309, 611, 1487, 365, 341, 46284, 7318, 48, 35868, 551, 597, 307, 3094, 322, 1192, 295, 50772], "temperature": 0.0, "avg_logprob": -0.15345662496984006, "compression_ratio": 1.6576576576576576, "no_speech_prob": 0.0042541841976344585}, {"id": 267, "seek": 35072, "start": 1538.88, "end": 1545.76, "text": " length chain deep agents and has Nvidia Nemo agent toolkit as these open source kind of options", "tokens": [50772, 4641, 5021, 2452, 12554, 293, 575, 46284, 22210, 78, 9461, 40167, 382, 613, 1269, 4009, 733, 295, 3956, 51116], "temperature": 0.0, "avg_logprob": -0.15345662496984006, "compression_ratio": 1.6576576576576576, "no_speech_prob": 0.0042541841976344585}, {"id": 268, "seek": 35072, "start": 1545.76, "end": 1553.3600000000001, "text": " for building your stack of agents so nemaclaw aspect is like digging back on the thing that", "tokens": [51116, 337, 2390, 428, 8630, 295, 12554, 370, 9939, 326, 5901, 4171, 307, 411, 17343, 646, 322, 264, 551, 300, 51496], "temperature": 0.0, "avg_logprob": -0.15345662496984006, "compression_ratio": 1.6576576576576576, "no_speech_prob": 0.0042541841976344585}, {"id": 269, "seek": 35072, "start": 1553.3600000000001, "end": 1560.3200000000002, "text": " everyone is hyped about and then the actual software frameworks is probably the actual thing", "tokens": [51496, 1518, 307, 43172, 466, 293, 550, 264, 3539, 4722, 29834, 307, 1391, 264, 3539, 551, 51844], "temperature": 0.0, "avg_logprob": -0.15345662496984006, "compression_ratio": 1.6576576576576576, "no_speech_prob": 0.0042541841976344585}, {"id": 270, "seek": 38032, "start": 1560.32, "end": 1567.28, "text": " that Nvidia cares about yeah and this is it right it's like how can I hook on to one big hype train", "tokens": [50364, 300, 46284, 12310, 466, 1338, 293, 341, 307, 309, 558, 309, 311, 411, 577, 393, 286, 6328, 322, 281, 472, 955, 24144, 3847, 50712], "temperature": 0.0, "avg_logprob": -0.11671263514420925, "compression_ratio": 1.6722972972972974, "no_speech_prob": 0.002798121189698577}, {"id": 271, "seek": 38032, "start": 1567.28, "end": 1572.08, "text": " and then exactly use that as a Trojan to get her like everybody dependent on her whole stack including", "tokens": [50712, 293, 550, 2293, 764, 300, 382, 257, 19406, 14763, 281, 483, 720, 411, 2201, 12334, 322, 720, 1379, 8630, 3009, 50952], "temperature": 0.0, "avg_logprob": -0.11671263514420925, "compression_ratio": 1.6722972972972974, "no_speech_prob": 0.002798121189698577}, {"id": 272, "seek": 38032, "start": 1572.08, "end": 1576.48, "text": " their models because there hasn't been the kind of uptake of the nematron models at least that I", "tokens": [50952, 641, 5245, 570, 456, 6132, 380, 668, 264, 733, 295, 493, 27612, 295, 264, 9939, 267, 2044, 5245, 412, 1935, 300, 286, 51172], "temperature": 0.0, "avg_logprob": -0.11671263514420925, "compression_ratio": 1.6722972972972974, "no_speech_prob": 0.002798121189698577}, {"id": 273, "seek": 38032, "start": 1576.48, "end": 1582.1599999999999, "text": " expected to see so far and so this is you know one one opportunity for them to make that happen yeah", "tokens": [51172, 5176, 281, 536, 370, 1400, 293, 370, 341, 307, 291, 458, 472, 472, 2650, 337, 552, 281, 652, 300, 1051, 1338, 51456], "temperature": 0.0, "avg_logprob": -0.11671263514420925, "compression_ratio": 1.6722972972972974, "no_speech_prob": 0.002798121189698577}, {"id": 274, "seek": 38032, "start": 1582.1599999999999, "end": 1590.24, "text": " I think to your point about software kuda obviously is is the number one thing for GPU related", "tokens": [51456, 286, 519, 281, 428, 935, 466, 4722, 350, 11152, 2745, 307, 307, 264, 1230, 472, 551, 337, 18407, 4077, 51860], "temperature": 0.0, "avg_logprob": -0.11671263514420925, "compression_ratio": 1.6722972972972974, "no_speech_prob": 0.002798121189698577}, {"id": 275, "seek": 41024, "start": 1590.24, "end": 1596.32, "text": " kind of execution software but as far as open source packages go and video hasn't had a", "tokens": [50364, 733, 295, 15058, 4722, 457, 382, 1400, 382, 1269, 4009, 17401, 352, 293, 960, 6132, 380, 632, 257, 50668], "temperature": 0.0, "avg_logprob": -0.1595790739412661, "compression_ratio": 1.6211453744493391, "no_speech_prob": 0.0005076000234112144}, {"id": 276, "seek": 41024, "start": 1596.32, "end": 1601.68, "text": " history of really making an impact so for instance length chain is an open source package that", "tokens": [50668, 2503, 295, 534, 1455, 364, 2712, 370, 337, 5197, 4641, 5021, 307, 364, 1269, 4009, 7372, 300, 50936], "temperature": 0.0, "avg_logprob": -0.1595790739412661, "compression_ratio": 1.6211453744493391, "no_speech_prob": 0.0005076000234112144}, {"id": 277, "seek": 41024, "start": 1601.68, "end": 1610.72, "text": " goes back to 2023 that has been a popular for building complex graphs of LLM prompts and agents", "tokens": [50936, 1709, 646, 281, 44377, 300, 575, 668, 257, 3743, 337, 2390, 3997, 24877, 295, 441, 43, 44, 41095, 293, 12554, 51388], "temperature": 0.0, "avg_logprob": -0.1595790739412661, "compression_ratio": 1.6211453744493391, "no_speech_prob": 0.0005076000234112144}, {"id": 278, "seek": 41024, "start": 1610.72, "end": 1617.12, "text": " and so on probably overly complex but anyway it was at an gained broad community adoption", "tokens": [51388, 293, 370, 322, 1391, 24324, 3997, 457, 4033, 309, 390, 412, 364, 12634, 4152, 1768, 19215, 51708], "temperature": 0.0, "avg_logprob": -0.1595790739412661, "compression_ratio": 1.6211453744493391, "no_speech_prob": 0.0005076000234112144}, {"id": 279, "seek": 43712, "start": 1617.76, "end": 1625.44, "text": " so that's more or less been the pattern a lot but with agents and now open claw and whatever", "tokens": [50396, 370, 300, 311, 544, 420, 1570, 668, 264, 5102, 257, 688, 457, 365, 12554, 293, 586, 1269, 32019, 293, 2035, 50780], "temperature": 0.0, "avg_logprob": -0.150761911997924, "compression_ratio": 1.5494505494505495, "no_speech_prob": 0.00911587942391634}, {"id": 280, "seek": 43712, "start": 1625.44, "end": 1631.68, "text": " that was a good time to try and get in on that open source kind of game and one more story about", "tokens": [50780, 300, 390, 257, 665, 565, 281, 853, 293, 483, 294, 322, 300, 1269, 4009, 733, 295, 1216, 293, 472, 544, 1657, 466, 51092], "temperature": 0.0, "avg_logprob": -0.150761911997924, "compression_ratio": 1.5494505494505495, "no_speech_prob": 0.00911587942391634}, {"id": 281, "seek": 43712, "start": 1631.68, "end": 1642.6399999999999, "text": " in video at gtc they announced dlss 5 which is what they're describing as the GPT moment for", "tokens": [51092, 294, 960, 412, 290, 83, 66, 436, 7548, 274, 75, 3810, 1025, 597, 307, 437, 436, 434, 16141, 382, 264, 26039, 51, 1623, 337, 51640], "temperature": 0.0, "avg_logprob": -0.150761911997924, "compression_ratio": 1.5494505494505495, "no_speech_prob": 0.00911587942391634}, {"id": 282, "seek": 46264, "start": 1642.6399999999999, "end": 1652.72, "text": " graphics so this is runtime enhancement I suppose for game graphics so this uses machine learning", "tokens": [50364, 11837, 370, 341, 307, 34474, 40776, 286, 7297, 337, 1216, 11837, 370, 341, 4960, 3479, 2539, 50868], "temperature": 0.0, "avg_logprob": -0.06767426147347405, "compression_ratio": 1.5783783783783785, "no_speech_prob": 0.013522029854357243}, {"id": 283, "seek": 46264, "start": 1652.72, "end": 1660.24, "text": " based upscaling and it applies generative AI to add a whole bunch of just really nice looking", "tokens": [50868, 2361, 493, 4417, 4270, 293, 309, 13165, 1337, 1166, 7318, 281, 909, 257, 1379, 3840, 295, 445, 534, 1481, 1237, 51244], "temperature": 0.0, "avg_logprob": -0.06767426147347405, "compression_ratio": 1.5783783783783785, "no_speech_prob": 0.013522029854357243}, {"id": 284, "seek": 46264, "start": 1660.24, "end": 1666.0, "text": " graphics so if you look they have various examples so you probably want to just see it to understand", "tokens": [51244, 11837, 370, 498, 291, 574, 436, 362, 3683, 5110, 370, 291, 1391, 528, 281, 445, 536, 309, 281, 1223, 51532], "temperature": 0.0, "avg_logprob": -0.06767426147347405, "compression_ratio": 1.5783783783783785, "no_speech_prob": 0.013522029854357243}, {"id": 285, "seek": 48600, "start": 1666.48, "end": 1674.08, "text": " it basically you can go to older games like other schools oblivion for instance and get really", "tokens": [50388, 309, 1936, 291, 393, 352, 281, 4906, 2813, 411, 661, 4656, 47039, 313, 337, 5197, 293, 483, 534, 50768], "temperature": 0.0, "avg_logprob": -0.17669732846430877, "compression_ratio": 1.5238095238095237, "no_speech_prob": 0.012103604152798653}, {"id": 286, "seek": 48600, "start": 1674.08, "end": 1681.3600000000001, "text": " cutting edge seeming graphics with this turn down now the reaction to has been very much split", "tokens": [50768, 6492, 4691, 1643, 278, 11837, 365, 341, 1261, 760, 586, 264, 5480, 281, 575, 668, 588, 709, 7472, 51132], "temperature": 0.0, "avg_logprob": -0.17669732846430877, "compression_ratio": 1.5238095238095237, "no_speech_prob": 0.012103604152798653}, {"id": 287, "seek": 48600, "start": 1681.3600000000001, "end": 1688.6399999999999, "text": " from what I've seen people have often sculpted it and we're like oh this is AI slop this filter is", "tokens": [51132, 490, 437, 286, 600, 1612, 561, 362, 2049, 12613, 292, 309, 293, 321, 434, 411, 1954, 341, 307, 7318, 21254, 341, 6608, 307, 51496], "temperature": 0.0, "avg_logprob": -0.17669732846430877, "compression_ratio": 1.5238095238095237, "no_speech_prob": 0.012103604152798653}, {"id": 288, "seek": 50864, "start": 1689.2, "end": 1697.76, "text": " is bad in some cases it seems to go against the style of the game somewhat so Nvidia did also", "tokens": [50392, 307, 1578, 294, 512, 3331, 309, 2544, 281, 352, 1970, 264, 3758, 295, 264, 1216, 8344, 370, 46284, 630, 611, 50820], "temperature": 0.0, "avg_logprob": -0.10590946650694287, "compression_ratio": 1.6067415730337078, "no_speech_prob": 0.004220383241772652}, {"id": 289, "seek": 50864, "start": 1697.76, "end": 1704.32, "text": " emphasize that this is fully controllable by the developer the developer can set the settings", "tokens": [50820, 16078, 300, 341, 307, 4498, 45159, 712, 538, 264, 10754, 264, 10754, 393, 992, 264, 6257, 51148], "temperature": 0.0, "avg_logprob": -0.10590946650694287, "compression_ratio": 1.6067415730337078, "no_speech_prob": 0.004220383241772652}, {"id": 290, "seek": 50864, "start": 1704.32, "end": 1712.4, "text": " of how aggressive it is you know how much it impacts various aspects of the rendering yeah I think", "tokens": [51148, 295, 577, 10762, 309, 307, 291, 458, 577, 709, 309, 11606, 3683, 7270, 295, 264, 22407, 1338, 286, 519, 51552], "temperature": 0.0, "avg_logprob": -0.10590946650694287, "compression_ratio": 1.6067415730337078, "no_speech_prob": 0.004220383241772652}, {"id": 291, "seek": 53240, "start": 1712.4, "end": 1719.12, "text": " this is the kind of thing that it has given developers presumably in the past this is dlss 5", "tokens": [50364, 341, 307, 264, 733, 295, 551, 300, 309, 575, 2212, 8849, 26742, 294, 264, 1791, 341, 307, 274, 75, 3810, 1025, 50700], "temperature": 0.0, "avg_logprob": -0.1313845049916652, "compression_ratio": 1.7735849056603774, "no_speech_prob": 0.004048927687108517}, {"id": 292, "seek": 53240, "start": 1719.76, "end": 1725.28, "text": " but with the generative AI aspect of it it's getting a lot more discussion and it looks", "tokens": [50732, 457, 365, 264, 1337, 1166, 7318, 4171, 295, 309, 309, 311, 1242, 257, 688, 544, 5017, 293, 309, 1542, 51008], "temperature": 0.0, "avg_logprob": -0.1313845049916652, "compression_ratio": 1.7735849056603774, "no_speech_prob": 0.004048927687108517}, {"id": 293, "seek": 53240, "start": 1725.8400000000001, "end": 1730.48, "text": " potentially very very significant well so I was going to ask you I mean this is the part of the", "tokens": [51036, 7263, 588, 588, 4776, 731, 370, 286, 390, 516, 281, 1029, 291, 286, 914, 341, 307, 264, 644, 295, 264, 51268], "temperature": 0.0, "avg_logprob": -0.1313845049916652, "compression_ratio": 1.7735849056603774, "no_speech_prob": 0.004048927687108517}, {"id": 294, "seek": 53240, "start": 1730.48, "end": 1735.3600000000001, "text": " show where I ask you for your opinion as a guy at astrocade like is this something that you guys", "tokens": [51268, 855, 689, 286, 1029, 291, 337, 428, 4800, 382, 257, 2146, 412, 5357, 340, 30340, 411, 307, 341, 746, 300, 291, 1074, 51512], "temperature": 0.0, "avg_logprob": -0.1313845049916652, "compression_ratio": 1.7735849056603774, "no_speech_prob": 0.004048927687108517}, {"id": 295, "seek": 53240, "start": 1735.3600000000001, "end": 1739.6, "text": " would integrate in your stack like what how do you how do you think about new tools like that or", "tokens": [51512, 576, 13365, 294, 428, 8630, 411, 437, 577, 360, 291, 577, 360, 291, 519, 466, 777, 3873, 411, 300, 420, 51724], "temperature": 0.0, "avg_logprob": -0.1313845049916652, "compression_ratio": 1.7735849056603774, "no_speech_prob": 0.004048927687108517}, {"id": 296, "seek": 55960, "start": 1739.6, "end": 1746.24, "text": " is that even part of your your workflow yeah so this this is kind of targeting that 3d", "tokens": [50364, 307, 300, 754, 644, 295, 428, 428, 20993, 1338, 370, 341, 341, 307, 733, 295, 17918, 300, 805, 67, 50696], "temperature": 0.0, "avg_logprob": -0.1546308206660407, "compression_ratio": 1.784688995215311, "no_speech_prob": 0.010368079878389835}, {"id": 297, "seek": 55960, "start": 1746.24, "end": 1754.0, "text": " triple a big game kind of market right so this is dealing with new games right then and more", "tokens": [50696, 15508, 257, 955, 1216, 733, 295, 2142, 558, 370, 341, 307, 6260, 365, 777, 2813, 558, 550, 293, 544, 51084], "temperature": 0.0, "avg_logprob": -0.1546308206660407, "compression_ratio": 1.784688995215311, "no_speech_prob": 0.010368079878389835}, {"id": 298, "seek": 55960, "start": 1754.0, "end": 1761.6, "text": " complex you wouldn't run this on your phone like for casual games but for games that have complex", "tokens": [51084, 3997, 291, 2759, 380, 1190, 341, 322, 428, 2593, 411, 337, 13052, 2813, 457, 337, 2813, 300, 362, 3997, 51464], "temperature": 0.0, "avg_logprob": -0.1546308206660407, "compression_ratio": 1.784688995215311, "no_speech_prob": 0.010368079878389835}, {"id": 299, "seek": 55960, "start": 1761.6, "end": 1768.88, "text": " character models with faces and stuff like that or like open world games where you traverse big", "tokens": [51464, 2517, 5245, 365, 8475, 293, 1507, 411, 300, 420, 411, 1269, 1002, 2813, 689, 291, 45674, 955, 51828], "temperature": 0.0, "avg_logprob": -0.1546308206660407, "compression_ratio": 1.784688995215311, "no_speech_prob": 0.010368079878389835}, {"id": 300, "seek": 58888, "start": 1768.88, "end": 1778.88, "text": " landscapes that's where the kind of photorealistic pass really makes a big difference and last up we", "tokens": [50364, 29822, 300, 311, 689, 264, 733, 295, 2409, 418, 304, 3142, 1320, 534, 1669, 257, 955, 2649, 293, 1036, 493, 321, 50864], "temperature": 0.0, "avg_logprob": -0.15786734360914964, "compression_ratio": 1.5934065934065933, "no_speech_prob": 0.002731841756030917}, {"id": 301, "seek": 58888, "start": 1778.88, "end": 1787.6, "text": " have an update on open areas plan to launch strategy parties adult mode this has been announced", "tokens": [50864, 362, 364, 5623, 322, 1269, 3179, 1393, 281, 4025, 5206, 8265, 5075, 4391, 341, 575, 668, 7548, 51300], "temperature": 0.0, "avg_logprob": -0.15786734360914964, "compression_ratio": 1.5934065934065933, "no_speech_prob": 0.002731841756030917}, {"id": 302, "seek": 58888, "start": 1787.6, "end": 1794.24, "text": " I think last year as I think they are thinking of doing it has been delayed from the original", "tokens": [51300, 286, 519, 1036, 1064, 382, 286, 519, 436, 366, 1953, 295, 884, 309, 575, 668, 20268, 490, 264, 3380, 51632], "temperature": 0.0, "avg_logprob": -0.15786734360914964, "compression_ratio": 1.5934065934065933, "no_speech_prob": 0.002731841756030917}, {"id": 303, "seek": 61424, "start": 1794.24, "end": 1803.92, "text": " late March target and it seems that they're still aiming to do it the news here has been that", "tokens": [50364, 3469, 6129, 3779, 293, 309, 2544, 300, 436, 434, 920, 20253, 281, 360, 309, 264, 2583, 510, 575, 668, 300, 50848], "temperature": 0.0, "avg_logprob": -0.1753041832173457, "compression_ratio": 1.5343915343915344, "no_speech_prob": 0.02229226939380169}, {"id": 304, "seek": 61424, "start": 1803.92, "end": 1811.52, "text": " the team within open AI their advisory council of psychology and neuroscience have opposed it at", "tokens": [50848, 264, 1469, 1951, 1269, 7318, 641, 26289, 9209, 295, 15105, 293, 42762, 362, 8851, 309, 412, 51228], "temperature": 0.0, "avg_logprob": -0.1753041832173457, "compression_ratio": 1.5343915343915344, "no_speech_prob": 0.02229226939380169}, {"id": 305, "seek": 61424, "start": 1811.52, "end": 1819.92, "text": " a January meeting one advisor really warned about it significantly so anyway we have a quick update", "tokens": [51228, 257, 7061, 3440, 472, 19161, 534, 21284, 466, 309, 10591, 370, 4033, 321, 362, 257, 1702, 5623, 51648], "temperature": 0.0, "avg_logprob": -0.1753041832173457, "compression_ratio": 1.5343915343915344, "no_speech_prob": 0.02229226939380169}, {"id": 306, "seek": 63992, "start": 1819.92, "end": 1826.6399999999999, "text": " saying that they appear to still be planning it it has been delayed but as of now it's still", "tokens": [50364, 1566, 300, 436, 4204, 281, 920, 312, 5038, 309, 309, 575, 668, 20268, 457, 382, 295, 586, 309, 311, 920, 50700], "temperature": 0.0, "avg_logprob": -0.14406177970800507, "compression_ratio": 1.766355140186916, "no_speech_prob": 0.01805056445300579}, {"id": 307, "seek": 63992, "start": 1826.6399999999999, "end": 1833.1999999999998, "text": " going to be presumably released yeah well what a surprise that despite objections over the", "tokens": [50700, 516, 281, 312, 26742, 4736, 1338, 731, 437, 257, 6365, 300, 7228, 44649, 670, 264, 51028], "temperature": 0.0, "avg_logprob": -0.14406177970800507, "compression_ratio": 1.766355140186916, "no_speech_prob": 0.01805056445300579}, {"id": 308, "seek": 63992, "start": 1833.1999999999998, "end": 1838.56, "text": " appropriateness of a tool like this that they still went ahead huh that's that's weird that's", "tokens": [51028, 5745, 7186, 442, 295, 257, 2290, 411, 341, 300, 436, 920, 1437, 2286, 7020, 300, 311, 300, 311, 3657, 300, 311, 51296], "temperature": 0.0, "avg_logprob": -0.14406177970800507, "compression_ratio": 1.766355140186916, "no_speech_prob": 0.01805056445300579}, {"id": 309, "seek": 63992, "start": 1838.56, "end": 1844.88, "text": " not my company at all isn't yeah that's weird what a weird thing I know this is when you think about", "tokens": [51296, 406, 452, 2237, 412, 439, 1943, 380, 1338, 300, 311, 3657, 437, 257, 3657, 551, 286, 458, 341, 307, 562, 291, 519, 466, 51612], "temperature": 0.0, "avg_logprob": -0.14406177970800507, "compression_ratio": 1.766355140186916, "no_speech_prob": 0.01805056445300579}, {"id": 310, "seek": 66488, "start": 1845.04, "end": 1851.04, "text": " the things that were classically warned about just my opinion here personally but I seem to remember", "tokens": [50372, 264, 721, 300, 645, 1508, 984, 21284, 466, 445, 452, 4800, 510, 5665, 457, 286, 1643, 281, 1604, 50672], "temperature": 0.0, "avg_logprob": -0.11360027060772364, "compression_ratio": 1.7285714285714286, "no_speech_prob": 0.0019906864035874605}, {"id": 311, "seek": 66488, "start": 1851.04, "end": 1855.3600000000001, "text": " an awful lot of people warning us about the brave new world thing where you're hooked up to", "tokens": [50672, 364, 11232, 688, 295, 561, 9164, 505, 466, 264, 12653, 777, 1002, 551, 689, 291, 434, 20410, 493, 281, 50888], "temperature": 0.0, "avg_logprob": -0.11360027060772364, "compression_ratio": 1.7285714285714286, "no_speech_prob": 0.0019906864035874605}, {"id": 312, "seek": 66488, "start": 1855.3600000000001, "end": 1861.44, "text": " like the dopamine drip yeah I don't know ultra porn powered by like AI super intelligent like", "tokens": [50888, 411, 264, 37219, 29376, 1338, 286, 500, 380, 458, 14808, 19444, 17786, 538, 411, 7318, 1687, 13232, 411, 51192], "temperature": 0.0, "avg_logprob": -0.11360027060772364, "compression_ratio": 1.7285714285714286, "no_speech_prob": 0.0019906864035874605}, {"id": 313, "seek": 66488, "start": 1861.44, "end": 1866.6399999999999, "text": " I'm not sure how far you push the inference time compute budget before you get to that scenario but", "tokens": [51192, 286, 478, 406, 988, 577, 1400, 291, 2944, 264, 38253, 565, 14722, 4706, 949, 291, 483, 281, 300, 9005, 457, 51452], "temperature": 0.0, "avg_logprob": -0.11360027060772364, "compression_ratio": 1.7285714285714286, "no_speech_prob": 0.0019906864035874605}, {"id": 314, "seek": 66488, "start": 1867.3600000000001, "end": 1872.6399999999999, "text": " hey how about scaling laws for porn addiction how about that paper looking forward to that coming", "tokens": [51488, 4177, 577, 466, 21589, 6064, 337, 19444, 16835, 577, 466, 300, 3035, 1237, 2128, 281, 300, 1348, 51752], "temperature": 0.0, "avg_logprob": -0.11360027060772364, "compression_ratio": 1.7285714285714286, "no_speech_prob": 0.0019906864035874605}, {"id": 315, "seek": 69264, "start": 1872.6399999999999, "end": 1876.32, "text": " out I mean look that that's the kind of thing we're gonna have to see there's gonna have to be", "tokens": [50364, 484, 286, 914, 574, 300, 300, 311, 264, 733, 295, 551, 321, 434, 799, 362, 281, 536, 456, 311, 799, 362, 281, 312, 50548], "temperature": 0.0, "avg_logprob": -0.0895912363573357, "compression_ratio": 1.8471337579617835, "no_speech_prob": 0.0018159901956096292}, {"id": 316, "seek": 69264, "start": 1876.32, "end": 1880.72, "text": " studies on scaling laws for porn addiction just calling a spade a spade like I don't see how we", "tokens": [50548, 5313, 322, 21589, 6064, 337, 19444, 16835, 445, 5141, 257, 637, 762, 257, 637, 762, 411, 286, 500, 380, 536, 577, 321, 50768], "temperature": 0.0, "avg_logprob": -0.0895912363573357, "compression_ratio": 1.8471337579617835, "no_speech_prob": 0.0018159901956096292}, {"id": 317, "seek": 69264, "start": 1880.72, "end": 1885.12, "text": " avoid that it's interesting that this is this is an offering someone's gonna do it right the big", "tokens": [50768, 5042, 300, 309, 311, 1880, 300, 341, 307, 341, 307, 364, 8745, 1580, 311, 799, 360, 309, 558, 264, 955, 50988], "temperature": 0.0, "avg_logprob": -0.0895912363573357, "compression_ratio": 1.8471337579617835, "no_speech_prob": 0.0018159901956096292}, {"id": 318, "seek": 69264, "start": 1885.12, "end": 1890.24, "text": " porn companies at some point whatever so you could argue and I'm sure this is part of the ethical", "tokens": [50988, 19444, 3431, 412, 512, 935, 2035, 370, 291, 727, 9695, 293, 286, 478, 988, 341, 307, 644, 295, 264, 18890, 51244], "temperature": 0.0, "avg_logprob": -0.0895912363573357, "compression_ratio": 1.8471337579617835, "no_speech_prob": 0.0018159901956096292}, {"id": 319, "seek": 69264, "start": 1890.24, "end": 1894.6399999999999, "text": " argument that opening I might make here is look this way we can monitor the use of these things", "tokens": [51244, 6770, 300, 5193, 286, 1062, 652, 510, 307, 574, 341, 636, 321, 393, 6002, 264, 764, 295, 613, 721, 51464], "temperature": 0.0, "avg_logprob": -0.0895912363573357, "compression_ratio": 1.8471337579617835, "no_speech_prob": 0.0018159901956096292}, {"id": 320, "seek": 69264, "start": 1894.6399999999999, "end": 1899.6799999999998, "text": " ahead of time understand how people adjust to this technology maybe try to mitigate risks ahead of", "tokens": [51464, 2286, 295, 565, 1223, 577, 561, 4369, 281, 341, 2899, 1310, 853, 281, 27336, 10888, 2286, 295, 51716], "temperature": 0.0, "avg_logprob": -0.0895912363573357, "compression_ratio": 1.8471337579617835, "no_speech_prob": 0.0018159901956096292}, {"id": 321, "seek": 71968, "start": 1899.68, "end": 1904.24, "text": " time whereas porn companies probably wouldn't do the same thing sort of a similar argument to what", "tokens": [50364, 565, 9735, 19444, 3431, 1391, 2759, 380, 360, 264, 912, 551, 1333, 295, 257, 2531, 6770, 281, 437, 50592], "temperature": 0.0, "avg_logprob": -0.07915159478557832, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.0009386809542775154}, {"id": 322, "seek": 71968, "start": 1904.24, "end": 1908.24, "text": " Sam's been making historically right we want to like get these things out there because it's", "tokens": [50592, 4832, 311, 668, 1455, 16180, 558, 321, 528, 281, 411, 483, 613, 721, 484, 456, 570, 309, 311, 50792], "temperature": 0.0, "avg_logprob": -0.07915159478557832, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.0009386809542775154}, {"id": 323, "seek": 71968, "start": 1908.24, "end": 1912.56, "text": " gonna happen anyway so we want to get that feedback and be able to iterate on it which there's merit", "tokens": [50792, 799, 1051, 4033, 370, 321, 528, 281, 483, 300, 5824, 293, 312, 1075, 281, 44497, 322, 309, 597, 456, 311, 24527, 51008], "temperature": 0.0, "avg_logprob": -0.07915159478557832, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.0009386809542775154}, {"id": 324, "seek": 71968, "start": 1912.56, "end": 1917.04, "text": " to but yeah we should be under no illusions that we're crossing the Rubicon here and I think given", "tokens": [51008, 281, 457, 1338, 321, 820, 312, 833, 572, 49836, 300, 321, 434, 14712, 264, 10518, 11911, 510, 293, 286, 519, 2212, 51232], "temperature": 0.0, "avg_logprob": -0.07915159478557832, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.0009386809542775154}, {"id": 325, "seek": 71968, "start": 1917.04, "end": 1922.96, "text": " that open AI is taking this step it is on them then to come out with unbiased research that", "tokens": [51232, 300, 1269, 7318, 307, 1940, 341, 1823, 309, 307, 322, 552, 550, 281, 808, 484, 365, 517, 5614, 1937, 2132, 300, 51528], "temperature": 0.0, "avg_logprob": -0.07915159478557832, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.0009386809542775154}, {"id": 326, "seek": 71968, "start": 1922.96, "end": 1926.88, "text": " transparently looks at the effects of exactly what they're putting out there right I think that's", "tokens": [51528, 7132, 6420, 1542, 412, 264, 5065, 295, 2293, 437, 436, 434, 3372, 484, 456, 558, 286, 519, 300, 311, 51724], "temperature": 0.0, "avg_logprob": -0.07915159478557832, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.0009386809542775154}, {"id": 327, "seek": 74688, "start": 1926.88, "end": 1932.08, "text": " just a reasonable thing I think any I'd be pretty sure it's like a tobacco company going out", "tokens": [50364, 445, 257, 10585, 551, 286, 519, 604, 286, 1116, 312, 1238, 988, 309, 311, 411, 257, 22994, 2237, 516, 484, 50624], "temperature": 0.0, "avg_logprob": -0.09545715428700968, "compression_ratio": 1.7347670250896057, "no_speech_prob": 0.00033229272230528295}, {"id": 328, "seek": 74688, "start": 1932.08, "end": 1937.92, "text": " and doing something when you're playing so close to the the base of the human brainstem you're like", "tokens": [50624, 293, 884, 746, 562, 291, 434, 2433, 370, 1998, 281, 264, 264, 3096, 295, 264, 1952, 3567, 1099, 291, 434, 411, 50916], "temperature": 0.0, "avg_logprob": -0.09545715428700968, "compression_ratio": 1.7347670250896057, "no_speech_prob": 0.00033229272230528295}, {"id": 329, "seek": 74688, "start": 1937.92, "end": 1942.56, "text": " you're in different territory and so anyway it we'll see if this persists we'll see what scandals", "tokens": [50916, 291, 434, 294, 819, 11360, 293, 370, 4033, 309, 321, 603, 536, 498, 341, 868, 1751, 321, 603, 536, 437, 40273, 1124, 51148], "temperature": 0.0, "avg_logprob": -0.09545715428700968, "compression_ratio": 1.7347670250896057, "no_speech_prob": 0.00033229272230528295}, {"id": 330, "seek": 74688, "start": 1942.56, "end": 1947.12, "text": " come of it I'm sure there will be some but yeah it's in some ways not surprising I don't want", "tokens": [51148, 808, 295, 309, 286, 478, 988, 456, 486, 312, 512, 457, 1338, 309, 311, 294, 512, 2098, 406, 8830, 286, 500, 380, 528, 51376], "temperature": 0.0, "avg_logprob": -0.09545715428700968, "compression_ratio": 1.7347670250896057, "no_speech_prob": 0.00033229272230528295}, {"id": 331, "seek": 74688, "start": 1947.12, "end": 1952.8, "text": " again rip on open AI too much for this because the argument is true that at some point you know the", "tokens": [51376, 797, 12782, 322, 1269, 7318, 886, 709, 337, 341, 570, 264, 6770, 307, 2074, 300, 412, 512, 935, 291, 458, 264, 51660], "temperature": 0.0, "avg_logprob": -0.09545715428700968, "compression_ratio": 1.7347670250896057, "no_speech_prob": 0.00033229272230528295}, {"id": 332, "seek": 77280, "start": 1952.88, "end": 1957.92, "text": " big porn companies will do this and they have a history of like mistreating women and you know", "tokens": [50368, 955, 19444, 3431, 486, 360, 341, 293, 436, 362, 257, 2503, 295, 411, 3544, 44613, 2266, 293, 291, 458, 50620], "temperature": 0.0, "avg_logprob": -0.11033635170836198, "compression_ratio": 1.6796536796536796, "no_speech_prob": 0.0012100674211978912}, {"id": 333, "seek": 77280, "start": 1957.92, "end": 1962.64, "text": " doing all kinds of awful things so ethically I get it but now the proof is gonna be in the pudding", "tokens": [50620, 884, 439, 3685, 295, 11232, 721, 370, 6468, 984, 286, 483, 309, 457, 586, 264, 8177, 307, 799, 312, 294, 264, 29149, 50856], "temperature": 0.0, "avg_logprob": -0.11033635170836198, "compression_ratio": 1.6796536796536796, "no_speech_prob": 0.0012100674211978912}, {"id": 334, "seek": 77280, "start": 1962.64, "end": 1968.56, "text": " because once you put that out there and brand-wise like geez I well so I think it's worth clarifying", "tokens": [50856, 570, 1564, 291, 829, 300, 484, 456, 293, 3360, 12, 3711, 411, 46108, 286, 731, 370, 286, 519, 309, 311, 3163, 6093, 5489, 51152], "temperature": 0.0, "avg_logprob": -0.11033635170836198, "compression_ratio": 1.6796536796536796, "no_speech_prob": 0.0012100674211978912}, {"id": 335, "seek": 77280, "start": 1968.56, "end": 1976.56, "text": " a little bit porn might be a bit strong of a term here right so they do clarify this will not", "tokens": [51152, 257, 707, 857, 19444, 1062, 312, 257, 857, 2068, 295, 257, 1433, 510, 558, 370, 436, 360, 17594, 341, 486, 406, 51552], "temperature": 0.0, "avg_logprob": -0.11033635170836198, "compression_ratio": 1.6796536796536796, "no_speech_prob": 0.0012100674211978912}, {"id": 336, "seek": 79656, "start": 1976.64, "end": 1984.8000000000002, "text": " generate erotic audio images or videos this is really about so they've banned erotica as a", "tokens": [50368, 8460, 1189, 9411, 6278, 5267, 420, 2145, 341, 307, 534, 466, 370, 436, 600, 19564, 1189, 310, 2262, 382, 257, 50776], "temperature": 0.0, "avg_logprob": -0.20280312050829877, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.016545310616493225}, {"id": 337, "seek": 79656, "start": 1984.8000000000002, "end": 1992.96, "text": " as a general category with the chatbot and so if you try to write sexy stories or have sexy role", "tokens": [50776, 382, 257, 2674, 7719, 365, 264, 5081, 18870, 293, 370, 498, 291, 853, 281, 2464, 13701, 3676, 420, 362, 13701, 3090, 51184], "temperature": 0.0, "avg_logprob": -0.20280312050829877, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.016545310616493225}, {"id": 338, "seek": 79656, "start": 1992.96, "end": 1999.44, "text": " play which you can do and plenty of other you know apps out there is a million AI girlfriends", "tokens": [51184, 862, 597, 291, 393, 360, 293, 7140, 295, 661, 291, 458, 7733, 484, 456, 307, 257, 2459, 7318, 46558, 51508], "temperature": 0.0, "avg_logprob": -0.20280312050829877, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.016545310616493225}, {"id": 339, "seek": 79656, "start": 1999.44, "end": 2006.24, "text": " including XAI is right so this is more of that this is like role playing with sexual kind of adults", "tokens": [51508, 3009, 1783, 48698, 307, 558, 370, 341, 307, 544, 295, 300, 341, 307, 411, 3090, 2433, 365, 6701, 733, 295, 8865, 51848], "temperature": 0.0, "avg_logprob": -0.20280312050829877, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.016545310616493225}, {"id": 340, "seek": 82624, "start": 2006.24, "end": 2014.1599999999999, "text": " only component to it not straight up like explicit content of the sort that you might think", "tokens": [50364, 787, 6542, 281, 309, 406, 2997, 493, 411, 13691, 2701, 295, 264, 1333, 300, 291, 1062, 519, 50760], "temperature": 0.0, "avg_logprob": -0.087874168090606, "compression_ratio": 1.755656108597285, "no_speech_prob": 0.0027143878396600485}, {"id": 341, "seek": 82624, "start": 2014.1599999999999, "end": 2021.2, "text": " when saying porn so in some ways yeah I think there's arguments to be made out of a way like people", "tokens": [50760, 562, 1566, 19444, 370, 294, 512, 2098, 1338, 286, 519, 456, 311, 12869, 281, 312, 1027, 484, 295, 257, 636, 411, 561, 51112], "temperature": 0.0, "avg_logprob": -0.087874168090606, "compression_ratio": 1.755656108597285, "no_speech_prob": 0.0027143878396600485}, {"id": 342, "seek": 82624, "start": 2021.2, "end": 2028.1599999999999, "text": " have tried to do this and tried to do this as is most likely with these models people do seem to", "tokens": [51112, 362, 3031, 281, 360, 341, 293, 3031, 281, 360, 341, 382, 307, 881, 3700, 365, 613, 5245, 561, 360, 1643, 281, 51460], "temperature": 0.0, "avg_logprob": -0.087874168090606, "compression_ratio": 1.755656108597285, "no_speech_prob": 0.0027143878396600485}, {"id": 343, "seek": 82624, "start": 2028.1599999999999, "end": 2035.04, "text": " want it there's very large adoption of these things and we already have plenty of stories of people", "tokens": [51460, 528, 309, 456, 311, 588, 2416, 19215, 295, 613, 721, 293, 321, 1217, 362, 7140, 295, 3676, 295, 561, 51804], "temperature": 0.0, "avg_logprob": -0.087874168090606, "compression_ratio": 1.755656108597285, "no_speech_prob": 0.0027143878396600485}, {"id": 344, "seek": 85504, "start": 2035.76, "end": 2043.52, "text": " getting AI girlfriends and having psychological impacts so yeah maybe actually if chat GPT and", "tokens": [50400, 1242, 7318, 46558, 293, 1419, 14346, 11606, 370, 1338, 1310, 767, 498, 5081, 26039, 51, 293, 50788], "temperature": 0.0, "avg_logprob": -0.1709654630923813, "compression_ratio": 1.6134453781512605, "no_speech_prob": 0.004556582309305668}, {"id": 345, "seek": 85504, "start": 2043.52, "end": 2050.64, "text": " open AI do this explicitly and do it right and do it well it's better than not doing it and having", "tokens": [50788, 1269, 7318, 360, 341, 20803, 293, 360, 309, 558, 293, 360, 309, 731, 309, 311, 1101, 813, 406, 884, 309, 293, 1419, 51144], "temperature": 0.0, "avg_logprob": -0.1709654630923813, "compression_ratio": 1.6134453781512605, "no_speech_prob": 0.004556582309305668}, {"id": 346, "seek": 85504, "start": 2050.64, "end": 2057.92, "text": " some shady dark market for it you know no absolutely and I take back the the use of the term porn", "tokens": [51144, 512, 41853, 2877, 2142, 337, 309, 291, 458, 572, 3122, 293, 286, 747, 646, 264, 264, 764, 295, 264, 1433, 19444, 51508], "temperature": 0.0, "avg_logprob": -0.1709654630923813, "compression_ratio": 1.6134453781512605, "no_speech_prob": 0.004556582309305668}, {"id": 347, "seek": 85504, "start": 2057.92, "end": 2062.56, "text": " here I guess I'm sort of seeing where the the puck is headed and anticipating that next play", "tokens": [51508, 510, 286, 2041, 286, 478, 1333, 295, 2577, 689, 264, 264, 47181, 307, 12798, 293, 40568, 300, 958, 862, 51740], "temperature": 0.0, "avg_logprob": -0.1709654630923813, "compression_ratio": 1.6134453781512605, "no_speech_prob": 0.004556582309305668}, {"id": 348, "seek": 88256, "start": 2062.56, "end": 2066.6400000000003, "text": " this is definitely a step where we need open AI to actually run the studies since they're going to", "tokens": [50364, 341, 307, 2138, 257, 1823, 689, 321, 643, 1269, 7318, 281, 767, 1190, 264, 5313, 1670, 436, 434, 516, 281, 50568], "temperature": 0.0, "avg_logprob": -0.08550087931965078, "compression_ratio": 1.8029850746268656, "no_speech_prob": 0.001477799960412085}, {"id": 349, "seek": 88256, "start": 2066.6400000000003, "end": 2070.88, "text": " have access to the data and no one else will right so like we we need people to look at this I hope", "tokens": [50568, 362, 2105, 281, 264, 1412, 293, 572, 472, 1646, 486, 558, 370, 411, 321, 321, 643, 561, 281, 574, 412, 341, 286, 1454, 50780], "temperature": 0.0, "avg_logprob": -0.08550087931965078, "compression_ratio": 1.8029850746268656, "no_speech_prob": 0.001477799960412085}, {"id": 350, "seek": 88256, "start": 2070.88, "end": 2075.36, "text": " that they will I would imagine this as part of the package here but from a branding standpoint this", "tokens": [50780, 300, 436, 486, 286, 576, 3811, 341, 382, 644, 295, 264, 7372, 510, 457, 490, 257, 27279, 15827, 341, 51004], "temperature": 0.0, "avg_logprob": -0.08550087931965078, "compression_ratio": 1.8029850746268656, "no_speech_prob": 0.001477799960412085}, {"id": 351, "seek": 88256, "start": 2075.36, "end": 2079.44, "text": " is really dicey especially at a time when they're trying to redouble we'll talk about this in a minute", "tokens": [51004, 307, 534, 10313, 88, 2318, 412, 257, 565, 562, 436, 434, 1382, 281, 2182, 33147, 321, 603, 751, 466, 341, 294, 257, 3456, 51208], "temperature": 0.0, "avg_logprob": -0.08550087931965078, "compression_ratio": 1.8029850746268656, "no_speech_prob": 0.001477799960412085}, {"id": 352, "seek": 88256, "start": 2079.44, "end": 2085.2, "text": " but on this whole sort of business productivity focus like that's going to be their big play so you're", "tokens": [51208, 457, 322, 341, 1379, 1333, 295, 1606, 15604, 1879, 411, 300, 311, 516, 281, 312, 641, 955, 862, 370, 291, 434, 51496], "temperature": 0.0, "avg_logprob": -0.08550087931965078, "compression_ratio": 1.8029850746268656, "no_speech_prob": 0.001477799960412085}, {"id": 353, "seek": 88256, "start": 2085.2, "end": 2089.28, "text": " kind of adding that into the mix it's interesting I mean what what this implies about the user base", "tokens": [51496, 733, 295, 5127, 300, 666, 264, 2890, 309, 311, 1880, 286, 914, 437, 437, 341, 18779, 466, 264, 4195, 3096, 51700], "temperature": 0.0, "avg_logprob": -0.08550087931965078, "compression_ratio": 1.8029850746268656, "no_speech_prob": 0.001477799960412085}, {"id": 354, "seek": 90928, "start": 2089.2799999999997, "end": 2095.76, "text": " that they're they're going after is is there ROI here I mean porn is a pretty low margin industry", "tokens": [50364, 300, 436, 434, 436, 434, 516, 934, 307, 307, 456, 49808, 510, 286, 914, 19444, 307, 257, 1238, 2295, 10270, 3518, 50688], "temperature": 0.0, "avg_logprob": -0.09753182899853415, "compression_ratio": 1.7619047619047619, "no_speech_prob": 0.0015171633567661047}, {"id": 355, "seek": 90928, "start": 2095.76, "end": 2100.96, "text": " as I understand it's so so this like I don't know if role play erotic role play would kind of", "tokens": [50688, 382, 286, 1223, 309, 311, 370, 370, 341, 411, 286, 500, 380, 458, 498, 3090, 862, 1189, 9411, 3090, 862, 576, 733, 295, 50948], "temperature": 0.0, "avg_logprob": -0.09753182899853415, "compression_ratio": 1.7619047619047619, "no_speech_prob": 0.0015171633567661047}, {"id": 356, "seek": 90928, "start": 2100.96, "end": 2105.76, "text": " be similar things flirting if you will with that but either way yeah we'll find out I think the", "tokens": [50948, 312, 2531, 721, 45777, 498, 291, 486, 365, 300, 457, 2139, 636, 1338, 321, 603, 915, 484, 286, 519, 264, 51188], "temperature": 0.0, "avg_logprob": -0.09753182899853415, "compression_ratio": 1.7619047619047619, "no_speech_prob": 0.0015171633567661047}, {"id": 357, "seek": 90928, "start": 2105.76, "end": 2111.36, "text": " argument that a lot of people want this is also kind of dicey I mean wow a lot of people want crack", "tokens": [51188, 6770, 300, 257, 688, 295, 561, 528, 341, 307, 611, 733, 295, 10313, 88, 286, 914, 6076, 257, 688, 295, 561, 528, 6226, 51468], "temperature": 0.0, "avg_logprob": -0.09753182899853415, "compression_ratio": 1.7619047619047619, "no_speech_prob": 0.0015171633567661047}, {"id": 358, "seek": 90928, "start": 2111.36, "end": 2116.56, "text": " cocaine like it's a so we got to find a way now everything's out of continue I'm being tongue", "tokens": [51468, 33933, 411, 309, 311, 257, 370, 321, 658, 281, 915, 257, 636, 586, 1203, 311, 484, 295, 2354, 286, 478, 885, 10601, 51728], "temperature": 0.0, "avg_logprob": -0.09753182899853415, "compression_ratio": 1.7619047619047619, "no_speech_prob": 0.0015171633567661047}, {"id": 359, "seek": 93656, "start": 2116.56, "end": 2121.92, "text": " in cheek here obviously but we have infinite inference inference time compute budgets coming online", "tokens": [50364, 294, 12839, 510, 2745, 457, 321, 362, 13785, 38253, 38253, 565, 14722, 26708, 1348, 2950, 50632], "temperature": 0.0, "avg_logprob": -0.10311119061182528, "compression_ratio": 1.7661870503597121, "no_speech_prob": 0.013138500042259693}, {"id": 360, "seek": 93656, "start": 2121.92, "end": 2126.0, "text": " in the next you know three to five years or whatever at what point do you just have so much inference", "tokens": [50632, 294, 264, 958, 291, 458, 1045, 281, 1732, 924, 420, 2035, 412, 437, 935, 360, 291, 445, 362, 370, 709, 38253, 50836], "temperature": 0.0, "avg_logprob": -0.10311119061182528, "compression_ratio": 1.7661870503597121, "no_speech_prob": 0.013138500042259693}, {"id": 361, "seek": 93656, "start": 2126.0, "end": 2131.44, "text": " time computing dedicated to your your limbic system that for all intents and purposes like you're", "tokens": [50836, 565, 15866, 8374, 281, 428, 428, 30390, 299, 1185, 300, 337, 439, 560, 791, 293, 9932, 411, 291, 434, 51108], "temperature": 0.0, "avg_logprob": -0.10311119061182528, "compression_ratio": 1.7661870503597121, "no_speech_prob": 0.013138500042259693}, {"id": 362, "seek": 93656, "start": 2131.44, "end": 2135.84, "text": " robbed of your agency I think we're actually going to be closer to asking ourselves that question", "tokens": [51108, 35772, 295, 428, 7934, 286, 519, 321, 434, 767, 516, 281, 312, 4966, 281, 3365, 4175, 300, 1168, 51328], "temperature": 0.0, "avg_logprob": -0.10311119061182528, "compression_ratio": 1.7661870503597121, "no_speech_prob": 0.013138500042259693}, {"id": 363, "seek": 93656, "start": 2135.84, "end": 2141.12, "text": " sooner rather than later than most people are pricing it right and then just to reiterate the", "tokens": [51328, 15324, 2831, 813, 1780, 813, 881, 561, 366, 17621, 309, 558, 293, 550, 445, 281, 33528, 264, 51592], "temperature": 0.0, "avg_logprob": -0.10311119061182528, "compression_ratio": 1.7661870503597121, "no_speech_prob": 0.013138500042259693}, {"id": 364, "seek": 96112, "start": 2141.12, "end": 2148.64, "text": " actual news here is just that there have been these details coming out of the internal objections", "tokens": [50364, 3539, 2583, 510, 307, 445, 300, 456, 362, 668, 613, 4365, 1348, 484, 295, 264, 6920, 44649, 50740], "temperature": 0.0, "avg_logprob": -0.11740480301280816, "compression_ratio": 1.7511520737327189, "no_speech_prob": 0.007451652083545923}, {"id": 365, "seek": 96112, "start": 2148.64, "end": 2156.8, "text": " from certain parts of open AI and these objections have been there since January and the company", "tokens": [50740, 490, 1629, 3166, 295, 1269, 7318, 293, 613, 44649, 362, 668, 456, 1670, 7061, 293, 264, 2237, 51148], "temperature": 0.0, "avg_logprob": -0.11740480301280816, "compression_ratio": 1.7511520737327189, "no_speech_prob": 0.007451652083545923}, {"id": 366, "seek": 96112, "start": 2156.8, "end": 2164.24, "text": " hasn't canceled this feature features it out they could still not do it and it has been delayed", "tokens": [51148, 6132, 380, 24839, 341, 4111, 4122, 309, 484, 436, 727, 920, 406, 360, 309, 293, 309, 575, 668, 20268, 51520], "temperature": 0.0, "avg_logprob": -0.11740480301280816, "compression_ratio": 1.7511520737327189, "no_speech_prob": 0.007451652083545923}, {"id": 367, "seek": 96112, "start": 2164.24, "end": 2169.04, "text": " so it's still on track and it'll be interesting to see if they do go through it after all", "tokens": [51520, 370, 309, 311, 920, 322, 2837, 293, 309, 603, 312, 1880, 281, 536, 498, 436, 360, 352, 807, 309, 934, 439, 51760], "temperature": 0.0, "avg_logprob": -0.11740480301280816, "compression_ratio": 1.7511520737327189, "no_speech_prob": 0.007451652083545923}, {"id": 368, "seek": 98904, "start": 2169.76, "end": 2174.16, "text": " yeah I know that's true and actually like a lot of my reaction is reactionary I'm like", "tokens": [50400, 1338, 286, 458, 300, 311, 2074, 293, 767, 411, 257, 688, 295, 452, 5480, 307, 5480, 822, 286, 478, 411, 50620], "temperature": 0.0, "avg_logprob": -0.08058271076944139, "compression_ratio": 1.9166666666666667, "no_speech_prob": 0.01402556523680687}, {"id": 369, "seek": 98904, "start": 2174.16, "end": 2179.12, "text": " this this direction generally is definitely coming but it makes me nervous as hell and that's kind", "tokens": [50620, 341, 341, 3513, 5101, 307, 2138, 1348, 457, 309, 1669, 385, 6296, 382, 4921, 293, 300, 311, 733, 50868], "temperature": 0.0, "avg_logprob": -0.08058271076944139, "compression_ratio": 1.9166666666666667, "no_speech_prob": 0.01402556523680687}, {"id": 370, "seek": 98904, "start": 2179.12, "end": 2183.6800000000003, "text": " of that's sort of my my response to this is you know we're just sort of seeing a lot of this is", "tokens": [50868, 295, 300, 311, 1333, 295, 452, 452, 4134, 281, 341, 307, 291, 458, 321, 434, 445, 1333, 295, 2577, 257, 688, 295, 341, 307, 51096], "temperature": 0.0, "avg_logprob": -0.08058271076944139, "compression_ratio": 1.9166666666666667, "no_speech_prob": 0.01402556523680687}, {"id": 371, "seek": 98904, "start": 2183.6800000000003, "end": 2187.76, "text": " testing the waters too I'm not saying this particular story was leaked but you see governments do", "tokens": [51096, 4997, 264, 12975, 886, 286, 478, 406, 1566, 341, 1729, 1657, 390, 31779, 457, 291, 536, 11280, 360, 51300], "temperature": 0.0, "avg_logprob": -0.08058271076944139, "compression_ratio": 1.9166666666666667, "no_speech_prob": 0.01402556523680687}, {"id": 372, "seek": 98904, "start": 2187.76, "end": 2192.4, "text": " this you see private companies do this there's like leaks of stuff to go out to just gauge audience", "tokens": [51300, 341, 291, 536, 4551, 3431, 360, 341, 456, 311, 411, 28885, 295, 1507, 281, 352, 484, 281, 445, 17924, 4034, 51532], "temperature": 0.0, "avg_logprob": -0.08058271076944139, "compression_ratio": 1.9166666666666667, "no_speech_prob": 0.01402556523680687}, {"id": 373, "seek": 98904, "start": 2192.4, "end": 2197.52, "text": " kind of consumer response to things can we take that step can we not I don't know again if this", "tokens": [51532, 733, 295, 9711, 4134, 281, 721, 393, 321, 747, 300, 1823, 393, 321, 406, 286, 500, 380, 458, 797, 498, 341, 51788], "temperature": 0.0, "avg_logprob": -0.08058271076944139, "compression_ratio": 1.9166666666666667, "no_speech_prob": 0.01402556523680687}, {"id": 374, "seek": 101752, "start": 2197.68, "end": 2203.04, "text": " is or is not the case here but but man I mean that you'll at some point need some kind of immune", "tokens": [50372, 307, 420, 307, 406, 264, 1389, 510, 457, 457, 587, 286, 914, 300, 291, 603, 412, 512, 935, 643, 512, 733, 295, 11992, 50640], "temperature": 0.0, "avg_logprob": -0.1089898845728706, "compression_ratio": 1.6858407079646018, "no_speech_prob": 0.0008937243255786598}, {"id": 375, "seek": 101752, "start": 2203.04, "end": 2208.8, "text": " response to things that push in this general direction and on to applications and business", "tokens": [50640, 4134, 281, 721, 300, 2944, 294, 341, 2674, 3513, 293, 322, 281, 5821, 293, 1606, 50928], "temperature": 0.0, "avg_logprob": -0.1089898845728706, "compression_ratio": 1.6858407079646018, "no_speech_prob": 0.0008937243255786598}, {"id": 376, "seek": 101752, "start": 2208.8, "end": 2215.44, "text": " sticking with open AI and coming back to a topic we briefly touched on another thing that came", "tokens": [50928, 13465, 365, 1269, 7318, 293, 1348, 646, 281, 257, 4829, 321, 10515, 9828, 322, 1071, 551, 300, 1361, 51260], "temperature": 0.0, "avg_logprob": -0.1089898845728706, "compression_ratio": 1.6858407079646018, "no_speech_prob": 0.0008937243255786598}, {"id": 377, "seek": 101752, "start": 2215.44, "end": 2223.76, "text": " out from open AI this week is a discussion that happened at an all hands where effectively open AI", "tokens": [51260, 484, 490, 1269, 7318, 341, 1243, 307, 257, 5017, 300, 2011, 412, 364, 439, 2377, 689, 8659, 1269, 7318, 51676], "temperature": 0.0, "avg_logprob": -0.1089898845728706, "compression_ratio": 1.6858407079646018, "no_speech_prob": 0.0008937243255786598}, {"id": 378, "seek": 104376, "start": 2223.76, "end": 2230.7200000000003, "text": " has signaled a strategic shift away from doing everything doing a little bit of everything and", "tokens": [50364, 575, 1465, 5573, 257, 10924, 5513, 1314, 490, 884, 1203, 884, 257, 707, 857, 295, 1203, 293, 50712], "temperature": 0.0, "avg_logprob": -0.11257549979049584, "compression_ratio": 1.6193181818181819, "no_speech_prob": 0.0004769949009642005}, {"id": 379, "seek": 104376, "start": 2230.7200000000003, "end": 2238.48, "text": " focusing more on productivity and business so you know open AI has historically had a million", "tokens": [50712, 8416, 544, 322, 15604, 293, 1606, 370, 291, 458, 1269, 7318, 575, 16180, 632, 257, 2459, 51100], "temperature": 0.0, "avg_logprob": -0.11257549979049584, "compression_ratio": 1.6193181818181819, "no_speech_prob": 0.0004769949009642005}, {"id": 380, "seek": 104376, "start": 2238.48, "end": 2244.24, "text": " side projects this I think is how they characterize it side projects they have video models they", "tokens": [51100, 1252, 4455, 341, 286, 519, 307, 577, 436, 38463, 309, 1252, 4455, 436, 362, 960, 5245, 436, 51388], "temperature": 0.0, "avg_logprob": -0.11257549979049584, "compression_ratio": 1.6193181818181819, "no_speech_prob": 0.0004769949009642005}, {"id": 381, "seek": 106424, "start": 2244.24, "end": 2252.16, "text": " release the soar app they have a browser they have audio models I believe transcription like", "tokens": [50364, 4374, 264, 370, 289, 724, 436, 362, 257, 11185, 436, 362, 6278, 5245, 286, 1697, 35288, 411, 50760], "temperature": 0.0, "avg_logprob": -0.2574842558665709, "compression_ratio": 1.6568047337278107, "no_speech_prob": 0.0021160084288567305}, {"id": 382, "seek": 106424, "start": 2252.7200000000003, "end": 2261.12, "text": " a million things on fabric has clawed and that's it open AI has sorah and their audio models and", "tokens": [50788, 257, 2459, 721, 322, 7253, 575, 32019, 292, 293, 300, 311, 309, 1269, 7318, 575, 370, 15688, 293, 641, 6278, 5245, 293, 51208], "temperature": 0.0, "avg_logprob": -0.2574842558665709, "compression_ratio": 1.6568047337278107, "no_speech_prob": 0.0021160084288567305}, {"id": 383, "seek": 106424, "start": 2261.12, "end": 2269.76, "text": " and so on and so on so it seems to be that internally within the company they are changing", "tokens": [51208, 293, 370, 322, 293, 370, 322, 370, 309, 2544, 281, 312, 300, 19501, 1951, 264, 2237, 436, 366, 4473, 51640], "temperature": 0.0, "avg_logprob": -0.2574842558665709, "compression_ratio": 1.6568047337278107, "no_speech_prob": 0.0021160084288567305}, {"id": 384, "seek": 108976, "start": 2269.76, "end": 2276.6400000000003, "text": " focus the head of applications Fiji Simo has told staff that they cannot miss this moment because", "tokens": [50364, 1879, 264, 1378, 295, 5821, 479, 26539, 318, 6934, 575, 1907, 3525, 300, 436, 2644, 1713, 341, 1623, 570, 50708], "temperature": 0.0, "avg_logprob": -0.1761720676061719, "compression_ratio": 1.6266094420600858, "no_speech_prob": 0.009287931956350803}, {"id": 385, "seek": 108976, "start": 2276.6400000000003, "end": 2283.6, "text": " they are distracted by side quests and they need to nail productivity and you've seen this already", "tokens": [50708, 436, 366, 21658, 538, 1252, 34247, 293, 436, 643, 281, 10173, 15604, 293, 291, 600, 1612, 341, 1217, 51056], "temperature": 0.0, "avg_logprob": -0.1761720676061719, "compression_ratio": 1.6266094420600858, "no_speech_prob": 0.009287931956350803}, {"id": 386, "seek": 108976, "start": 2283.6, "end": 2288.56, "text": " for the last several months it has been apparent that they have been really pushing on codex", "tokens": [51056, 337, 264, 1036, 2940, 2493, 309, 575, 668, 18335, 300, 436, 362, 668, 534, 7380, 322, 3089, 87, 51304], "temperature": 0.0, "avg_logprob": -0.1761720676061719, "compression_ratio": 1.6266094420600858, "no_speech_prob": 0.009287931956350803}, {"id": 387, "seek": 108976, "start": 2288.56, "end": 2294.08, "text": " to catch up with cloud and and co-work as well and they largely have at least in terms of", "tokens": [51304, 281, 3745, 493, 365, 4588, 293, 293, 598, 12, 1902, 382, 731, 293, 436, 11611, 362, 412, 1935, 294, 2115, 295, 51580], "temperature": 0.0, "avg_logprob": -0.1761720676061719, "compression_ratio": 1.6266094420600858, "no_speech_prob": 0.009287931956350803}, {"id": 388, "seek": 111408, "start": 2294.64, "end": 2301.2799999999997, "text": " feature set in terms of adoption I mean many people are starting to use codex some people prefer", "tokens": [50392, 4111, 992, 294, 2115, 295, 19215, 286, 914, 867, 561, 366, 2891, 281, 764, 3089, 87, 512, 561, 4382, 50724], "temperature": 0.0, "avg_logprob": -0.09185419471136161, "compression_ratio": 1.7363636363636363, "no_speech_prob": 0.004530020523816347}, {"id": 389, "seek": 111408, "start": 2301.2799999999997, "end": 2308.3199999999997, "text": " codex at this point but it is obviously true that this hasn't been open AI is primarily focused", "tokens": [50724, 3089, 87, 412, 341, 935, 457, 309, 307, 2745, 2074, 300, 341, 6132, 380, 668, 1269, 7318, 307, 10029, 5178, 51076], "temperature": 0.0, "avg_logprob": -0.09185419471136161, "compression_ratio": 1.7363636363636363, "no_speech_prob": 0.004530020523816347}, {"id": 390, "seek": 111408, "start": 2308.3199999999997, "end": 2314.64, "text": " until recently and they have kind of been behind the cloud and in terms of adoption they are", "tokens": [51076, 1826, 3938, 293, 436, 362, 733, 295, 668, 2261, 264, 4588, 293, 294, 2115, 295, 19215, 436, 366, 51392], "temperature": 0.0, "avg_logprob": -0.09185419471136161, "compression_ratio": 1.7363636363636363, "no_speech_prob": 0.004530020523816347}, {"id": 391, "seek": 111408, "start": 2314.64, "end": 2322.3199999999997, "text": " still behind because cloud code is the first early leader in the category so yeah interesting to", "tokens": [51392, 920, 2261, 570, 4588, 3089, 307, 264, 700, 2440, 5263, 294, 264, 7719, 370, 1338, 1880, 281, 51776], "temperature": 0.0, "avg_logprob": -0.09185419471136161, "compression_ratio": 1.7363636363636363, "no_speech_prob": 0.004530020523816347}, {"id": 392, "seek": 114232, "start": 2322.3199999999997, "end": 2328.48, "text": " see open AI having that internal discussion now I feel like this has been a problem with open AI", "tokens": [50364, 536, 1269, 7318, 1419, 300, 6920, 5017, 586, 286, 841, 411, 341, 575, 668, 257, 1154, 365, 1269, 7318, 50672], "temperature": 0.0, "avg_logprob": -0.10991791832721097, "compression_ratio": 1.6828193832599119, "no_speech_prob": 0.004033009521663189}, {"id": 393, "seek": 114232, "start": 2328.48, "end": 2336.0, "text": " for a while if I were to kind of guess at internal dynamics and kind of in business and company", "tokens": [50672, 337, 257, 1339, 498, 286, 645, 281, 733, 295, 2041, 412, 6920, 15679, 293, 733, 295, 294, 1606, 293, 2237, 51048], "temperature": 0.0, "avg_logprob": -0.10991791832721097, "compression_ratio": 1.6828193832599119, "no_speech_prob": 0.004033009521663189}, {"id": 394, "seek": 114232, "start": 2336.0, "end": 2344.8, "text": " level issues that lead to poor performance so yeah kind of could see this coming I think famously", "tokens": [51048, 1496, 2663, 300, 1477, 281, 4716, 3389, 370, 1338, 733, 295, 727, 536, 341, 1348, 286, 519, 34360, 51488], "temperature": 0.0, "avg_logprob": -0.10991791832721097, "compression_ratio": 1.6828193832599119, "no_speech_prob": 0.004033009521663189}, {"id": 395, "seek": 114232, "start": 2344.8, "end": 2348.88, "text": " and and Andre I'm sure you'll have friends that have told you the same thing like at Google", "tokens": [51488, 293, 293, 20667, 286, 478, 988, 291, 603, 362, 1855, 300, 362, 1907, 291, 264, 912, 551, 411, 412, 3329, 51692], "temperature": 0.0, "avg_logprob": -0.10991791832721097, "compression_ratio": 1.6828193832599119, "no_speech_prob": 0.004033009521663189}, {"id": 396, "seek": 116888, "start": 2348.96, "end": 2353.04, "text": " everybody in mind who's ever worked at Google says the same thing and actually same meta you get", "tokens": [50368, 2201, 294, 1575, 567, 311, 1562, 2732, 412, 3329, 1619, 264, 912, 551, 293, 767, 912, 19616, 291, 483, 50572], "temperature": 0.0, "avg_logprob": -0.12619741421479444, "compression_ratio": 1.8877887788778878, "no_speech_prob": 0.015324460342526436}, {"id": 397, "seek": 116888, "start": 2353.04, "end": 2357.6800000000003, "text": " promoted for building new stuff right it's like it's not about did you make the code run more", "tokens": [50572, 21162, 337, 2390, 777, 1507, 558, 309, 311, 411, 309, 311, 406, 466, 630, 291, 652, 264, 3089, 1190, 544, 50804], "temperature": 0.0, "avg_logprob": -0.12619741421479444, "compression_ratio": 1.8877887788778878, "no_speech_prob": 0.015324460342526436}, {"id": 398, "seek": 116888, "start": 2357.6800000000003, "end": 2362.24, "text": " efficiently did you clean this up just clean that up it's like did you make new stuff and that's", "tokens": [50804, 19621, 630, 291, 2541, 341, 493, 445, 2541, 300, 493, 309, 311, 411, 630, 291, 652, 777, 1507, 293, 300, 311, 51032], "temperature": 0.0, "avg_logprob": -0.12619741421479444, "compression_ratio": 1.8877887788778878, "no_speech_prob": 0.015324460342526436}, {"id": 399, "seek": 116888, "start": 2362.24, "end": 2367.04, "text": " why there's a massive app grade of yard right famously for Google products you know Google hangouts", "tokens": [51032, 983, 456, 311, 257, 5994, 724, 7204, 295, 11682, 558, 34360, 337, 3329, 3383, 291, 458, 3329, 3967, 7711, 51272], "temperature": 0.0, "avg_logprob": -0.12619741421479444, "compression_ratio": 1.8877887788778878, "no_speech_prob": 0.015324460342526436}, {"id": 400, "seek": 116888, "start": 2367.04, "end": 2372.24, "text": " Google this Google that that just like gets axed in at various stages there's this fundamental", "tokens": [51272, 3329, 341, 3329, 300, 300, 445, 411, 2170, 6360, 292, 294, 412, 3683, 10232, 456, 311, 341, 8088, 51532], "temperature": 0.0, "avg_logprob": -0.12619741421479444, "compression_ratio": 1.8877887788778878, "no_speech_prob": 0.015324460342526436}, {"id": 401, "seek": 0, "start": 2371.56, "end": 2372.88, "text": " There's this fundamental question of like,", "tokens": [50942, 821, 311, 341, 8088, 1168, 295, 411, 11, 51008], "temperature": 0.0, "avg_logprob": -0.2052779491116971, "compression_ratio": 1.8969072164948453, "no_speech_prob": 0.0822889506816864}, {"id": 402, "seek": 0, "start": 2372.88, "end": 2374.88, "text": " again, is this a feature bug, right?", "tokens": [51008, 797, 11, 307, 341, 257, 4111, 7426, 11, 558, 30, 51108], "temperature": 0.0, "avg_logprob": -0.2052779491116971, "compression_ratio": 1.8969072164948453, "no_speech_prob": 0.0822889506816864}, {"id": 403, "seek": 0, "start": 2374.88, "end": 2376.48, "text": " You can look at Google and you can say,", "tokens": [51108, 509, 393, 574, 412, 3329, 293, 291, 393, 584, 11, 51188], "temperature": 0.0, "avg_logprob": -0.2052779491116971, "compression_ratio": 1.8969072164948453, "no_speech_prob": 0.0822889506816864}, {"id": 404, "seek": 0, "start": 2376.48, "end": 2378.84, "text": " ha ha, look at the grade of yard of waits a time.", "tokens": [51188, 324, 324, 11, 574, 412, 264, 7204, 295, 11682, 295, 40597, 257, 565, 13, 51306], "temperature": 0.0, "avg_logprob": -0.2052779491116971, "compression_ratio": 1.8969072164948453, "no_speech_prob": 0.0822889506816864}, {"id": 405, "seek": 0, "start": 2378.84, "end": 2379.96, "text": " You can also look at Google and say,", "tokens": [51306, 509, 393, 611, 574, 412, 3329, 293, 584, 11, 51362], "temperature": 0.0, "avg_logprob": -0.2052779491116971, "compression_ratio": 1.8969072164948453, "no_speech_prob": 0.0822889506816864}, {"id": 406, "seek": 0, "start": 2379.96, "end": 2382.12, "text": " well, what matters is not the misses,", "tokens": [51362, 731, 11, 437, 7001, 307, 406, 264, 29394, 11, 51470], "temperature": 0.0, "avg_logprob": -0.2052779491116971, "compression_ratio": 1.8969072164948453, "no_speech_prob": 0.0822889506816864}, {"id": 407, "seek": 0, "start": 2382.12, "end": 2383.44, "text": " what matters is the hits.", "tokens": [51470, 437, 7001, 307, 264, 8664, 13, 51536], "temperature": 0.0, "avg_logprob": -0.2052779491116971, "compression_ratio": 1.8969072164948453, "no_speech_prob": 0.0822889506816864}, {"id": 408, "seek": 0, "start": 2383.44, "end": 2386.44, "text": " And for every, not for every Google Hangouts", "tokens": [51536, 400, 337, 633, 11, 406, 337, 633, 3329, 14070, 7711, 51686], "temperature": 0.0, "avg_logprob": -0.2052779491116971, "compression_ratio": 1.8969072164948453, "no_speech_prob": 0.0822889506816864}, {"id": 409, "seek": 0, "start": 2386.44, "end": 2387.64, "text": " or dead Google product,", "tokens": [51686, 420, 3116, 3329, 1674, 11, 51746], "temperature": 0.0, "avg_logprob": -0.2052779491116971, "compression_ratio": 1.8969072164948453, "no_speech_prob": 0.0822889506816864}, {"id": 410, "seek": 2764, "start": 2387.64, "end": 2390.4, "text": " there's like a Google calendar or a Gmail, right?", "tokens": [50364, 456, 311, 411, 257, 3329, 12183, 420, 257, 36732, 11, 558, 30, 50502], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 411, "seek": 2764, "start": 2390.4, "end": 2391.84, "text": " Or maps, right?", "tokens": [50502, 1610, 11317, 11, 558, 30, 50574], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 412, "seek": 2764, "start": 2391.84, "end": 2393.6, "text": " Or maps, that's right, that's right.", "tokens": [50574, 1610, 11317, 11, 300, 311, 558, 11, 300, 311, 558, 13, 50662], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 413, "seek": 2764, "start": 2393.6, "end": 2396.76, "text": " So like Paul Buhheit, famously, you know, now a YC partner,", "tokens": [50662, 407, 411, 4552, 363, 3232, 8480, 11, 34360, 11, 291, 458, 11, 586, 257, 398, 34, 4975, 11, 50820], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 414, "seek": 2764, "start": 2396.76, "end": 2398.4, "text": " I don't know if he's left, YC, he was there when I was there.", "tokens": [50820, 286, 500, 380, 458, 498, 415, 311, 1411, 11, 398, 34, 11, 415, 390, 456, 562, 286, 390, 456, 13, 50902], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 415, "seek": 2764, "start": 2398.4, "end": 2400.56, "text": " But he like was the founder of Gmail", "tokens": [50902, 583, 415, 411, 390, 264, 14917, 295, 36732, 51010], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 416, "seek": 2764, "start": 2400.56, "end": 2403.84, "text": " and his entire like claim to fame and Silicon Valley", "tokens": [51010, 293, 702, 2302, 411, 3932, 281, 16874, 293, 25351, 10666, 51174], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 417, "seek": 2764, "start": 2403.84, "end": 2405.0, "text": " is that he was the Gmail guy.", "tokens": [51174, 307, 300, 415, 390, 264, 36732, 2146, 13, 51232], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 418, "seek": 2764, "start": 2405.0, "end": 2406.92, "text": " That was a company within Google.", "tokens": [51232, 663, 390, 257, 2237, 1951, 3329, 13, 51328], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 419, "seek": 2764, "start": 2406.92, "end": 2408.08, "text": " That's the way to think of it.", "tokens": [51328, 663, 311, 264, 636, 281, 519, 295, 309, 13, 51386], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 420, "seek": 2764, "start": 2408.08, "end": 2409.28, "text": " And certainly from a revenue standpoint,", "tokens": [51386, 400, 3297, 490, 257, 9324, 15827, 11, 51446], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 421, "seek": 2764, "start": 2409.28, "end": 2410.4, "text": " that's what it is, right?", "tokens": [51446, 300, 311, 437, 309, 307, 11, 558, 30, 51502], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 422, "seek": 2764, "start": 2410.4, "end": 2414.48, "text": " So when Sam, who comes out of the YC ecosystem too, right,", "tokens": [51502, 407, 562, 4832, 11, 567, 1487, 484, 295, 264, 398, 34, 11311, 886, 11, 558, 11, 51706], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 423, "seek": 2764, "start": 2414.48, "end": 2417.12, "text": " having been the president of Y Combinator for many years,", "tokens": [51706, 1419, 668, 264, 3868, 295, 398, 2432, 13496, 1639, 337, 867, 924, 11, 51838], "temperature": 0.0, "avg_logprob": -0.18560858701284116, "compression_ratio": 1.7648809523809523, "no_speech_prob": 0.0038387966342270374}, {"id": 424, "seek": 5712, "start": 2417.12, "end": 2418.64, "text": " the first one after Paul Graham,", "tokens": [50364, 264, 700, 472, 934, 4552, 22691, 11, 50440], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 425, "seek": 5712, "start": 2418.64, "end": 2420.96, "text": " his whole thing is spray and pray, right?", "tokens": [50440, 702, 1379, 551, 307, 8519, 293, 3690, 11, 558, 30, 50556], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 426, "seek": 5712, "start": 2420.96, "end": 2423.8, "text": " He's used to betting on us like a little bit of money", "tokens": [50556, 634, 311, 1143, 281, 34246, 322, 505, 411, 257, 707, 857, 295, 1460, 50698], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 427, "seek": 5712, "start": 2423.8, "end": 2425.6, "text": " on a lot of efforts at the same time.", "tokens": [50698, 322, 257, 688, 295, 6484, 412, 264, 912, 565, 13, 50788], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 428, "seek": 5712, "start": 2425.6, "end": 2428.72, "text": " This goes way back to OpenAI's founding days, 2015-ish,", "tokens": [50788, 639, 1709, 636, 646, 281, 7238, 48698, 311, 22223, 1708, 11, 7546, 12, 742, 11, 50944], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 429, "seek": 5712, "start": 2428.72, "end": 2430.36, "text": " where he was doing the spray and pray thing", "tokens": [50944, 689, 415, 390, 884, 264, 8519, 293, 3690, 551, 51026], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 430, "seek": 5712, "start": 2430.36, "end": 2432.12, "text": " on a whole bunch of different things, you know,", "tokens": [51026, 322, 257, 1379, 3840, 295, 819, 721, 11, 291, 458, 11, 51114], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 431, "seek": 5712, "start": 2432.12, "end": 2436.24, "text": " evolutionary methods and robotics and RL and all this stuff.", "tokens": [51114, 27567, 7150, 293, 34145, 293, 497, 43, 293, 439, 341, 1507, 13, 51320], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 432, "seek": 5712, "start": 2436.24, "end": 2439.04, "text": " You know, OpenAI's gym was originally a side project, right?", "tokens": [51320, 509, 458, 11, 7238, 48698, 311, 9222, 390, 7993, 257, 1252, 1716, 11, 558, 30, 51460], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 433, "seek": 5712, "start": 2439.04, "end": 2440.08, "text": " Now it's a big thing.", "tokens": [51460, 823, 309, 311, 257, 955, 551, 13, 51512], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 434, "seek": 5712, "start": 2440.08, "end": 2442.8, "text": " So there's a whole bunch of different plays", "tokens": [51512, 407, 456, 311, 257, 1379, 3840, 295, 819, 5749, 51648], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 435, "seek": 5712, "start": 2442.8, "end": 2445.52, "text": " that you know, he sort of like used to playing in this way.", "tokens": [51648, 300, 291, 458, 11, 415, 1333, 295, 411, 1143, 281, 2433, 294, 341, 636, 13, 51784], "temperature": 0.0, "avg_logprob": -0.15306040907056076, "compression_ratio": 1.7453416149068324, "no_speech_prob": 0.001817027572542429}, {"id": 436, "seek": 8552, "start": 2445.52, "end": 2447.12, "text": " It's worked from the past.", "tokens": [50364, 467, 311, 2732, 490, 264, 1791, 13, 50444], "temperature": 0.0, "avg_logprob": -0.150064372950617, "compression_ratio": 1.6266666666666667, "no_speech_prob": 0.0005956983659416437}, {"id": 437, "seek": 8552, "start": 2447.12, "end": 2453.12, "text": " The challenge is now you're entering a much more mature environment, right?", "tokens": [50444, 440, 3430, 307, 586, 291, 434, 11104, 257, 709, 544, 14442, 2823, 11, 558, 30, 50744], "temperature": 0.0, "avg_logprob": -0.150064372950617, "compression_ratio": 1.6266666666666667, "no_speech_prob": 0.0005956983659416437}, {"id": 438, "seek": 8552, "start": 2453.12, "end": 2455.76, "text": " You're no longer necessarily in the game of betting", "tokens": [50744, 509, 434, 572, 2854, 4725, 294, 264, 1216, 295, 34246, 50876], "temperature": 0.0, "avg_logprob": -0.150064372950617, "compression_ratio": 1.6266666666666667, "no_speech_prob": 0.0005956983659416437}, {"id": 439, "seek": 8552, "start": 2455.76, "end": 2456.92, "text": " on a bunch of different startups.", "tokens": [50876, 322, 257, 3840, 295, 819, 28041, 13, 50934], "temperature": 0.0, "avg_logprob": -0.150064372950617, "compression_ratio": 1.6266666666666667, "no_speech_prob": 0.0005956983659416437}, {"id": 440, "seek": 8552, "start": 2456.92, "end": 2460.92, "text": " You have a kind of core area that you need to dominate right now.", "tokens": [50934, 509, 362, 257, 733, 295, 4965, 1859, 300, 291, 643, 281, 28246, 558, 586, 13, 51134], "temperature": 0.0, "avg_logprob": -0.150064372950617, "compression_ratio": 1.6266666666666667, "no_speech_prob": 0.0005956983659416437}, {"id": 441, "seek": 8552, "start": 2460.92, "end": 2463.36, "text": " You know, anthropic is running away with the pod.", "tokens": [51134, 509, 458, 11, 22727, 299, 307, 2614, 1314, 365, 264, 2497, 13, 51256], "temperature": 0.0, "avg_logprob": -0.150064372950617, "compression_ratio": 1.6266666666666667, "no_speech_prob": 0.0005956983659416437}, {"id": 442, "seek": 8552, "start": 2463.36, "end": 2466.56, "text": " I think it's over 70% market share right now in enterprise.", "tokens": [51256, 286, 519, 309, 311, 670, 5285, 4, 2142, 2073, 558, 586, 294, 14132, 13, 51416], "temperature": 0.0, "avg_logprob": -0.150064372950617, "compression_ratio": 1.6266666666666667, "no_speech_prob": 0.0005956983659416437}, {"id": 443, "seek": 8552, "start": 2466.56, "end": 2468.76, "text": " So that's what is the red alert here", "tokens": [51416, 407, 300, 311, 437, 307, 264, 2182, 9615, 510, 51526], "temperature": 0.0, "avg_logprob": -0.150064372950617, "compression_ratio": 1.6266666666666667, "no_speech_prob": 0.0005956983659416437}, {"id": 444, "seek": 8552, "start": 2468.76, "end": 2471.08, "text": " that Sam Altman is calling in Fiji Simo.", "tokens": [51526, 300, 4832, 15992, 1601, 307, 5141, 294, 479, 26539, 318, 6934, 13, 51642], "temperature": 0.0, "avg_logprob": -0.150064372950617, "compression_ratio": 1.6266666666666667, "no_speech_prob": 0.0005956983659416437}, {"id": 445, "seek": 8552, "start": 2471.08, "end": 2473.56, "text": " So they've got to find a way to kind of like,", "tokens": [51642, 407, 436, 600, 658, 281, 915, 257, 636, 281, 733, 295, 411, 11, 51766], "temperature": 0.0, "avg_logprob": -0.150064372950617, "compression_ratio": 1.6266666666666667, "no_speech_prob": 0.0005956983659416437}, {"id": 446, "seek": 11356, "start": 2473.56, "end": 2477.92, "text": " yes, you got to keep that Google or that YC kind of approach", "tokens": [50364, 2086, 11, 291, 658, 281, 1066, 300, 3329, 420, 300, 398, 34, 733, 295, 3109, 50582], "temperature": 0.0, "avg_logprob": -0.1495537296133606, "compression_ratio": 1.673529411764706, "no_speech_prob": 0.0002603476168587804}, {"id": 447, "seek": 11356, "start": 2477.92, "end": 2479.56, "text": " to investing a whole bunch of things.", "tokens": [50582, 281, 10978, 257, 1379, 3840, 295, 721, 13, 50664], "temperature": 0.0, "avg_logprob": -0.1495537296133606, "compression_ratio": 1.673529411764706, "no_speech_prob": 0.0002603476168587804}, {"id": 448, "seek": 11356, "start": 2479.56, "end": 2481.96, "text": " You never know when the next big thing is going to come from.", "tokens": [50664, 509, 1128, 458, 562, 264, 958, 955, 551, 307, 516, 281, 808, 490, 13, 50784], "temperature": 0.0, "avg_logprob": -0.1495537296133606, "compression_ratio": 1.673529411764706, "no_speech_prob": 0.0002603476168587804}, {"id": 449, "seek": 11356, "start": 2481.96, "end": 2485.16, "text": " But you also have to be able to invest and double down", "tokens": [50784, 583, 291, 611, 362, 281, 312, 1075, 281, 1963, 293, 3834, 760, 50944], "temperature": 0.0, "avg_logprob": -0.1495537296133606, "compression_ratio": 1.673529411764706, "no_speech_prob": 0.0002603476168587804}, {"id": 450, "seek": 11356, "start": 2485.16, "end": 2489.0, "text": " in things that are working that are generating a lot of your revenue.", "tokens": [50944, 294, 721, 300, 366, 1364, 300, 366, 17746, 257, 688, 295, 428, 9324, 13, 51136], "temperature": 0.0, "avg_logprob": -0.1495537296133606, "compression_ratio": 1.673529411764706, "no_speech_prob": 0.0002603476168587804}, {"id": 451, "seek": 11356, "start": 2489.0, "end": 2492.84, "text": " I mean, 25% market share on enterprise is a big, big deal.", "tokens": [51136, 286, 914, 11, 3552, 4, 2142, 2073, 322, 14132, 307, 257, 955, 11, 955, 2028, 13, 51328], "temperature": 0.0, "avg_logprob": -0.1495537296133606, "compression_ratio": 1.673529411764706, "no_speech_prob": 0.0002603476168587804}, {"id": 452, "seek": 11356, "start": 2492.84, "end": 2496.56, "text": " You know, they're making what, $25 billion of annualized revenue", "tokens": [51328, 509, 458, 11, 436, 434, 1455, 437, 11, 1848, 6074, 5218, 295, 9784, 1602, 9324, 51514], "temperature": 0.0, "avg_logprob": -0.1495537296133606, "compression_ratio": 1.673529411764706, "no_speech_prob": 0.0002603476168587804}, {"id": 453, "seek": 11356, "start": 2496.56, "end": 2497.56, "text": " at this point.", "tokens": [51514, 412, 341, 935, 13, 51564], "temperature": 0.0, "avg_logprob": -0.1495537296133606, "compression_ratio": 1.673529411764706, "no_speech_prob": 0.0002603476168587804}, {"id": 454, "seek": 11356, "start": 2497.56, "end": 2498.8, "text": " Like they have stuff that's really working.", "tokens": [51564, 1743, 436, 362, 1507, 300, 311, 534, 1364, 13, 51626], "temperature": 0.0, "avg_logprob": -0.1495537296133606, "compression_ratio": 1.673529411764706, "no_speech_prob": 0.0002603476168587804}, {"id": 455, "seek": 11356, "start": 2498.8, "end": 2500.56, "text": " And so this is a structural shift.", "tokens": [51626, 400, 370, 341, 307, 257, 15067, 5513, 13, 51714], "temperature": 0.0, "avg_logprob": -0.1495537296133606, "compression_ratio": 1.673529411764706, "no_speech_prob": 0.0002603476168587804}, {"id": 456, "seek": 11356, "start": 2500.56, "end": 2502.84, "text": " There's another piece that's happening here where I think there's", "tokens": [51714, 821, 311, 1071, 2522, 300, 311, 2737, 510, 689, 286, 519, 456, 311, 51828], "temperature": 0.0, "avg_logprob": -0.1495537296133606, "compression_ratio": 1.673529411764706, "no_speech_prob": 0.0002603476168587804}, {"id": 457, "seek": 14284, "start": 2502.84, "end": 2506.0, "text": " an over rotation on, oh, well, opening eyes like, you know,", "tokens": [50364, 364, 670, 12447, 322, 11, 1954, 11, 731, 11, 5193, 2575, 411, 11, 291, 458, 11, 50522], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 458, "seek": 14284, "start": 2506.0, "end": 2508.36, "text": " kind of throwing out all these side projects.", "tokens": [50522, 733, 295, 10238, 484, 439, 613, 1252, 4455, 13, 50640], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 459, "seek": 14284, "start": 2508.36, "end": 2510.88, "text": " Sam wrote on X a couple of days ago.", "tokens": [50640, 4832, 4114, 322, 1783, 257, 1916, 295, 1708, 2057, 13, 50766], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 460, "seek": 14284, "start": 2510.88, "end": 2512.48, "text": " He's like, look, there's all these rumors", "tokens": [50766, 634, 311, 411, 11, 574, 11, 456, 311, 439, 613, 21201, 50846], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 461, "seek": 14284, "start": 2512.48, "end": 2515.0, "text": " that we've also canceled the hardware thing with Johnny Ive.", "tokens": [50846, 300, 321, 600, 611, 24839, 264, 8837, 551, 365, 15999, 286, 303, 13, 50972], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 462, "seek": 14284, "start": 2515.0, "end": 2517.6, "text": " And in fact, this article sort of repeats that rumor", "tokens": [50972, 400, 294, 1186, 11, 341, 7222, 1333, 295, 35038, 300, 29639, 51102], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 463, "seek": 14284, "start": 2517.6, "end": 2519.8, "text": " saying that, hey, they, you know, they basically canceled the roll", "tokens": [51102, 1566, 300, 11, 4177, 11, 436, 11, 291, 458, 11, 436, 1936, 24839, 264, 3373, 51212], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 464, "seek": 14284, "start": 2519.8, "end": 2521.6, "text": " out of these AI powered earbuds.", "tokens": [51212, 484, 295, 613, 7318, 17786, 40441, 13, 51302], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 465, "seek": 14284, "start": 2521.6, "end": 2525.56, "text": " Well, actually, at least Sam says the Johnny Ive thing is still live.", "tokens": [51302, 1042, 11, 767, 11, 412, 1935, 4832, 1619, 264, 15999, 286, 303, 551, 307, 920, 1621, 13, 51500], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 466, "seek": 14284, "start": 2525.56, "end": 2528.0, "text": " And so again, it's a question of what about the hits", "tokens": [51500, 400, 370, 797, 11, 309, 311, 257, 1168, 295, 437, 466, 264, 8664, 51622], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 467, "seek": 14284, "start": 2528.0, "end": 2529.52, "text": " not necessarily just the misses,", "tokens": [51622, 406, 4725, 445, 264, 29394, 11, 51698], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 468, "seek": 14284, "start": 2529.52, "end": 2531.68, "text": " but some of the hits now are worth protecting.", "tokens": [51698, 457, 512, 295, 264, 8664, 586, 366, 3163, 12316, 13, 51806], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 469, "seek": 14284, "start": 2531.68, "end": 2532.52, "text": " And that's a challenge.", "tokens": [51806, 400, 300, 311, 257, 3430, 13, 51848], "temperature": 0.0, "avg_logprob": -0.1550657232833463, "compression_ratio": 1.7705382436260624, "no_speech_prob": 0.0013257600367069244}, {"id": 470, "seek": 17252, "start": 2532.52, "end": 2534.48, "text": " If you start to grab the pot, Sam needs to find a way", "tokens": [50364, 759, 291, 722, 281, 4444, 264, 1847, 11, 4832, 2203, 281, 915, 257, 636, 50462], "temperature": 0.0, "avg_logprob": -0.2789357190823737, "compression_ratio": 1.5892255892255893, "no_speech_prob": 0.02241685427725315}, {"id": 471, "seek": 17252, "start": 2534.48, "end": 2536.56, "text": " to hold onto it and expand his territory.", "tokens": [50462, 281, 1797, 3911, 309, 293, 5268, 702, 11360, 13, 50566], "temperature": 0.0, "avg_logprob": -0.2789357190823737, "compression_ratio": 1.5892255892255893, "no_speech_prob": 0.02241685427725315}, {"id": 472, "seek": 17252, "start": 2536.56, "end": 2538.6, "text": " It's not just a land grab.", "tokens": [50566, 467, 311, 406, 445, 257, 2117, 4444, 13, 50668], "temperature": 0.0, "avg_logprob": -0.2789357190823737, "compression_ratio": 1.5892255892255893, "no_speech_prob": 0.02241685427725315}, {"id": 473, "seek": 17252, "start": 2538.6, "end": 2539.6, "text": " Right.", "tokens": [50668, 1779, 13, 50718], "temperature": 0.0, "avg_logprob": -0.2789357190823737, "compression_ratio": 1.5892255892255893, "no_speech_prob": 0.02241685427725315}, {"id": 474, "seek": 17252, "start": 2539.6, "end": 2542.76, "text": " And I think to be fair, Cloud Code initially, the story goes", "tokens": [50718, 400, 286, 519, 281, 312, 3143, 11, 8061, 15549, 9105, 11, 264, 1657, 1709, 50876], "temperature": 0.0, "avg_logprob": -0.2789357190823737, "compression_ratio": 1.5892255892255893, "no_speech_prob": 0.02241685427725315}, {"id": 475, "seek": 17252, "start": 2542.76, "end": 2545.6, "text": " was a single person's kind of side project", "tokens": [50876, 390, 257, 2167, 954, 311, 733, 295, 1252, 1716, 51018], "temperature": 0.0, "avg_logprob": -0.2789357190823737, "compression_ratio": 1.5892255892255893, "no_speech_prob": 0.02241685427725315}, {"id": 476, "seek": 17252, "start": 2545.6, "end": 2549.24, "text": " and it turned into a massive thing because someone pursued it.", "tokens": [51018, 293, 309, 3574, 666, 257, 5994, 551, 570, 1580, 34893, 309, 13, 51200], "temperature": 0.0, "avg_logprob": -0.2789357190823737, "compression_ratio": 1.5892255892255893, "no_speech_prob": 0.02241685427725315}, {"id": 477, "seek": 17252, "start": 2549.24, "end": 2551.36, "text": " Really, what this perhaps indicates", "tokens": [51200, 4083, 11, 437, 341, 4317, 16203, 51306], "temperature": 0.0, "avg_logprob": -0.2789357190823737, "compression_ratio": 1.5892255892255893, "no_speech_prob": 0.02241685427725315}, {"id": 478, "seek": 17252, "start": 2551.36, "end": 2555.6, "text": " is the various bets that open eyes have been making", "tokens": [51306, 307, 264, 3683, 39922, 300, 1269, 2575, 362, 668, 1455, 51518], "temperature": 0.0, "avg_logprob": -0.2789357190823737, "compression_ratio": 1.5892255892255893, "no_speech_prob": 0.02241685427725315}, {"id": 479, "seek": 17252, "start": 2555.6, "end": 2557.92, "text": " with the browser, with the SOAR app.", "tokens": [51518, 365, 264, 11185, 11, 365, 264, 10621, 1899, 724, 13, 51634], "temperature": 0.0, "avg_logprob": -0.2789357190823737, "compression_ratio": 1.5892255892255893, "no_speech_prob": 0.02241685427725315}, {"id": 480, "seek": 17252, "start": 2557.92, "end": 2562.04, "text": " They haven't seemed to really be a hit or like BS.", "tokens": [51634, 814, 2378, 380, 6576, 281, 534, 312, 257, 2045, 420, 411, 27253, 13, 51840], "temperature": 0.0, "avg_logprob": -0.2789357190823737, "compression_ratio": 1.5892255892255893, "no_speech_prob": 0.02241685427725315}, {"id": 481, "seek": 20204, "start": 2562.04, "end": 2564.44, "text": " Nearly as transformational as Cloud Code.", "tokens": [50364, 38000, 382, 4088, 1478, 382, 8061, 15549, 13, 50484], "temperature": 0.0, "avg_logprob": -0.21998268158539483, "compression_ratio": 1.6741573033707866, "no_speech_prob": 0.002396176801994443}, {"id": 482, "seek": 20204, "start": 2564.44, "end": 2567.0, "text": " So they missed out on the Cloud Code moment.", "tokens": [50484, 407, 436, 6721, 484, 322, 264, 8061, 15549, 1623, 13, 50612], "temperature": 0.0, "avg_logprob": -0.21998268158539483, "compression_ratio": 1.6741573033707866, "no_speech_prob": 0.002396176801994443}, {"id": 483, "seek": 20204, "start": 2567.0, "end": 2569.48, "text": " They started catching up late with Codex.", "tokens": [50612, 814, 1409, 16124, 493, 3469, 365, 15549, 87, 13, 50736], "temperature": 0.0, "avg_logprob": -0.21998268158539483, "compression_ratio": 1.6741573033707866, "no_speech_prob": 0.002396176801994443}, {"id": 484, "seek": 20204, "start": 2569.48, "end": 2572.92, "text": " It wasn't until kind of pretty late into last year", "tokens": [50736, 467, 2067, 380, 1826, 733, 295, 1238, 3469, 666, 1036, 1064, 50908], "temperature": 0.0, "avg_logprob": -0.21998268158539483, "compression_ratio": 1.6741573033707866, "no_speech_prob": 0.002396176801994443}, {"id": 485, "seek": 20204, "start": 2572.92, "end": 2576.44, "text": " that I started feeling like they're really taking it seriously.", "tokens": [50908, 300, 286, 1409, 2633, 411, 436, 434, 534, 1940, 309, 6638, 13, 51084], "temperature": 0.0, "avg_logprob": -0.21998268158539483, "compression_ratio": 1.6741573033707866, "no_speech_prob": 0.002396176801994443}, {"id": 486, "seek": 20204, "start": 2576.44, "end": 2580.12, "text": " Already by mid year, if you were kind of feeling a pulse", "tokens": [51084, 23741, 538, 2062, 1064, 11, 498, 291, 645, 733, 295, 2633, 257, 17709, 51268], "temperature": 0.0, "avg_logprob": -0.21998268158539483, "compression_ratio": 1.6741573033707866, "no_speech_prob": 0.002396176801994443}, {"id": 487, "seek": 20204, "start": 2580.12, "end": 2584.24, "text": " of where AI was like people were calling Cloud Code.", "tokens": [51268, 295, 689, 7318, 390, 411, 561, 645, 5141, 8061, 15549, 13, 51474], "temperature": 0.0, "avg_logprob": -0.21998268158539483, "compression_ratio": 1.6741573033707866, "no_speech_prob": 0.002396176801994443}, {"id": 488, "seek": 20204, "start": 2584.24, "end": 2588.8, "text": " And you know, I adopted it sometime around May or June.", "tokens": [51474, 400, 291, 458, 11, 286, 12175, 309, 15053, 926, 1891, 420, 6928, 13, 51702], "temperature": 0.0, "avg_logprob": -0.21998268158539483, "compression_ratio": 1.6741573033707866, "no_speech_prob": 0.002396176801994443}, {"id": 489, "seek": 20204, "start": 2588.8, "end": 2591.6, "text": " Codex didn't really start kicking off", "tokens": [51702, 15549, 87, 994, 380, 534, 722, 19137, 766, 51842], "temperature": 0.0, "avg_logprob": -0.21998268158539483, "compression_ratio": 1.6741573033707866, "no_speech_prob": 0.002396176801994443}, {"id": 490, "seek": 23160, "start": 2591.64, "end": 2594.56, "text": " until maybe even November or later.", "tokens": [50366, 1826, 1310, 754, 7674, 420, 1780, 13, 50512], "temperature": 0.0, "avg_logprob": -0.21158111204794788, "compression_ratio": 1.6744186046511629, "no_speech_prob": 0.001335268490947783}, {"id": 491, "seek": 23160, "start": 2594.56, "end": 2598.64, "text": " So yeah, it really signals kind of more,", "tokens": [50512, 407, 1338, 11, 309, 534, 12354, 733, 295, 544, 11, 50716], "temperature": 0.0, "avg_logprob": -0.21158111204794788, "compression_ratio": 1.6744186046511629, "no_speech_prob": 0.001335268490947783}, {"id": 492, "seek": 23160, "start": 2598.64, "end": 2601.44, "text": " not necessarily not doing these other things,", "tokens": [50716, 406, 4725, 406, 884, 613, 661, 721, 11, 50856], "temperature": 0.0, "avg_logprob": -0.21158111204794788, "compression_ratio": 1.6744186046511629, "no_speech_prob": 0.001335268490947783}, {"id": 493, "seek": 23160, "start": 2601.44, "end": 2603.96, "text": " but realizing that they were left behind", "tokens": [50856, 457, 16734, 300, 436, 645, 1411, 2261, 50982], "temperature": 0.0, "avg_logprob": -0.21158111204794788, "compression_ratio": 1.6744186046511629, "no_speech_prob": 0.001335268490947783}, {"id": 494, "seek": 23160, "start": 2603.96, "end": 2606.56, "text": " and now we need to catch up and start making money", "tokens": [50982, 293, 586, 321, 643, 281, 3745, 493, 293, 722, 1455, 1460, 51112], "temperature": 0.0, "avg_logprob": -0.21158111204794788, "compression_ratio": 1.6744186046511629, "no_speech_prob": 0.001335268490947783}, {"id": 495, "seek": 23160, "start": 2606.56, "end": 2609.32, "text": " with businesses because that's as we often say", "tokens": [51112, 365, 6011, 570, 300, 311, 382, 321, 2049, 584, 51250], "temperature": 0.0, "avg_logprob": -0.21158111204794788, "compression_ratio": 1.6744186046511629, "no_speech_prob": 0.001335268490947783}, {"id": 496, "seek": 23160, "start": 2609.32, "end": 2612.28, "text": " where you make your money much more easily", "tokens": [51250, 689, 291, 652, 428, 1460, 709, 544, 3612, 51398], "temperature": 0.0, "avg_logprob": -0.21158111204794788, "compression_ratio": 1.6744186046511629, "no_speech_prob": 0.001335268490947783}, {"id": 497, "seek": 23160, "start": 2612.28, "end": 2614.16, "text": " than with consumer products.", "tokens": [51398, 813, 365, 9711, 3383, 13, 51492], "temperature": 0.0, "avg_logprob": -0.21158111204794788, "compression_ratio": 1.6744186046511629, "no_speech_prob": 0.001335268490947783}, {"id": 498, "seek": 23160, "start": 2614.16, "end": 2616.48, "text": " Yeah, much more, you ultimately need to go straight", "tokens": [51492, 865, 11, 709, 544, 11, 291, 6284, 643, 281, 352, 2997, 51608], "temperature": 0.0, "avg_logprob": -0.21158111204794788, "compression_ratio": 1.6744186046511629, "no_speech_prob": 0.001335268490947783}, {"id": 499, "seek": 23160, "start": 2616.48, "end": 2619.4, "text": " to consumer to get the big pot down on others.", "tokens": [51608, 281, 9711, 281, 483, 264, 955, 1847, 760, 322, 2357, 13, 51754], "temperature": 0.0, "avg_logprob": -0.21158111204794788, "compression_ratio": 1.6744186046511629, "no_speech_prob": 0.001335268490947783}, {"id": 500, "seek": 25940, "start": 2619.64, "end": 2621.84, "text": " But it's early in the get like it is always early", "tokens": [50376, 583, 309, 311, 2440, 294, 264, 483, 411, 309, 307, 1009, 2440, 50486], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 501, "seek": 25940, "start": 2621.84, "end": 2623.36, "text": " in the game and AI and to your point,", "tokens": [50486, 294, 264, 1216, 293, 7318, 293, 281, 428, 935, 11, 50562], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 502, "seek": 25940, "start": 2623.36, "end": 2626.56, "text": " I mean, this is, and I can't remember if this is a Napoleon thing", "tokens": [50562, 286, 914, 11, 341, 307, 11, 293, 286, 393, 380, 1604, 498, 341, 307, 257, 31694, 551, 50722], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 503, "seek": 25940, "start": 2626.56, "end": 2627.68, "text": " or an Alexander the Great thing,", "tokens": [50722, 420, 364, 14845, 264, 3769, 551, 11, 50778], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 504, "seek": 25940, "start": 2627.68, "end": 2630.2799999999997, "text": " but people would say like he could achieve a victory", "tokens": [50778, 457, 561, 576, 584, 411, 415, 727, 4584, 257, 9812, 50908], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 505, "seek": 25940, "start": 2630.2799999999997, "end": 2631.4, "text": " but not use it.", "tokens": [50908, 457, 406, 764, 309, 13, 50964], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 506, "seek": 25940, "start": 2631.4, "end": 2632.48, "text": " This is kind of that, right?", "tokens": [50964, 639, 307, 733, 295, 300, 11, 558, 30, 51018], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 507, "seek": 25940, "start": 2632.48, "end": 2634.08, "text": " There's two different modes in the space.", "tokens": [51018, 821, 311, 732, 819, 14068, 294, 264, 1901, 13, 51098], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 508, "seek": 25940, "start": 2634.08, "end": 2636.36, "text": " First is getting your product market fit", "tokens": [51098, 2386, 307, 1242, 428, 1674, 2142, 3318, 51212], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 509, "seek": 25940, "start": 2636.36, "end": 2638.84, "text": " and then it's holding on to it in the face of competition", "tokens": [51212, 293, 550, 309, 311, 5061, 322, 281, 309, 294, 264, 1851, 295, 6211, 51336], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 510, "seek": 25940, "start": 2638.84, "end": 2642.0, "text": " and then kind of attacking other people's modes.", "tokens": [51336, 293, 550, 733, 295, 15010, 661, 561, 311, 14068, 13, 51494], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 511, "seek": 25940, "start": 2642.0, "end": 2644.2799999999997, "text": " And that second one is it's a shift.", "tokens": [51494, 400, 300, 1150, 472, 307, 309, 311, 257, 5513, 13, 51608], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 512, "seek": 25940, "start": 2644.2799999999997, "end": 2646.32, "text": " It's a fundamental kind of mindset shift", "tokens": [51608, 467, 311, 257, 8088, 733, 295, 12543, 5513, 51710], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 513, "seek": 25940, "start": 2646.32, "end": 2648.8, "text": " that you got to get into and it looks different for AI", "tokens": [51710, 300, 291, 658, 281, 483, 666, 293, 309, 1542, 819, 337, 7318, 51834], "temperature": 0.0, "avg_logprob": -0.1634380445771274, "compression_ratio": 1.780058651026393, "no_speech_prob": 0.0027799620293080807}, {"id": 514, "seek": 28880, "start": 2648.8, "end": 2650.8, "text": " and for traditional SaaS companies too.", "tokens": [50364, 293, 337, 5164, 49733, 3431, 886, 13, 50464], "temperature": 0.0, "avg_logprob": -0.15718867975895798, "compression_ratio": 1.597938144329897, "no_speech_prob": 0.0019181035459041595}, {"id": 515, "seek": 28880, "start": 2650.8, "end": 2652.76, "text": " Like you just can't sit on a stack of software", "tokens": [50464, 1743, 291, 445, 393, 380, 1394, 322, 257, 8630, 295, 4722, 50562], "temperature": 0.0, "avg_logprob": -0.15718867975895798, "compression_ratio": 1.597938144329897, "no_speech_prob": 0.0019181035459041595}, {"id": 516, "seek": 28880, "start": 2652.76, "end": 2655.44, "text": " for like a month and expect it to hold on to market share.", "tokens": [50562, 337, 411, 257, 1618, 293, 2066, 309, 281, 1797, 322, 281, 2142, 2073, 13, 50696], "temperature": 0.0, "avg_logprob": -0.15718867975895798, "compression_ratio": 1.597938144329897, "no_speech_prob": 0.0019181035459041595}, {"id": 517, "seek": 28880, "start": 2655.44, "end": 2657.16, "text": " So yeah, it's a really interesting space.", "tokens": [50696, 407, 1338, 11, 309, 311, 257, 534, 1880, 1901, 13, 50782], "temperature": 0.0, "avg_logprob": -0.15718867975895798, "compression_ratio": 1.597938144329897, "no_speech_prob": 0.0019181035459041595}, {"id": 518, "seek": 28880, "start": 2657.16, "end": 2658.96, "text": " We're learning a lot in real time right now", "tokens": [50782, 492, 434, 2539, 257, 688, 294, 957, 565, 558, 586, 50872], "temperature": 0.0, "avg_logprob": -0.15718867975895798, "compression_ratio": 1.597938144329897, "no_speech_prob": 0.0019181035459041595}, {"id": 519, "seek": 28880, "start": 2658.96, "end": 2661.28, "text": " about what it takes to gain market share", "tokens": [50872, 466, 437, 309, 2516, 281, 6052, 2142, 2073, 50988], "temperature": 0.0, "avg_logprob": -0.15718867975895798, "compression_ratio": 1.597938144329897, "no_speech_prob": 0.0019181035459041595}, {"id": 520, "seek": 28880, "start": 2661.28, "end": 2663.7200000000003, "text": " and hold market share in a world of low switching costs", "tokens": [50988, 293, 1797, 2142, 2073, 294, 257, 1002, 295, 2295, 16493, 5497, 51110], "temperature": 0.0, "avg_logprob": -0.15718867975895798, "compression_ratio": 1.597938144329897, "no_speech_prob": 0.0019181035459041595}, {"id": 521, "seek": 28880, "start": 2663.7200000000003, "end": 2665.44, "text": " and very rapid advancement.", "tokens": [51110, 293, 588, 7558, 35764, 13, 51196], "temperature": 0.0, "avg_logprob": -0.15718867975895798, "compression_ratio": 1.597938144329897, "no_speech_prob": 0.0019181035459041595}, {"id": 522, "seek": 28880, "start": 2666.32, "end": 2669.56, "text": " Next up, moving back to GTC.", "tokens": [51240, 3087, 493, 11, 2684, 646, 281, 460, 18238, 13, 51402], "temperature": 0.0, "avg_logprob": -0.15718867975895798, "compression_ratio": 1.597938144329897, "no_speech_prob": 0.0019181035459041595}, {"id": 523, "seek": 28880, "start": 2669.56, "end": 2671.32, "text": " We have some more details.", "tokens": [51402, 492, 362, 512, 544, 4365, 13, 51490], "temperature": 0.0, "avg_logprob": -0.15718867975895798, "compression_ratio": 1.597938144329897, "no_speech_prob": 0.0019181035459041595}, {"id": 524, "seek": 28880, "start": 2671.32, "end": 2675.7200000000003, "text": " One other thing that Jensen Huang, the CEO and video", "tokens": [51490, 1485, 661, 551, 300, 508, 32934, 28073, 11, 264, 9282, 293, 960, 51710], "temperature": 0.0, "avg_logprob": -0.15718867975895798, "compression_ratio": 1.597938144329897, "no_speech_prob": 0.0019181035459041595}, {"id": 525, "seek": 31572, "start": 2675.76, "end": 2680.36, "text": " has announced is that it looks like purchase orders", "tokens": [50366, 575, 7548, 307, 300, 309, 1542, 411, 8110, 9470, 50596], "temperature": 0.0, "avg_logprob": -0.1724026807366985, "compression_ratio": 1.4108910891089108, "no_speech_prob": 0.0024609535466879606}, {"id": 526, "seek": 31572, "start": 2680.36, "end": 2684.04, "text": " for Blackwell and Vera Rubin chips.", "tokens": [50596, 337, 4076, 6326, 293, 46982, 10518, 259, 11583, 13, 50780], "temperature": 0.0, "avg_logprob": -0.1724026807366985, "compression_ratio": 1.4108910891089108, "no_speech_prob": 0.0024609535466879606}, {"id": 527, "seek": 31572, "start": 2684.04, "end": 2687.28, "text": " These are the newest generation of chips.", "tokens": [50780, 1981, 366, 264, 17569, 5125, 295, 11583, 13, 50942], "temperature": 0.0, "avg_logprob": -0.1724026807366985, "compression_ratio": 1.4108910891089108, "no_speech_prob": 0.0024609535466879606}, {"id": 528, "seek": 31572, "start": 2687.28, "end": 2691.36, "text": " I expected to reach $1 trillion through 2027", "tokens": [50942, 286, 5176, 281, 2524, 1848, 16, 18723, 807, 945, 10076, 51146], "temperature": 0.0, "avg_logprob": -0.1724026807366985, "compression_ratio": 1.4108910891089108, "no_speech_prob": 0.0024609535466879606}, {"id": 529, "seek": 31572, "start": 2691.36, "end": 2696.36, "text": " bubbling last year's projection of 500 billion revenue", "tokens": [51146, 46360, 1036, 1064, 311, 22743, 295, 5923, 5218, 9324, 51396], "temperature": 0.0, "avg_logprob": -0.1724026807366985, "compression_ratio": 1.4108910891089108, "no_speech_prob": 0.0024609535466879606}, {"id": 530, "seek": 31572, "start": 2696.84, "end": 2698.04, "text": " opportunity.", "tokens": [51420, 2650, 13, 51480], "temperature": 0.0, "avg_logprob": -0.1724026807366985, "compression_ratio": 1.4108910891089108, "no_speech_prob": 0.0024609535466879606}, {"id": 531, "seek": 31572, "start": 2698.04, "end": 2701.36, "text": " He's saying that the key that's driving it", "tokens": [51480, 634, 311, 1566, 300, 264, 2141, 300, 311, 4840, 309, 51646], "temperature": 0.0, "avg_logprob": -0.1724026807366985, "compression_ratio": 1.4108910891089108, "no_speech_prob": 0.0024609535466879606}, {"id": 532, "seek": 34136, "start": 2701.36, "end": 2705.7200000000003, "text": " is a shift from chatbots to agent AI applications", "tokens": [50364, 307, 257, 5513, 490, 5081, 65, 1971, 281, 9461, 7318, 5821, 50582], "temperature": 0.0, "avg_logprob": -0.1997612307554689, "compression_ratio": 1.5862068965517242, "no_speech_prob": 0.004445742350071669}, {"id": 533, "seek": 34136, "start": 2705.7200000000003, "end": 2708.16, "text": " which do require much more compute.", "tokens": [50582, 597, 360, 3651, 709, 544, 14722, 13, 50704], "temperature": 0.0, "avg_logprob": -0.1997612307554689, "compression_ratio": 1.5862068965517242, "no_speech_prob": 0.004445742350071669}, {"id": 534, "seek": 34136, "start": 2708.16, "end": 2710.96, "text": " You're just generating a lot more output.", "tokens": [50704, 509, 434, 445, 17746, 257, 688, 544, 5598, 13, 50844], "temperature": 0.0, "avg_logprob": -0.1997612307554689, "compression_ratio": 1.5862068965517242, "no_speech_prob": 0.004445742350071669}, {"id": 535, "seek": 34136, "start": 2710.96, "end": 2714.04, "text": " And if you have open cloud, which isn't always on agent,", "tokens": [50844, 400, 498, 291, 362, 1269, 4588, 11, 597, 1943, 380, 1009, 322, 9461, 11, 50998], "temperature": 0.0, "avg_logprob": -0.1997612307554689, "compression_ratio": 1.5862068965517242, "no_speech_prob": 0.004445742350071669}, {"id": 536, "seek": 34136, "start": 2714.04, "end": 2717.92, "text": " but you let go off and work for hours at a time.", "tokens": [50998, 457, 291, 718, 352, 766, 293, 589, 337, 2496, 412, 257, 565, 13, 51192], "temperature": 0.0, "avg_logprob": -0.1997612307554689, "compression_ratio": 1.5862068965517242, "no_speech_prob": 0.004445742350071669}, {"id": 537, "seek": 34136, "start": 2717.92, "end": 2721.64, "text": " And this is where things are heading, at least if you're", "tokens": [51192, 400, 341, 307, 689, 721, 366, 9864, 11, 412, 1935, 498, 291, 434, 51378], "temperature": 0.0, "avg_logprob": -0.1997612307554689, "compression_ratio": 1.5862068965517242, "no_speech_prob": 0.004445742350071669}, {"id": 538, "seek": 34136, "start": 2721.64, "end": 2724.08, "text": " kind of a believer in the trends, right?", "tokens": [51378, 733, 295, 257, 23892, 294, 264, 13892, 11, 558, 30, 51500], "temperature": 0.0, "avg_logprob": -0.1997612307554689, "compression_ratio": 1.5862068965517242, "no_speech_prob": 0.004445742350071669}, {"id": 539, "seek": 34136, "start": 2724.08, "end": 2726.44, "text": " They've been half a year, a year.", "tokens": [51500, 814, 600, 668, 1922, 257, 1064, 11, 257, 1064, 13, 51618], "temperature": 0.0, "avg_logprob": -0.1997612307554689, "compression_ratio": 1.5862068965517242, "no_speech_prob": 0.004445742350071669}, {"id": 540, "seek": 34136, "start": 2726.44, "end": 2729.12, "text": " We could expect agents just working on their own", "tokens": [51618, 492, 727, 2066, 12554, 445, 1364, 322, 641, 1065, 51752], "temperature": 0.0, "avg_logprob": -0.1997612307554689, "compression_ratio": 1.5862068965517242, "no_speech_prob": 0.004445742350071669}, {"id": 541, "seek": 36912, "start": 2729.12, "end": 2732.88, "text": " for many hours at a time, even days at a time.", "tokens": [50364, 337, 867, 2496, 412, 257, 565, 11, 754, 1708, 412, 257, 565, 13, 50552], "temperature": 0.0, "avg_logprob": -0.15058182858758504, "compression_ratio": 1.5483870967741935, "no_speech_prob": 0.013132276013493538}, {"id": 542, "seek": 36912, "start": 2732.88, "end": 2737.8, "text": " And so the bet is that it's going to keep happening.", "tokens": [50552, 400, 370, 264, 778, 307, 300, 309, 311, 516, 281, 1066, 2737, 13, 50798], "temperature": 0.0, "avg_logprob": -0.15058182858758504, "compression_ratio": 1.5483870967741935, "no_speech_prob": 0.013132276013493538}, {"id": 543, "seek": 36912, "start": 2737.8, "end": 2742.8, "text": " They also unveiled the GROC free language processing unit", "tokens": [50798, 814, 611, 47430, 264, 460, 7142, 34, 1737, 2856, 9007, 4985, 51048], "temperature": 0.0, "avg_logprob": -0.15058182858758504, "compression_ratio": 1.5483870967741935, "no_speech_prob": 0.013132276013493538}, {"id": 544, "seek": 36912, "start": 2743.32, "end": 2746.64, "text": " which is the first announcement related to GROC", "tokens": [51074, 597, 307, 264, 700, 12847, 4077, 281, 460, 7142, 34, 51240], "temperature": 0.0, "avg_logprob": -0.15058182858758504, "compression_ratio": 1.5483870967741935, "no_speech_prob": 0.013132276013493538}, {"id": 545, "seek": 36912, "start": 2746.64, "end": 2751.64, "text": " since their 20 billion asset purchase in December.", "tokens": [51240, 1670, 641, 945, 5218, 11999, 8110, 294, 7687, 13, 51490], "temperature": 0.0, "avg_logprob": -0.15058182858758504, "compression_ratio": 1.5483870967741935, "no_speech_prob": 0.013132276013493538}, {"id": 546, "seek": 36912, "start": 2751.64, "end": 2754.04, "text": " GROC, their language processing unit,", "tokens": [51490, 460, 7142, 34, 11, 641, 2856, 9007, 4985, 11, 51610], "temperature": 0.0, "avg_logprob": -0.15058182858758504, "compression_ratio": 1.5483870967741935, "no_speech_prob": 0.013132276013493538}, {"id": 547, "seek": 36912, "start": 2754.04, "end": 2756.68, "text": " has been a very impressive piece of work.", "tokens": [51610, 575, 668, 257, 588, 8992, 2522, 295, 589, 13, 51742], "temperature": 0.0, "avg_logprob": -0.15058182858758504, "compression_ratio": 1.5483870967741935, "no_speech_prob": 0.013132276013493538}, {"id": 548, "seek": 39668, "start": 2756.68, "end": 2759.48, "text": " GROC has really high throughput time", "tokens": [50364, 460, 7142, 34, 575, 534, 1090, 44629, 565, 50504], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 549, "seek": 39668, "start": 2759.48, "end": 2763.12, "text": " and it is just a great option for running inference", "tokens": [50504, 293, 309, 307, 445, 257, 869, 3614, 337, 2614, 38253, 50686], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 550, "seek": 39668, "start": 2763.12, "end": 2764.32, "text": " if you need speed.", "tokens": [50686, 498, 291, 643, 3073, 13, 50746], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 551, "seek": 39668, "start": 2764.32, "end": 2767.92, "text": " So that I think is also quite significant.", "tokens": [50746, 407, 300, 286, 519, 307, 611, 1596, 4776, 13, 50926], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 552, "seek": 39668, "start": 2767.92, "end": 2770.8, "text": " Yeah, that's a rate of 20 billion dollar purchase", "tokens": [50926, 865, 11, 300, 311, 257, 3314, 295, 945, 5218, 7241, 8110, 51070], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 553, "seek": 39668, "start": 2770.8, "end": 2773.32, "text": " of GROC by Nvidia a few months ago.", "tokens": [51070, 295, 460, 7142, 34, 538, 46284, 257, 1326, 2493, 2057, 13, 51196], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 554, "seek": 39668, "start": 2773.32, "end": 2775.56, "text": " And yeah, I guess that was December.", "tokens": [51196, 400, 1338, 11, 286, 2041, 300, 390, 7687, 13, 51308], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 555, "seek": 39668, "start": 2775.56, "end": 2777.28, "text": " Huge deal obviously.", "tokens": [51308, 37043, 2028, 2745, 13, 51394], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 556, "seek": 39668, "start": 2777.28, "end": 2779.04, "text": " Already, they're looking to ship a product", "tokens": [51394, 23741, 11, 436, 434, 1237, 281, 5374, 257, 1674, 51482], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 557, "seek": 39668, "start": 2779.04, "end": 2779.88, "text": " in the third quarter of this.", "tokens": [51482, 294, 264, 2636, 6555, 295, 341, 13, 51524], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 558, "seek": 39668, "start": 2779.88, "end": 2780.64, "text": " Yeah, that's fast.", "tokens": [51524, 865, 11, 300, 311, 2370, 13, 51562], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 559, "seek": 39668, "start": 2780.64, "end": 2781.96, "text": " Under a year from acquisition", "tokens": [51562, 6974, 257, 1064, 490, 21668, 51628], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 560, "seek": 39668, "start": 2781.96, "end": 2783.2, "text": " and they're already fully integrated", "tokens": [51628, 293, 436, 434, 1217, 4498, 10919, 51690], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 561, "seek": 39668, "start": 2783.2, "end": 2785.92, "text": " and pumping out stuff just as a reminder here.", "tokens": [51690, 293, 27131, 484, 1507, 445, 382, 257, 13548, 510, 13, 51826], "temperature": 0.0, "avg_logprob": -0.20450376091258868, "compression_ratio": 1.5723270440251573, "no_speech_prob": 0.06866530328989029}, {"id": 562, "seek": 42592, "start": 2785.92, "end": 2788.36, "text": " So when you think about GROC's LPU,", "tokens": [50364, 407, 562, 291, 519, 466, 460, 7142, 34, 311, 441, 8115, 11, 50486], "temperature": 0.0, "avg_logprob": -0.16861798352208632, "compression_ratio": 1.69, "no_speech_prob": 0.0003500048478599638}, {"id": 563, "seek": 42592, "start": 2788.36, "end": 2790.12, "text": " language processing unit,", "tokens": [50486, 2856, 9007, 4985, 11, 50574], "temperature": 0.0, "avg_logprob": -0.16861798352208632, "compression_ratio": 1.69, "no_speech_prob": 0.0003500048478599638}, {"id": 564, "seek": 42592, "start": 2790.12, "end": 2792.28, "text": " go back to our hardware episode to do the deep dive on this.", "tokens": [50574, 352, 646, 281, 527, 8837, 3500, 281, 360, 264, 2452, 9192, 322, 341, 13, 50682], "temperature": 0.0, "avg_logprob": -0.16861798352208632, "compression_ratio": 1.69, "no_speech_prob": 0.0003500048478599638}, {"id": 565, "seek": 42592, "start": 2792.28, "end": 2796.24, "text": " But basically, you traditionally have in any kind of GPU,", "tokens": [50682, 583, 1936, 11, 291, 19067, 362, 294, 604, 733, 295, 18407, 11, 50880], "temperature": 0.0, "avg_logprob": -0.16861798352208632, "compression_ratio": 1.69, "no_speech_prob": 0.0003500048478599638}, {"id": 566, "seek": 42592, "start": 2796.24, "end": 2797.76, "text": " like the Nvidia GPUs,", "tokens": [50880, 411, 264, 46284, 18407, 82, 11, 50956], "temperature": 0.0, "avg_logprob": -0.16861798352208632, "compression_ratio": 1.69, "no_speech_prob": 0.0003500048478599638}, {"id": 567, "seek": 42592, "start": 2797.76, "end": 2800.24, "text": " you'll have a stack of high band with memory, HBM,", "tokens": [50956, 291, 603, 362, 257, 8630, 295, 1090, 4116, 365, 4675, 11, 389, 18345, 11, 51080], "temperature": 0.0, "avg_logprob": -0.16861798352208632, "compression_ratio": 1.69, "no_speech_prob": 0.0003500048478599638}, {"id": 568, "seek": 42592, "start": 2800.24, "end": 2802.7200000000003, "text": " that's going to sit next to the logic die.", "tokens": [51080, 300, 311, 516, 281, 1394, 958, 281, 264, 9952, 978, 13, 51204], "temperature": 0.0, "avg_logprob": -0.16861798352208632, "compression_ratio": 1.69, "no_speech_prob": 0.0003500048478599638}, {"id": 569, "seek": 42592, "start": 2802.7200000000003, "end": 2805.56, "text": " And the logic die is what actually does the matrix math,", "tokens": [51204, 400, 264, 9952, 978, 307, 437, 767, 775, 264, 8141, 5221, 11, 51346], "temperature": 0.0, "avg_logprob": -0.16861798352208632, "compression_ratio": 1.69, "no_speech_prob": 0.0003500048478599638}, {"id": 570, "seek": 42592, "start": 2805.56, "end": 2808.28, "text": " the computations that are interesting, the number crunching,", "tokens": [51346, 264, 2807, 763, 300, 366, 1880, 11, 264, 1230, 13386, 278, 11, 51482], "temperature": 0.0, "avg_logprob": -0.16861798352208632, "compression_ratio": 1.69, "no_speech_prob": 0.0003500048478599638}, {"id": 571, "seek": 42592, "start": 2808.28, "end": 2812.16, "text": " but it has to pull data from that high band with memory,", "tokens": [51482, 457, 309, 575, 281, 2235, 1412, 490, 300, 1090, 4116, 365, 4675, 11, 51676], "temperature": 0.0, "avg_logprob": -0.16861798352208632, "compression_ratio": 1.69, "no_speech_prob": 0.0003500048478599638}, {"id": 572, "seek": 42592, "start": 2812.16, "end": 2814.7200000000003, "text": " the stack of HBM that's next to it.", "tokens": [51676, 264, 8630, 295, 389, 18345, 300, 311, 958, 281, 309, 13, 51804], "temperature": 0.0, "avg_logprob": -0.16861798352208632, "compression_ratio": 1.69, "no_speech_prob": 0.0003500048478599638}, {"id": 573, "seek": 45472, "start": 2814.76, "end": 2816.76, "text": " And pulling that data takes time", "tokens": [50366, 400, 8407, 300, 1412, 2516, 565, 50466], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 574, "seek": 45472, "start": 2816.76, "end": 2818.96, "text": " because they're physically just like different objects, right?", "tokens": [50466, 570, 436, 434, 9762, 445, 411, 819, 6565, 11, 558, 30, 50576], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 575, "seek": 45472, "start": 2818.96, "end": 2820.88, "text": " The HBM is a stack and then you have a logic die", "tokens": [50576, 440, 389, 18345, 307, 257, 8630, 293, 550, 291, 362, 257, 9952, 978, 50672], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 576, "seek": 45472, "start": 2820.88, "end": 2822.28, "text": " and they're packaged together,", "tokens": [50672, 293, 436, 434, 38162, 1214, 11, 50742], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 577, "seek": 45472, "start": 2822.28, "end": 2824.36, "text": " but you run into this challenge", "tokens": [50742, 457, 291, 1190, 666, 341, 3430, 50846], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 578, "seek": 45472, "start": 2824.36, "end": 2826.12, "text": " and it just has to travel to the memory,", "tokens": [50846, 293, 309, 445, 575, 281, 3147, 281, 264, 4675, 11, 50934], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 579, "seek": 45472, "start": 2826.12, "end": 2828.68, "text": " grab the data and come back that creates that memory wall", "tokens": [50934, 4444, 264, 1412, 293, 808, 646, 300, 7829, 300, 4675, 2929, 51062], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 580, "seek": 45472, "start": 2828.68, "end": 2831.0, "text": " where the processor spends like 70% of its time", "tokens": [51062, 689, 264, 15321, 25620, 411, 5285, 4, 295, 1080, 565, 51178], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 581, "seek": 45472, "start": 2831.0, "end": 2832.52, "text": " just waiting for stuff, right?", "tokens": [51178, 445, 3806, 337, 1507, 11, 558, 30, 51254], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 582, "seek": 45472, "start": 2832.52, "end": 2834.8, "text": " So you've got a kind of cold starting issue there.", "tokens": [51254, 407, 291, 600, 658, 257, 733, 295, 3554, 2891, 2734, 456, 13, 51368], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 583, "seek": 45472, "start": 2834.8, "end": 2836.96, "text": " And the GROC, like language processing units,", "tokens": [51368, 400, 264, 460, 7142, 34, 11, 411, 2856, 9007, 6815, 11, 51476], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 584, "seek": 45472, "start": 2836.96, "end": 2841.52, "text": " the LPUs, they use a kind of memory called SRAM", "tokens": [51476, 264, 441, 8115, 82, 11, 436, 764, 257, 733, 295, 4675, 1219, 20840, 2865, 51704], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 585, "seek": 45472, "start": 2841.52, "end": 2843.84, "text": " that is built directly into the silicon.", "tokens": [51704, 300, 307, 3094, 3838, 666, 264, 22848, 13, 51820], "temperature": 0.0, "avg_logprob": -0.13437581529804304, "compression_ratio": 1.6695906432748537, "no_speech_prob": 0.00021786823344882578}, {"id": 586, "seek": 48384, "start": 2843.84, "end": 2846.92, "text": " So the logic and the memory are much more intimately linked.", "tokens": [50364, 407, 264, 9952, 293, 264, 4675, 366, 709, 544, 560, 5401, 9408, 13, 50518], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 587, "seek": 48384, "start": 2846.92, "end": 2849.36, "text": " So the data doesn't have to travel, it's already there.", "tokens": [50518, 407, 264, 1412, 1177, 380, 362, 281, 3147, 11, 309, 311, 1217, 456, 13, 50640], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 588, "seek": 48384, "start": 2849.36, "end": 2852.7200000000003, "text": " So you get this massive increase in sort of like internal", "tokens": [50640, 407, 291, 483, 341, 5994, 3488, 294, 1333, 295, 411, 6920, 50808], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 589, "seek": 48384, "start": 2852.7200000000003, "end": 2854.88, "text": " memory bandwidth, so how much memory,", "tokens": [50808, 4675, 23647, 11, 370, 577, 709, 4675, 11, 50916], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 590, "seek": 48384, "start": 2854.88, "end": 2857.24, "text": " how much data per second can flow", "tokens": [50916, 577, 709, 1412, 680, 1150, 393, 3095, 51034], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 591, "seek": 48384, "start": 2857.24, "end": 2859.16, "text": " between the relevant components,", "tokens": [51034, 1296, 264, 7340, 6677, 11, 51130], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 592, "seek": 48384, "start": 2859.16, "end": 2862.24, "text": " some like 10 to 20% faster on the GROC units.", "tokens": [51130, 512, 411, 1266, 281, 945, 4, 4663, 322, 264, 460, 7142, 34, 6815, 13, 51284], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 593, "seek": 48384, "start": 2862.24, "end": 2864.16, "text": " So this is really, really important", "tokens": [51284, 407, 341, 307, 534, 11, 534, 1021, 51380], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 594, "seek": 48384, "start": 2864.16, "end": 2866.0, "text": " because the Nvidia is going to be integrating this", "tokens": [51380, 570, 264, 46284, 307, 516, 281, 312, 26889, 341, 51472], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 595, "seek": 48384, "start": 2866.0, "end": 2868.16, "text": " with their viewer Rubin architecture.", "tokens": [51472, 365, 641, 16767, 10518, 259, 9482, 13, 51580], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 596, "seek": 48384, "start": 2868.16, "end": 2870.44, "text": " That's the next generation after Blackwell", "tokens": [51580, 663, 311, 264, 958, 5125, 934, 4076, 6326, 51694], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 597, "seek": 48384, "start": 2870.44, "end": 2872.84, "text": " and kind of doing a hybrid of these two ideas.", "tokens": [51694, 293, 733, 295, 884, 257, 13051, 295, 613, 732, 3487, 13, 51814], "temperature": 0.0, "avg_logprob": -0.17203043582183974, "compression_ratio": 1.6615384615384616, "no_speech_prob": 0.00017191759252455086}, {"id": 598, "seek": 51284, "start": 2873.32, "end": 2875.28, "text": " The idea is they're going to have HBM", "tokens": [50388, 440, 1558, 307, 436, 434, 516, 281, 362, 389, 18345, 50486], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 599, "seek": 51284, "start": 2875.28, "end": 2877.36, "text": " sitting outside the main processor,", "tokens": [50486, 3798, 2380, 264, 2135, 15321, 11, 50590], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 600, "seek": 51284, "start": 2877.36, "end": 2881.36, "text": " but also integrating some of this SRAM heavy compute tiles", "tokens": [50590, 457, 611, 26889, 512, 295, 341, 20840, 2865, 4676, 14722, 21982, 50790], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 601, "seek": 51284, "start": 2881.36, "end": 2883.6, "text": " that are directly going to be in the Rubin chips.", "tokens": [50790, 300, 366, 3838, 516, 281, 312, 294, 264, 10518, 259, 11583, 13, 50902], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 602, "seek": 51284, "start": 2883.6, "end": 2886.88, "text": " And so this is a really big merger,", "tokens": [50902, 400, 370, 341, 307, 257, 534, 955, 48002, 11, 51066], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 603, "seek": 51284, "start": 2886.88, "end": 2889.64, "text": " a physical merger of these companies that we're seeing here.", "tokens": [51066, 257, 4001, 48002, 295, 613, 3431, 300, 321, 434, 2577, 510, 13, 51204], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 604, "seek": 51284, "start": 2889.64, "end": 2891.08, "text": " So yeah, it's pretty wild.", "tokens": [51204, 407, 1338, 11, 309, 311, 1238, 4868, 13, 51276], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 605, "seek": 51284, "start": 2891.08, "end": 2894.4, "text": " It also means a really intense demand for memory", "tokens": [51276, 467, 611, 1355, 257, 534, 9447, 4733, 337, 4675, 51442], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 606, "seek": 51284, "start": 2894.4, "end": 2895.44, "text": " in the memory market.", "tokens": [51442, 294, 264, 4675, 2142, 13, 51494], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 607, "seek": 51284, "start": 2895.44, "end": 2896.6800000000003, "text": " Right now, I think SK Heinex,", "tokens": [51494, 1779, 586, 11, 286, 519, 21483, 634, 533, 87, 11, 51556], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 608, "seek": 51284, "start": 2896.6800000000003, "end": 2897.8, "text": " I saw something earlier today", "tokens": [51556, 286, 1866, 746, 3071, 965, 51612], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 609, "seek": 51284, "start": 2897.8, "end": 2899.8, "text": " that were like booked out until 2030,", "tokens": [51612, 300, 645, 411, 26735, 484, 1826, 28638, 11, 51712], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 610, "seek": 51284, "start": 2899.8, "end": 2902.6, "text": " they expect to have basically like way more", "tokens": [51712, 436, 2066, 281, 362, 1936, 411, 636, 544, 51852], "temperature": 0.0, "avg_logprob": -0.17188859535447248, "compression_ratio": 1.6424050632911393, "no_speech_prob": 0.0006733807967975736}, {"id": 611, "seek": 54260, "start": 2902.6, "end": 2904.92, "text": " demand supply all the way until 2030.", "tokens": [50364, 4733, 5847, 439, 264, 636, 1826, 28638, 13, 50480], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 612, "seek": 54260, "start": 2904.92, "end": 2906.32, "text": " So this is all part of that, right?", "tokens": [50480, 407, 341, 307, 439, 644, 295, 300, 11, 558, 30, 50550], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 613, "seek": 54260, "start": 2906.32, "end": 2907.8, "text": " People are realizing, holy shit,", "tokens": [50550, 3432, 366, 16734, 11, 10622, 4611, 11, 50624], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 614, "seek": 54260, "start": 2907.8, "end": 2909.96, "text": " like memory is a big deal here.", "tokens": [50624, 411, 4675, 307, 257, 955, 2028, 510, 13, 50732], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 615, "seek": 54260, "start": 2909.96, "end": 2912.36, "text": " And Nvidia on the inference side of the things,", "tokens": [50732, 400, 46284, 322, 264, 38253, 1252, 295, 264, 721, 11, 50852], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 616, "seek": 54260, "start": 2912.36, "end": 2914.04, "text": " you know, I think Jensen called himself", "tokens": [50852, 291, 458, 11, 286, 519, 508, 32934, 1219, 3647, 50936], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 617, "seek": 54260, "start": 2914.04, "end": 2915.6, "text": " what the inference king or something", "tokens": [50936, 437, 264, 38253, 4867, 420, 746, 51014], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 618, "seek": 54260, "start": 2915.6, "end": 2916.6, "text": " at these things where they're like,", "tokens": [51014, 412, 613, 721, 689, 436, 434, 411, 11, 51064], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 619, "seek": 54260, "start": 2916.6, "end": 2919.04, "text": " hey, so what's going on with this GROC architecture", "tokens": [51064, 4177, 11, 370, 437, 311, 516, 322, 365, 341, 460, 7142, 34, 9482, 51186], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 620, "seek": 54260, "start": 2919.04, "end": 2921.04, "text": " and he was like, hey, I'm the inference king.", "tokens": [51186, 293, 415, 390, 411, 11, 4177, 11, 286, 478, 264, 38253, 4867, 13, 51286], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 621, "seek": 54260, "start": 2921.04, "end": 2922.64, "text": " So this is part of that, right?", "tokens": [51286, 407, 341, 307, 644, 295, 300, 11, 558, 30, 51366], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 622, "seek": 54260, "start": 2922.64, "end": 2925.08, "text": " The GROC play is an inference play.", "tokens": [51366, 440, 460, 7142, 34, 862, 307, 364, 38253, 862, 13, 51488], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 623, "seek": 54260, "start": 2925.08, "end": 2927.88, "text": " And next going back to Mistral,", "tokens": [51488, 400, 958, 516, 646, 281, 20166, 2155, 11, 51628], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 624, "seek": 54260, "start": 2927.88, "end": 2930.4, "text": " one other aspect of what we have announced", "tokens": [51628, 472, 661, 4171, 295, 437, 321, 362, 7548, 51754], "temperature": 0.0, "avg_logprob": -0.19811783246933276, "compression_ratio": 1.7763157894736843, "no_speech_prob": 0.0010436872253194451}, {"id": 625, "seek": 57040, "start": 2930.4, "end": 2935.24, "text": " is also at GTC, they launched Forge,", "tokens": [50364, 307, 611, 412, 460, 18238, 11, 436, 8730, 1171, 432, 11, 50606], "temperature": 0.0, "avg_logprob": -0.13718706766764324, "compression_ratio": 1.6101694915254237, "no_speech_prob": 0.009860087186098099}, {"id": 626, "seek": 57040, "start": 2935.24, "end": 2938.2, "text": " which is a new offering by them", "tokens": [50606, 597, 307, 257, 777, 8745, 538, 552, 50754], "temperature": 0.0, "avg_logprob": -0.13718706766764324, "compression_ratio": 1.6101694915254237, "no_speech_prob": 0.009860087186098099}, {"id": 627, "seek": 57040, "start": 2938.2, "end": 2943.2, "text": " to let customers, businesses train their own AI models.", "tokens": [50754, 281, 718, 4581, 11, 6011, 3847, 641, 1065, 7318, 5245, 13, 51004], "temperature": 0.0, "avg_logprob": -0.13718706766764324, "compression_ratio": 1.6101694915254237, "no_speech_prob": 0.009860087186098099}, {"id": 628, "seek": 57040, "start": 2943.96, "end": 2947.08, "text": " It sounds like this supports various kinds of training.", "tokens": [51042, 467, 3263, 411, 341, 9346, 3683, 3685, 295, 3097, 13, 51198], "temperature": 0.0, "avg_logprob": -0.13718706766764324, "compression_ratio": 1.6101694915254237, "no_speech_prob": 0.009860087186098099}, {"id": 629, "seek": 57040, "start": 2947.08, "end": 2950.88, "text": " You can pre-train and have an entirely custom model", "tokens": [51198, 509, 393, 659, 12, 83, 7146, 293, 362, 364, 7696, 2375, 2316, 51388], "temperature": 0.0, "avg_logprob": -0.13718706766764324, "compression_ratio": 1.6101694915254237, "no_speech_prob": 0.009860087186098099}, {"id": 630, "seek": 57040, "start": 2950.88, "end": 2953.3199999999997, "text": " from scratch, which isn't something", "tokens": [51388, 490, 8459, 11, 597, 1943, 380, 746, 51510], "temperature": 0.0, "avg_logprob": -0.13718706766764324, "compression_ratio": 1.6101694915254237, "no_speech_prob": 0.009860087186098099}, {"id": 631, "seek": 57040, "start": 2953.3199999999997, "end": 2955.16, "text": " you would want for a lens,", "tokens": [51510, 291, 576, 528, 337, 257, 6765, 11, 51602], "temperature": 0.0, "avg_logprob": -0.13718706766764324, "compression_ratio": 1.6101694915254237, "no_speech_prob": 0.009860087186098099}, {"id": 632, "seek": 57040, "start": 2955.16, "end": 2957.2, "text": " but it sounds like maybe they allow it.", "tokens": [51602, 457, 309, 3263, 411, 1310, 436, 2089, 309, 13, 51704], "temperature": 0.0, "avg_logprob": -0.13718706766764324, "compression_ratio": 1.6101694915254237, "no_speech_prob": 0.009860087186098099}, {"id": 633, "seek": 57040, "start": 2957.2, "end": 2960.0, "text": " They also seem to be offering post-training,", "tokens": [51704, 814, 611, 1643, 281, 312, 8745, 2183, 12, 17227, 1760, 11, 51844], "temperature": 0.0, "avg_logprob": -0.13718706766764324, "compression_ratio": 1.6101694915254237, "no_speech_prob": 0.009860087186098099}, {"id": 634, "seek": 60000, "start": 2960.0, "end": 2961.6, "text": " reinforcement learning,", "tokens": [50364, 29280, 2539, 11, 50444], "temperature": 0.0, "avg_logprob": -0.20045169041706964, "compression_ratio": 1.6008403361344539, "no_speech_prob": 0.0007655269000679255}, {"id": 635, "seek": 60000, "start": 2961.6, "end": 2963.16, "text": " basically optimizing a model", "tokens": [50444, 1936, 40425, 257, 2316, 50522], "temperature": 0.0, "avg_logprob": -0.20045169041706964, "compression_ratio": 1.6008403361344539, "no_speech_prob": 0.0007655269000679255}, {"id": 636, "seek": 60000, "start": 2963.16, "end": 2966.08, "text": " for a given company's needs", "tokens": [50522, 337, 257, 2212, 2237, 311, 2203, 50668], "temperature": 0.0, "avg_logprob": -0.20045169041706964, "compression_ratio": 1.6008403361344539, "no_speech_prob": 0.0007655269000679255}, {"id": 637, "seek": 60000, "start": 2966.08, "end": 2969.6, "text": " with sort of all the stack and training knowledge", "tokens": [50668, 365, 1333, 295, 439, 264, 8630, 293, 3097, 3601, 50844], "temperature": 0.0, "avg_logprob": -0.20045169041706964, "compression_ratio": 1.6008403361344539, "no_speech_prob": 0.0007655269000679255}, {"id": 638, "seek": 60000, "start": 2969.6, "end": 2971.08, "text": " and inference and so on,", "tokens": [50844, 293, 38253, 293, 370, 322, 11, 50918], "temperature": 0.0, "avg_logprob": -0.20045169041706964, "compression_ratio": 1.6008403361344539, "no_speech_prob": 0.0007655269000679255}, {"id": 639, "seek": 60000, "start": 2971.08, "end": 2973.8, "text": " that is pretty kind of complex", "tokens": [50918, 300, 307, 1238, 733, 295, 3997, 51054], "temperature": 0.0, "avg_logprob": -0.20045169041706964, "compression_ratio": 1.6008403361344539, "no_speech_prob": 0.0007655269000679255}, {"id": 640, "seek": 60000, "start": 2973.8, "end": 2978.48, "text": " and is what Mistral and OpenAI and ProPick focus on.", "tokens": [51054, 293, 307, 437, 20166, 2155, 293, 7238, 48698, 293, 1705, 47, 618, 1879, 322, 13, 51288], "temperature": 0.0, "avg_logprob": -0.20045169041706964, "compression_ratio": 1.6008403361344539, "no_speech_prob": 0.0007655269000679255}, {"id": 641, "seek": 60000, "start": 2978.48, "end": 2981.56, "text": " A lot is just the training infrastructure", "tokens": [51288, 316, 688, 307, 445, 264, 3097, 6896, 51442], "temperature": 0.0, "avg_logprob": -0.20045169041706964, "compression_ratio": 1.6008403361344539, "no_speech_prob": 0.0007655269000679255}, {"id": 642, "seek": 60000, "start": 2981.56, "end": 2983.92, "text": " and setup and all of that.", "tokens": [51442, 293, 8657, 293, 439, 295, 300, 13, 51560], "temperature": 0.0, "avg_logprob": -0.20045169041706964, "compression_ratio": 1.6008403361344539, "no_speech_prob": 0.0007655269000679255}, {"id": 643, "seek": 60000, "start": 2983.92, "end": 2986.32, "text": " So they pitch it as, you know,", "tokens": [51560, 407, 436, 7293, 309, 382, 11, 291, 458, 11, 51680], "temperature": 0.0, "avg_logprob": -0.20045169041706964, "compression_ratio": 1.6008403361344539, "no_speech_prob": 0.0007655269000679255}, {"id": 644, "seek": 60000, "start": 2986.32, "end": 2989.76, "text": " you need the models to have your internal", "tokens": [51680, 291, 643, 264, 5245, 281, 362, 428, 6920, 51852], "temperature": 0.0, "avg_logprob": -0.20045169041706964, "compression_ratio": 1.6008403361344539, "no_speech_prob": 0.0007655269000679255}, {"id": 645, "seek": 62976, "start": 2989.76, "end": 2992.16, "text": " knowledge and information built in,", "tokens": [50364, 3601, 293, 1589, 3094, 294, 11, 50484], "temperature": 0.0, "avg_logprob": -0.17448855802416802, "compression_ratio": 1.6506550218340612, "no_speech_prob": 0.0004284574824851006}, {"id": 646, "seek": 62976, "start": 2992.16, "end": 2994.24, "text": " you can optimize it for your needs.", "tokens": [50484, 291, 393, 19719, 309, 337, 428, 2203, 13, 50588], "temperature": 0.0, "avg_logprob": -0.17448855802416802, "compression_ratio": 1.6506550218340612, "no_speech_prob": 0.0004284574824851006}, {"id": 647, "seek": 62976, "start": 2994.24, "end": 2997.0, "text": " I think this is an interesting offer", "tokens": [50588, 286, 519, 341, 307, 364, 1880, 2626, 50726], "temperature": 0.0, "avg_logprob": -0.17448855802416802, "compression_ratio": 1.6506550218340612, "no_speech_prob": 0.0004284574824851006}, {"id": 648, "seek": 62976, "start": 2997.0, "end": 3000.44, "text": " and potential here with open source models", "tokens": [50726, 293, 3995, 510, 365, 1269, 4009, 5245, 50898], "temperature": 0.0, "avg_logprob": -0.17448855802416802, "compression_ratio": 1.6506550218340612, "no_speech_prob": 0.0004284574824851006}, {"id": 649, "seek": 62976, "start": 3000.44, "end": 3003.04, "text": " haven't started to get good,", "tokens": [50898, 2378, 380, 1409, 281, 483, 665, 11, 51028], "temperature": 0.0, "avg_logprob": -0.17448855802416802, "compression_ratio": 1.6506550218340612, "no_speech_prob": 0.0004284574824851006}, {"id": 650, "seek": 62976, "start": 3003.04, "end": 3005.7200000000003, "text": " with especially Clan models, Mistral models,", "tokens": [51028, 365, 2318, 45117, 5245, 11, 20166, 2155, 5245, 11, 51162], "temperature": 0.0, "avg_logprob": -0.17448855802416802, "compression_ratio": 1.6506550218340612, "no_speech_prob": 0.0004284574824851006}, {"id": 651, "seek": 62976, "start": 3005.7200000000003, "end": 3009.4, "text": " to some extent, it is a potentially good bet", "tokens": [51162, 281, 512, 8396, 11, 309, 307, 257, 7263, 665, 778, 51346], "temperature": 0.0, "avg_logprob": -0.17448855802416802, "compression_ratio": 1.6506550218340612, "no_speech_prob": 0.0004284574824851006}, {"id": 652, "seek": 62976, "start": 3009.4, "end": 3012.04, "text": " that as open source models get better,", "tokens": [51346, 300, 382, 1269, 4009, 5245, 483, 1101, 11, 51478], "temperature": 0.0, "avg_logprob": -0.17448855802416802, "compression_ratio": 1.6506550218340612, "no_speech_prob": 0.0004284574824851006}, {"id": 653, "seek": 62976, "start": 3012.04, "end": 3016.12, "text": " you would see more people fine-tuning", "tokens": [51478, 291, 576, 536, 544, 561, 2489, 12, 83, 37726, 51682], "temperature": 0.0, "avg_logprob": -0.17448855802416802, "compression_ratio": 1.6506550218340612, "no_speech_prob": 0.0004284574824851006}, {"id": 654, "seek": 62976, "start": 3016.12, "end": 3017.92, "text": " their own models post-training", "tokens": [51682, 641, 1065, 5245, 2183, 12, 17227, 1760, 51772], "temperature": 0.0, "avg_logprob": -0.17448855802416802, "compression_ratio": 1.6506550218340612, "no_speech_prob": 0.0004284574824851006}, {"id": 655, "seek": 65792, "start": 3017.92, "end": 3021.68, "text": " and reinforcement learning for their own agents", "tokens": [50364, 293, 29280, 2539, 337, 641, 1065, 12554, 50552], "temperature": 0.0, "avg_logprob": -0.13990101102486396, "compression_ratio": 1.646808510638298, "no_speech_prob": 0.0027282771188765764}, {"id": 656, "seek": 65792, "start": 3021.68, "end": 3023.56, "text": " and their own flows,", "tokens": [50552, 293, 641, 1065, 12867, 11, 50646], "temperature": 0.0, "avg_logprob": -0.13990101102486396, "compression_ratio": 1.646808510638298, "no_speech_prob": 0.0027282771188765764}, {"id": 657, "seek": 65792, "start": 3023.56, "end": 3026.0, "text": " which ultimately that's the best way", "tokens": [50646, 597, 6284, 300, 311, 264, 1151, 636, 50768], "temperature": 0.0, "avg_logprob": -0.13990101102486396, "compression_ratio": 1.646808510638298, "no_speech_prob": 0.0027282771188765764}, {"id": 658, "seek": 65792, "start": 3026.0, "end": 3028.88, "text": " to unlock performance is training.", "tokens": [50768, 281, 11634, 3389, 307, 3097, 13, 50912], "temperature": 0.0, "avg_logprob": -0.13990101102486396, "compression_ratio": 1.646808510638298, "no_speech_prob": 0.0027282771188765764}, {"id": 659, "seek": 65792, "start": 3028.88, "end": 3030.96, "text": " You can do, you know, prompted engineering", "tokens": [50912, 509, 393, 360, 11, 291, 458, 11, 31042, 7043, 51016], "temperature": 0.0, "avg_logprob": -0.13990101102486396, "compression_ratio": 1.646808510638298, "no_speech_prob": 0.0027282771188765764}, {"id": 660, "seek": 65792, "start": 3030.96, "end": 3032.16, "text": " and you can do whatever,", "tokens": [51016, 293, 291, 393, 360, 2035, 11, 51076], "temperature": 0.0, "avg_logprob": -0.13990101102486396, "compression_ratio": 1.646808510638298, "no_speech_prob": 0.0027282771188765764}, {"id": 661, "seek": 65792, "start": 3032.16, "end": 3034.96, "text": " but training on data is key.", "tokens": [51076, 457, 3097, 322, 1412, 307, 2141, 13, 51216], "temperature": 0.0, "avg_logprob": -0.13990101102486396, "compression_ratio": 1.646808510638298, "no_speech_prob": 0.0027282771188765764}, {"id": 662, "seek": 65792, "start": 3034.96, "end": 3036.56, "text": " So they just announced it,", "tokens": [51216, 407, 436, 445, 7548, 309, 11, 51296], "temperature": 0.0, "avg_logprob": -0.13990101102486396, "compression_ratio": 1.646808510638298, "no_speech_prob": 0.0027282771188765764}, {"id": 663, "seek": 65792, "start": 3036.56, "end": 3039.36, "text": " you need to sign up to get more details,", "tokens": [51296, 291, 643, 281, 1465, 493, 281, 483, 544, 4365, 11, 51436], "temperature": 0.0, "avg_logprob": -0.13990101102486396, "compression_ratio": 1.646808510638298, "no_speech_prob": 0.0027282771188765764}, {"id": 664, "seek": 65792, "start": 3039.36, "end": 3041.88, "text": " the details aren't super precise yet,", "tokens": [51436, 264, 4365, 3212, 380, 1687, 13600, 1939, 11, 51562], "temperature": 0.0, "avg_logprob": -0.13990101102486396, "compression_ratio": 1.646808510638298, "no_speech_prob": 0.0027282771188765764}, {"id": 665, "seek": 65792, "start": 3041.88, "end": 3044.48, "text": " but it appears that they are shifting focus", "tokens": [51562, 457, 309, 7038, 300, 436, 366, 17573, 1879, 51692], "temperature": 0.0, "avg_logprob": -0.13990101102486396, "compression_ratio": 1.646808510638298, "no_speech_prob": 0.0027282771188765764}, {"id": 666, "seek": 68448, "start": 3044.48, "end": 3048.4, "text": " or at least honing in on this specific strategy", "tokens": [50364, 420, 412, 1935, 2157, 278, 294, 322, 341, 2685, 5206, 50560], "temperature": 0.0, "avg_logprob": -0.19112323390112984, "compression_ratio": 1.6563573883161513, "no_speech_prob": 0.11039875447750092}, {"id": 667, "seek": 68448, "start": 3048.4, "end": 3052.24, "text": " to try and differentiate and compete in this space.", "tokens": [50560, 281, 853, 293, 23203, 293, 11831, 294, 341, 1901, 13, 50752], "temperature": 0.0, "avg_logprob": -0.19112323390112984, "compression_ratio": 1.6563573883161513, "no_speech_prob": 0.11039875447750092}, {"id": 668, "seek": 68448, "start": 3052.24, "end": 3054.88, "text": " Yeah, I'd like to be upfront about my biases here.", "tokens": [50752, 865, 11, 286, 1116, 411, 281, 312, 30264, 466, 452, 32152, 510, 13, 50884], "temperature": 0.0, "avg_logprob": -0.19112323390112984, "compression_ratio": 1.6563573883161513, "no_speech_prob": 0.11039875447750092}, {"id": 669, "seek": 68448, "start": 3054.88, "end": 3056.8, "text": " Everyone, if you listen to the podcast a lot,", "tokens": [50884, 5198, 11, 498, 291, 2140, 281, 264, 7367, 257, 688, 11, 50980], "temperature": 0.0, "avg_logprob": -0.19112323390112984, "compression_ratio": 1.6563573883161513, "no_speech_prob": 0.11039875447750092}, {"id": 670, "seek": 68448, "start": 3056.8, "end": 3059.7200000000003, "text": " you know I'm pretty bearish on Mistral and Coheer", "tokens": [50980, 291, 458, 286, 478, 1238, 6155, 742, 322, 20166, 2155, 293, 3066, 675, 260, 51126], "temperature": 0.0, "avg_logprob": -0.19112323390112984, "compression_ratio": 1.6563573883161513, "no_speech_prob": 0.11039875447750092}, {"id": 671, "seek": 68448, "start": 3059.7200000000003, "end": 3063.12, "text": " and those sorts of run-up to the frontier companies", "tokens": [51126, 293, 729, 7527, 295, 1190, 12, 1010, 281, 264, 1868, 811, 3431, 51296], "temperature": 0.0, "avg_logprob": -0.19112323390112984, "compression_ratio": 1.6563573883161513, "no_speech_prob": 0.11039875447750092}, {"id": 672, "seek": 68448, "start": 3063.12, "end": 3064.36, "text": " for a whole bunch of reasons,", "tokens": [51296, 337, 257, 1379, 3840, 295, 4112, 11, 51358], "temperature": 0.0, "avg_logprob": -0.19112323390112984, "compression_ratio": 1.6563573883161513, "no_speech_prob": 0.11039875447750092}, {"id": 673, "seek": 68448, "start": 3064.36, "end": 3067.76, "text": " but this is not making me any more excited", "tokens": [51358, 457, 341, 307, 406, 1455, 385, 604, 544, 2919, 51528], "temperature": 0.0, "avg_logprob": -0.19112323390112984, "compression_ratio": 1.6563573883161513, "no_speech_prob": 0.11039875447750092}, {"id": 674, "seek": 68448, "start": 3067.76, "end": 3069.36, "text": " about this direction.", "tokens": [51528, 466, 341, 3513, 13, 51608], "temperature": 0.0, "avg_logprob": -0.19112323390112984, "compression_ratio": 1.6563573883161513, "no_speech_prob": 0.11039875447750092}, {"id": 675, "seek": 68448, "start": 3069.36, "end": 3070.7200000000003, "text": " The reason is, so first of all,", "tokens": [51608, 440, 1778, 307, 11, 370, 700, 295, 439, 11, 51676], "temperature": 0.0, "avg_logprob": -0.19112323390112984, "compression_ratio": 1.6563573883161513, "no_speech_prob": 0.11039875447750092}, {"id": 676, "seek": 68448, "start": 3070.7200000000003, "end": 3073.12, "text": " this position's done basically as competitors to Coheer.", "tokens": [51676, 341, 2535, 311, 1096, 1936, 382, 18333, 281, 3066, 675, 260, 13, 51796], "temperature": 0.0, "avg_logprob": -0.19112323390112984, "compression_ratio": 1.6563573883161513, "no_speech_prob": 0.11039875447750092}, {"id": 677, "seek": 71312, "start": 3073.12, "end": 3074.52, "text": " This is the Coheer play, right?", "tokens": [50364, 639, 307, 264, 3066, 675, 260, 862, 11, 558, 30, 50434], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 678, "seek": 71312, "start": 3074.52, "end": 3075.36, "text": " This is it.", "tokens": [50434, 639, 307, 309, 13, 50476], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 679, "seek": 71312, "start": 3075.36, "end": 3077.08, "text": " You're basically saying enterprises,", "tokens": [50476, 509, 434, 1936, 1566, 29034, 11, 50562], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 680, "seek": 71312, "start": 3077.08, "end": 3079.2, "text": " like we're just gonna be better at enterprise integration.", "tokens": [50562, 411, 321, 434, 445, 799, 312, 1101, 412, 14132, 10980, 13, 50668], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 681, "seek": 71312, "start": 3079.2, "end": 3081.24, "text": " We're gonna make a stack for the enterprise.", "tokens": [50668, 492, 434, 799, 652, 257, 8630, 337, 264, 14132, 13, 50770], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 682, "seek": 71312, "start": 3081.24, "end": 3083.6, "text": " The difference is Mistral is a few miles", "tokens": [50770, 440, 2649, 307, 20166, 2155, 307, 257, 1326, 6193, 50888], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 683, "seek": 71312, "start": 3083.6, "end": 3085.24, "text": " behind Coheer right now, right?", "tokens": [50888, 2261, 3066, 675, 260, 558, 586, 11, 558, 30, 50970], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 684, "seek": 71312, "start": 3085.24, "end": 3087.32, "text": " Coheer has been working on this for a long time.", "tokens": [50970, 3066, 675, 260, 575, 668, 1364, 322, 341, 337, 257, 938, 565, 13, 51074], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 685, "seek": 71312, "start": 3087.32, "end": 3088.68, "text": " They hit like a quarter billion dollars", "tokens": [51074, 814, 2045, 411, 257, 6555, 5218, 3808, 51142], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 686, "seek": 71312, "start": 3088.68, "end": 3090.6, "text": " in recurring revenue in 2025.", "tokens": [51142, 294, 32279, 9324, 294, 39209, 13, 51238], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 687, "seek": 71312, "start": 3090.6, "end": 3092.56, "text": " It was some decent quarter on quarter growth,", "tokens": [51238, 467, 390, 512, 8681, 6555, 322, 6555, 4599, 11, 51336], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 688, "seek": 71312, "start": 3092.56, "end": 3094.92, "text": " though I think ultimately they're gonna struggle", "tokens": [51336, 1673, 286, 519, 6284, 436, 434, 799, 7799, 51454], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 689, "seek": 71312, "start": 3094.92, "end": 3096.92, "text": " to kind of be competitive in the world", "tokens": [51454, 281, 733, 295, 312, 10043, 294, 264, 1002, 51554], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 690, "seek": 71312, "start": 3096.92, "end": 3098.6, "text": " of like big infrastructure plays here,", "tokens": [51554, 295, 411, 955, 6896, 5749, 510, 11, 51638], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 691, "seek": 71312, "start": 3098.6, "end": 3101.08, "text": " but that's kind of fundamentally one of the challenges.", "tokens": [51638, 457, 300, 311, 733, 295, 17879, 472, 295, 264, 4759, 13, 51762], "temperature": 0.0, "avg_logprob": -0.1420658603310585, "compression_ratio": 1.704225352112676, "no_speech_prob": 0.003624165430665016}, {"id": 692, "seek": 74108, "start": 3101.08, "end": 3103.48, "text": " The other one is the companies that have", "tokens": [50364, 440, 661, 472, 307, 264, 3431, 300, 362, 50484], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 693, "seek": 74108, "start": 3103.48, "end": 3106.08, "text": " an enduring advantage on infrastructure,", "tokens": [50484, 364, 36562, 5002, 322, 6896, 11, 50614], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 694, "seek": 74108, "start": 3106.08, "end": 3107.6, "text": " the anthropics of the world, the Googles,", "tokens": [50614, 264, 22727, 1167, 295, 264, 1002, 11, 264, 45005, 904, 11, 50690], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 695, "seek": 74108, "start": 3107.6, "end": 3108.96, "text": " the opening eyes of the world,", "tokens": [50690, 264, 5193, 2575, 295, 264, 1002, 11, 50758], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 696, "seek": 74108, "start": 3108.96, "end": 3111.36, "text": " are ultimately, if they ever want to,", "tokens": [50758, 366, 6284, 11, 498, 436, 1562, 528, 281, 11, 50878], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 697, "seek": 74108, "start": 3111.36, "end": 3114.44, "text": " in a position to just eat Mistral and Coheer's lunch.", "tokens": [50878, 294, 257, 2535, 281, 445, 1862, 20166, 2155, 293, 3066, 675, 260, 311, 6349, 13, 51032], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 698, "seek": 74108, "start": 3114.44, "end": 3115.7200000000003, "text": " They can just go in and one day say,", "tokens": [51032, 814, 393, 445, 352, 294, 293, 472, 786, 584, 11, 51096], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 699, "seek": 74108, "start": 3115.7200000000003, "end": 3118.44, "text": " hey, you know what, we decided that we really care about this.", "tokens": [51096, 4177, 11, 291, 458, 437, 11, 321, 3047, 300, 321, 534, 1127, 466, 341, 13, 51232], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 700, "seek": 74108, "start": 3118.44, "end": 3120.08, "text": " We're gonna make special models,", "tokens": [51232, 492, 434, 799, 652, 2121, 5245, 11, 51314], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 701, "seek": 74108, "start": 3120.08, "end": 3122.76, "text": " maybe distill versions of our actual frontier models,", "tokens": [51314, 1310, 42923, 9606, 295, 527, 3539, 35853, 5245, 11, 51448], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 702, "seek": 74108, "start": 3122.76, "end": 3124.08, "text": " whatever to run locally on your thing,", "tokens": [51448, 2035, 281, 1190, 16143, 322, 428, 551, 11, 51514], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 703, "seek": 74108, "start": 3124.08, "end": 3125.08, "text": " or we're gonna create new things.", "tokens": [51514, 420, 321, 434, 799, 1884, 777, 721, 13, 51564], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 704, "seek": 74108, "start": 3125.08, "end": 3127.8, "text": " And opening a has in the past offered,", "tokens": [51564, 400, 5193, 257, 575, 294, 264, 1791, 8059, 11, 51700], "temperature": 0.0, "avg_logprob": -0.264997784892718, "compression_ratio": 1.6978193146417446, "no_speech_prob": 0.001387892640195787}, {"id": 705, "seek": 76780, "start": 3127.8, "end": 3131.7200000000003, "text": " a fine tuning of the finger GPD 4.1.", "tokens": [50364, 257, 2489, 15164, 295, 264, 5984, 460, 17349, 1017, 13, 16, 13, 50560], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 706, "seek": 76780, "start": 3131.7200000000003, "end": 3133.92, "text": " They don't allow to have the latest set of models,", "tokens": [50560, 814, 500, 380, 2089, 281, 362, 264, 6792, 992, 295, 5245, 11, 50670], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 707, "seek": 76780, "start": 3133.92, "end": 3136.12, "text": " but it used to be something to offer.", "tokens": [50670, 457, 309, 1143, 281, 312, 746, 281, 2626, 13, 50780], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 708, "seek": 76780, "start": 3136.12, "end": 3136.96, "text": " Yeah, exactly.", "tokens": [50780, 865, 11, 2293, 13, 50822], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 709, "seek": 76780, "start": 3136.96, "end": 3138.08, "text": " And so ultimately, I mean,", "tokens": [50822, 400, 370, 6284, 11, 286, 914, 11, 50878], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 710, "seek": 76780, "start": 3138.08, "end": 3140.6800000000003, "text": " this is the challenges you are going to end up competing", "tokens": [50878, 341, 307, 264, 4759, 291, 366, 516, 281, 917, 493, 15439, 51008], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 711, "seek": 76780, "start": 3140.6800000000003, "end": 3143.36, "text": " with them and the question will at some point be,", "tokens": [51008, 365, 552, 293, 264, 1168, 486, 412, 512, 935, 312, 11, 51142], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 712, "seek": 76780, "start": 3143.36, "end": 3145.44, "text": " because we know this is where margin is made,", "tokens": [51142, 570, 321, 458, 341, 307, 689, 10270, 307, 1027, 11, 51246], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 713, "seek": 76780, "start": 3145.44, "end": 3148.4, "text": " quality versus quality at the frontier of capabilities.", "tokens": [51246, 3125, 5717, 3125, 412, 264, 35853, 295, 10862, 13, 51394], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 714, "seek": 76780, "start": 3148.4, "end": 3150.7200000000003, "text": " If that ends up being the case, I mean, damn,", "tokens": [51394, 759, 300, 5314, 493, 885, 264, 1389, 11, 286, 914, 11, 8151, 11, 51510], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 715, "seek": 76780, "start": 3150.7200000000003, "end": 3151.92, "text": " like this is an uphill battle,", "tokens": [51510, 411, 341, 307, 364, 39132, 4635, 11, 51570], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 716, "seek": 76780, "start": 3151.92, "end": 3154.44, "text": " because you're actually gonna be like opening eye", "tokens": [51570, 570, 291, 434, 767, 799, 312, 411, 5193, 3313, 51696], "temperature": 0.0, "avg_logprob": -0.2504000878759793, "compression_ratio": 1.6331168831168832, "no_speech_prob": 0.001549702719785273}, {"id": 717, "seek": 79444, "start": 3154.6, "end": 3158.16, "text": " and anthropic get to amortize the massive cost", "tokens": [50372, 293, 22727, 299, 483, 281, 669, 477, 1125, 264, 5994, 2063, 50550], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 718, "seek": 79444, "start": 3158.16, "end": 3161.16, "text": " of training frontier models across every inference run", "tokens": [50550, 295, 3097, 35853, 5245, 2108, 633, 38253, 1190, 50700], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 719, "seek": 79444, "start": 3161.16, "end": 3163.28, "text": " that gets done on their infrastructure.", "tokens": [50700, 300, 2170, 1096, 322, 641, 6896, 13, 50806], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 720, "seek": 79444, "start": 3163.28, "end": 3165.92, "text": " And if they're, you know, doing distillates of those models", "tokens": [50806, 400, 498, 436, 434, 11, 291, 458, 11, 884, 42923, 1024, 295, 729, 5245, 50938], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 721, "seek": 79444, "start": 3165.92, "end": 3167.92, "text": " to serve at the enterprise, the same thing apply to like,", "tokens": [50938, 281, 4596, 412, 264, 14132, 11, 264, 912, 551, 3079, 281, 411, 11, 51038], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 722, "seek": 79444, "start": 3167.92, "end": 3170.84, "text": " the economics look pretty bad to me for stuff like this.", "tokens": [51038, 264, 14564, 574, 1238, 1578, 281, 385, 337, 1507, 411, 341, 13, 51184], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 723, "seek": 79444, "start": 3170.84, "end": 3172.48, "text": " But ultimately, you know, we'll see,", "tokens": [51184, 583, 6284, 11, 291, 458, 11, 321, 603, 536, 11, 51266], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 724, "seek": 79444, "start": 3172.48, "end": 3173.4, "text": " this is also by the way,", "tokens": [51266, 341, 307, 611, 538, 264, 636, 11, 51312], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 725, "seek": 79444, "start": 3173.4, "end": 3176.56, "text": " a mistrial kind of playing into their European roots a bit.", "tokens": [51312, 257, 3544, 7111, 733, 295, 2433, 666, 641, 6473, 10669, 257, 857, 13, 51470], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 726, "seek": 79444, "start": 3176.56, "end": 3179.56, "text": " So co here has a sort of Canadian footprint.", "tokens": [51470, 407, 598, 510, 575, 257, 1333, 295, 12641, 24222, 13, 51620], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 727, "seek": 79444, "start": 3179.56, "end": 3182.0, "text": " They've got big deal EU ambitions,", "tokens": [51620, 814, 600, 658, 955, 2028, 10887, 34475, 11, 51742], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 728, "seek": 79444, "start": 3182.0, "end": 3184.2, "text": " but mistrial is through the French champion.", "tokens": [51742, 457, 3544, 7111, 307, 807, 264, 5522, 10971, 13, 51852], "temperature": 0.0, "avg_logprob": -0.19076736764730634, "compression_ratio": 1.641399416909621, "no_speech_prob": 0.008461395278573036}, {"id": 729, "seek": 82420, "start": 3184.2, "end": 3186.48, "text": " And they've very much been seen as the sovereign champion.", "tokens": [50364, 400, 436, 600, 588, 709, 668, 1612, 382, 264, 28756, 10971, 13, 50478], "temperature": 0.0, "avg_logprob": -0.16008812697633865, "compression_ratio": 1.636963696369637, "no_speech_prob": 0.00016611248429398984}, {"id": 730, "seek": 82420, "start": 3186.48, "end": 3188.8, "text": " So data sovereignty, model sovereignty,", "tokens": [50478, 407, 1412, 27862, 11, 2316, 27862, 11, 50594], "temperature": 0.0, "avg_logprob": -0.16008812697633865, "compression_ratio": 1.636963696369637, "no_speech_prob": 0.00016611248429398984}, {"id": 731, "seek": 82420, "start": 3188.8, "end": 3190.56, "text": " you'll hear them say a lot of that.", "tokens": [50594, 291, 603, 1568, 552, 584, 257, 688, 295, 300, 13, 50682], "temperature": 0.0, "avg_logprob": -0.16008812697633865, "compression_ratio": 1.636963696369637, "no_speech_prob": 0.00016611248429398984}, {"id": 732, "seek": 82420, "start": 3190.56, "end": 3193.24, "text": " Ultimately, from a free market standpoint,", "tokens": [50682, 23921, 11, 490, 257, 1737, 2142, 15827, 11, 50816], "temperature": 0.0, "avg_logprob": -0.16008812697633865, "compression_ratio": 1.636963696369637, "no_speech_prob": 0.00016611248429398984}, {"id": 733, "seek": 82420, "start": 3193.24, "end": 3196.52, "text": " that's basically just an appeal to like kind of European", "tokens": [50816, 300, 311, 1936, 445, 364, 13668, 281, 411, 733, 295, 6473, 50980], "temperature": 0.0, "avg_logprob": -0.16008812697633865, "compression_ratio": 1.636963696369637, "no_speech_prob": 0.00016611248429398984}, {"id": 734, "seek": 82420, "start": 3196.52, "end": 3199.08, "text": " or French nationalism to kind of help float them", "tokens": [50980, 420, 5522, 39186, 281, 733, 295, 854, 15706, 552, 51108], "temperature": 0.0, "avg_logprob": -0.16008812697633865, "compression_ratio": 1.636963696369637, "no_speech_prob": 0.00016611248429398984}, {"id": 735, "seek": 82420, "start": 3199.08, "end": 3203.28, "text": " on this otherwise potentially market challenge play.", "tokens": [51108, 322, 341, 5911, 7263, 2142, 3430, 862, 13, 51318], "temperature": 0.0, "avg_logprob": -0.16008812697633865, "compression_ratio": 1.636963696369637, "no_speech_prob": 0.00016611248429398984}, {"id": 736, "seek": 82420, "start": 3203.28, "end": 3206.36, "text": " I think, yeah, maybe I would say you're a little bit", "tokens": [51318, 286, 519, 11, 1338, 11, 1310, 286, 576, 584, 291, 434, 257, 707, 857, 51472], "temperature": 0.0, "avg_logprob": -0.16008812697633865, "compression_ratio": 1.636963696369637, "no_speech_prob": 0.00016611248429398984}, {"id": 737, "seek": 82420, "start": 3206.36, "end": 3207.92, "text": " caricaturing them,", "tokens": [51472, 45732, 267, 1345, 552, 11, 51550], "temperature": 0.0, "avg_logprob": -0.16008812697633865, "compression_ratio": 1.636963696369637, "no_speech_prob": 0.00016611248429398984}, {"id": 738, "seek": 82420, "start": 3207.92, "end": 3211.08, "text": " they position it more as control over IP.", "tokens": [51550, 436, 2535, 309, 544, 382, 1969, 670, 8671, 13, 51708], "temperature": 0.0, "avg_logprob": -0.16008812697633865, "compression_ratio": 1.636963696369637, "no_speech_prob": 0.00016611248429398984}, {"id": 739, "seek": 82420, "start": 3211.08, "end": 3213.48, "text": " And this is something that people care about.", "tokens": [51708, 400, 341, 307, 746, 300, 561, 1127, 466, 13, 51828], "temperature": 0.0, "avg_logprob": -0.16008812697633865, "compression_ratio": 1.636963696369637, "no_speech_prob": 0.00016611248429398984}, {"id": 740, "seek": 85348, "start": 3213.48, "end": 3214.8, "text": " I totally agree that's the play.", "tokens": [50364, 286, 3879, 3986, 300, 311, 264, 862, 13, 50430], "temperature": 0.0, "avg_logprob": -0.1968483828071855, "compression_ratio": 1.8442622950819672, "no_speech_prob": 0.000522047164849937}, {"id": 741, "seek": 85348, "start": 3214.8, "end": 3215.84, "text": " That's an identical play though,", "tokens": [50430, 663, 311, 364, 14800, 862, 1673, 11, 50482], "temperature": 0.0, "avg_logprob": -0.1968483828071855, "compression_ratio": 1.8442622950819672, "no_speech_prob": 0.000522047164849937}, {"id": 742, "seek": 85348, "start": 3215.84, "end": 3217.6, "text": " what I'm saying is to co here.", "tokens": [50482, 437, 286, 478, 1566, 307, 281, 598, 510, 13, 50570], "temperature": 0.0, "avg_logprob": -0.1968483828071855, "compression_ratio": 1.8442622950819672, "no_speech_prob": 0.000522047164849937}, {"id": 743, "seek": 85348, "start": 3217.6, "end": 3220.2799999999997, "text": " And it's and on that note, co here,", "tokens": [50570, 400, 309, 311, 293, 322, 300, 3637, 11, 598, 510, 11, 50704], "temperature": 0.0, "avg_logprob": -0.1968483828071855, "compression_ratio": 1.8442622950819672, "no_speech_prob": 0.000522047164849937}, {"id": 744, "seek": 85348, "start": 3220.2799999999997, "end": 3223.7200000000003, "text": " their play or largely what they have focused on", "tokens": [50704, 641, 862, 420, 11611, 437, 436, 362, 5178, 322, 50876], "temperature": 0.0, "avg_logprob": -0.1968483828071855, "compression_ratio": 1.8442622950819672, "no_speech_prob": 0.000522047164849937}, {"id": 745, "seek": 85348, "start": 3223.7200000000003, "end": 3227.16, "text": " is releasing models that they have trained", "tokens": [50876, 307, 16327, 5245, 300, 436, 362, 8895, 51048], "temperature": 0.0, "avg_logprob": -0.1968483828071855, "compression_ratio": 1.8442622950819672, "no_speech_prob": 0.000522047164849937}, {"id": 746, "seek": 85348, "start": 3227.16, "end": 3228.52, "text": " with the focus of enterprise.", "tokens": [51048, 365, 264, 1879, 295, 14132, 13, 51116], "temperature": 0.0, "avg_logprob": -0.1968483828071855, "compression_ratio": 1.8442622950819672, "no_speech_prob": 0.000522047164849937}, {"id": 747, "seek": 85348, "start": 3228.52, "end": 3230.52, "text": " So they release rag models, for instance,", "tokens": [51116, 407, 436, 4374, 17539, 5245, 11, 337, 5197, 11, 51216], "temperature": 0.0, "avg_logprob": -0.1968483828071855, "compression_ratio": 1.8442622950819672, "no_speech_prob": 0.000522047164849937}, {"id": 748, "seek": 85348, "start": 3230.52, "end": 3233.4, "text": " that they say are very good for enterprise use cases.", "tokens": [51216, 300, 436, 584, 366, 588, 665, 337, 14132, 764, 3331, 13, 51360], "temperature": 0.0, "avg_logprob": -0.1968483828071855, "compression_ratio": 1.8442622950819672, "no_speech_prob": 0.000522047164849937}, {"id": 749, "seek": 85348, "start": 3233.4, "end": 3236.96, "text": " They largely have position themselves", "tokens": [51360, 814, 11611, 362, 2535, 2969, 51538], "temperature": 0.0, "avg_logprob": -0.1968483828071855, "compression_ratio": 1.8442622950819672, "no_speech_prob": 0.000522047164849937}, {"id": 750, "seek": 85348, "start": 3236.96, "end": 3241.08, "text": " as having models and offerings that are useful for enterprise.", "tokens": [51538, 382, 1419, 5245, 293, 25898, 300, 366, 4420, 337, 14132, 13, 51744], "temperature": 0.0, "avg_logprob": -0.1968483828071855, "compression_ratio": 1.8442622950819672, "no_speech_prob": 0.000522047164849937}, {"id": 751, "seek": 88108, "start": 3241.08, "end": 3246.08, "text": " They do have a thing for building customized AI solutions.", "tokens": [50364, 814, 360, 362, 257, 551, 337, 2390, 30581, 7318, 6547, 13, 50614], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 752, "seek": 88108, "start": 3246.88, "end": 3249.36, "text": " I don't know how much of it is apples to apples", "tokens": [50654, 286, 500, 380, 458, 577, 709, 295, 309, 307, 16814, 281, 16814, 50778], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 753, "seek": 88108, "start": 3249.36, "end": 3251.64, "text": " with this new forge, whatever it is.", "tokens": [50778, 365, 341, 777, 38741, 11, 2035, 309, 307, 13, 50892], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 754, "seek": 88108, "start": 3251.64, "end": 3252.48, "text": " That's kind of the thing, right?", "tokens": [50892, 663, 311, 733, 295, 264, 551, 11, 558, 30, 50934], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 755, "seek": 88108, "start": 3252.48, "end": 3254.56, "text": " So yeah, co here's thing is called North.", "tokens": [50934, 407, 1338, 11, 598, 510, 311, 551, 307, 1219, 4067, 13, 51038], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 756, "seek": 88108, "start": 3254.56, "end": 3256.52, "text": " It's this whole enterprise platform, right?", "tokens": [51038, 467, 311, 341, 1379, 14132, 3663, 11, 558, 30, 51136], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 757, "seek": 88108, "start": 3256.52, "end": 3258.56, "text": " It's it's for AI agents and workflows", "tokens": [51136, 467, 311, 309, 311, 337, 7318, 12554, 293, 43461, 51238], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 758, "seek": 88108, "start": 3258.56, "end": 3260.92, "text": " and then they've got a bunch of like re-rank and bed", "tokens": [51238, 293, 550, 436, 600, 658, 257, 3840, 295, 411, 319, 12, 20479, 293, 2901, 51356], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 759, "seek": 88108, "start": 3260.92, "end": 3262.8, "text": " and a bunch of other things for rag", "tokens": [51356, 293, 257, 3840, 295, 661, 721, 337, 17539, 51450], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 760, "seek": 88108, "start": 3262.8, "end": 3265.08, "text": " that are meant to be enterprise kind of correlated.", "tokens": [51450, 300, 366, 4140, 281, 312, 14132, 733, 295, 38574, 13, 51564], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 761, "seek": 88108, "start": 3265.08, "end": 3266.56, "text": " But that's kind of the challenge.", "tokens": [51564, 583, 300, 311, 733, 295, 264, 3430, 13, 51638], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 762, "seek": 88108, "start": 3266.56, "end": 3270.44, "text": " It's like ultimately, you're already seeing this appeal", "tokens": [51638, 467, 311, 411, 6284, 11, 291, 434, 1217, 2577, 341, 13668, 51832], "temperature": 0.0, "avg_logprob": -0.1649046039500204, "compression_ratio": 1.729641693811075, "no_speech_prob": 0.0001843275094870478}, {"id": 763, "seek": 91044, "start": 3270.44, "end": 3272.88, "text": " happening quite quickly to nationalism.", "tokens": [50364, 2737, 1596, 2661, 281, 39186, 13, 50486], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 764, "seek": 91044, "start": 3272.88, "end": 3276.4, "text": " I mean, Mistral is wrapping themselves in the French flag.", "tokens": [50486, 286, 914, 11, 20166, 2155, 307, 21993, 2969, 294, 264, 5522, 7166, 13, 50662], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 765, "seek": 91044, "start": 3276.4, "end": 3278.12, "text": " You see, co here doing similar things", "tokens": [50662, 509, 536, 11, 598, 510, 884, 2531, 721, 50748], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 766, "seek": 91044, "start": 3278.12, "end": 3279.44, "text": " with the Canadian flag.", "tokens": [50748, 365, 264, 12641, 7166, 13, 50814], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 767, "seek": 91044, "start": 3279.44, "end": 3281.44, "text": " Ultimately, these are signs of like companies", "tokens": [50814, 23921, 11, 613, 366, 7880, 295, 411, 3431, 50914], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 768, "seek": 91044, "start": 3281.44, "end": 3283.4, "text": " trying to look for some source of alpha", "tokens": [50914, 1382, 281, 574, 337, 512, 4009, 295, 8961, 51012], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 769, "seek": 91044, "start": 3283.4, "end": 3285.32, "text": " that's outside of the market.", "tokens": [51012, 300, 311, 2380, 295, 264, 2142, 13, 51108], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 770, "seek": 91044, "start": 3285.32, "end": 3288.32, "text": " I'm not saying that OpenAI isn't doing similar stuff", "tokens": [51108, 286, 478, 406, 1566, 300, 7238, 48698, 1943, 380, 884, 2531, 1507, 51258], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 771, "seek": 91044, "start": 3288.32, "end": 3290.56, "text": " and throbbig certainly struggling to do similar stuff", "tokens": [51258, 293, 739, 996, 65, 328, 3297, 9314, 281, 360, 2531, 1507, 51370], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 772, "seek": 91044, "start": 3290.56, "end": 3291.6400000000003, "text": " and still succeeding.", "tokens": [51370, 293, 920, 47912, 13, 51424], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 773, "seek": 91044, "start": 3291.6400000000003, "end": 3294.7200000000003, "text": " And so this is kind of part of the flavor to me", "tokens": [51424, 400, 370, 341, 307, 733, 295, 644, 295, 264, 6813, 281, 385, 51578], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 774, "seek": 91044, "start": 3294.7200000000003, "end": 3296.8, "text": " that like it starts to look a bit like a crutch", "tokens": [51578, 300, 411, 309, 3719, 281, 574, 257, 857, 411, 257, 941, 9349, 51682], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 775, "seek": 91044, "start": 3296.8, "end": 3298.44, "text": " and there's only so much, you know,", "tokens": [51682, 293, 456, 311, 787, 370, 709, 11, 291, 458, 11, 51764], "temperature": 0.0, "avg_logprob": -0.16279563587158918, "compression_ratio": 1.6886792452830188, "no_speech_prob": 0.0006548445671796799}, {"id": 776, "seek": 93844, "start": 3298.44, "end": 3301.04, "text": " a capex spend that can be subsidized by governments", "tokens": [50364, 257, 1335, 29420, 3496, 300, 393, 312, 20051, 1602, 538, 11280, 50494], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 777, "seek": 93844, "start": 3301.04, "end": 3304.08, "text": " that at a certain scale that I think we're approaching already,", "tokens": [50494, 300, 412, 257, 1629, 4373, 300, 286, 519, 321, 434, 14908, 1217, 11, 50646], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 778, "seek": 93844, "start": 3304.08, "end": 3305.84, "text": " it makes it hard to compete.", "tokens": [50646, 309, 1669, 309, 1152, 281, 11831, 13, 50734], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 779, "seek": 93844, "start": 3305.84, "end": 3307.24, "text": " When your big differentiator is like,", "tokens": [50734, 1133, 428, 955, 27372, 1639, 307, 411, 11, 50804], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 780, "seek": 93844, "start": 3307.24, "end": 3309.7200000000003, "text": " look how French we are or look how Canadian we are,", "tokens": [50804, 574, 577, 5522, 321, 366, 420, 574, 577, 12641, 321, 366, 11, 50928], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 781, "seek": 93844, "start": 3309.7200000000003, "end": 3311.6800000000003, "text": " that is a bit of a character for sure.", "tokens": [50928, 300, 307, 257, 857, 295, 257, 2517, 337, 988, 13, 51026], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 782, "seek": 93844, "start": 3311.6800000000003, "end": 3313.8, "text": " But it's an important part of the pitch here.", "tokens": [51026, 583, 309, 311, 364, 1021, 644, 295, 264, 7293, 510, 13, 51132], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 783, "seek": 93844, "start": 3313.8, "end": 3314.6400000000003, "text": " I worry about that.", "tokens": [51132, 286, 3292, 466, 300, 13, 51174], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 784, "seek": 93844, "start": 3314.6400000000003, "end": 3316.88, "text": " And that's kind of part of my bearishness here.", "tokens": [51174, 400, 300, 311, 733, 295, 644, 295, 452, 6155, 742, 1287, 510, 13, 51286], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 785, "seek": 93844, "start": 3316.88, "end": 3319.0, "text": " I'm just trying to be transparent about my reasoning here.", "tokens": [51286, 286, 478, 445, 1382, 281, 312, 12737, 466, 452, 21577, 510, 13, 51392], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 786, "seek": 93844, "start": 3319.0, "end": 3321.12, "text": " I could be wrong, but looking at the scale", "tokens": [51392, 286, 727, 312, 2085, 11, 457, 1237, 412, 264, 4373, 51498], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 787, "seek": 93844, "start": 3321.12, "end": 3322.7200000000003, "text": " of these companies, I mean, man, co here's been around", "tokens": [51498, 295, 613, 3431, 11, 286, 914, 11, 587, 11, 598, 510, 311, 668, 926, 51578], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 788, "seek": 93844, "start": 3322.7200000000003, "end": 3324.08, "text": " for longer than anthropic, right?", "tokens": [51578, 337, 2854, 813, 22727, 299, 11, 558, 30, 51646], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 789, "seek": 93844, "start": 3324.08, "end": 3328.28, "text": " Like they're, you know, what a percent of anthropic scale.", "tokens": [51646, 1743, 436, 434, 11, 291, 458, 11, 437, 257, 3043, 295, 22727, 299, 4373, 13, 51856], "temperature": 0.0, "avg_logprob": -0.1782749811364286, "compression_ratio": 1.707774798927614, "no_speech_prob": 0.0578882209956646}, {"id": 790, "seek": 96828, "start": 3328.4, "end": 3330.56, "text": " I think this is a sign that these markets are smaller", "tokens": [50370, 286, 519, 341, 307, 257, 1465, 300, 613, 8383, 366, 4356, 50478], "temperature": 0.0, "avg_logprob": -0.19895908248921235, "compression_ratio": 1.6209386281588447, "no_speech_prob": 0.0013639411190524697}, {"id": 791, "seek": 96828, "start": 3330.56, "end": 3332.8, "text": " than maybe had been hoped for initially.", "tokens": [50478, 813, 1310, 632, 668, 19737, 337, 9105, 13, 50590], "temperature": 0.0, "avg_logprob": -0.19895908248921235, "compression_ratio": 1.6209386281588447, "no_speech_prob": 0.0013639411190524697}, {"id": 792, "seek": 96828, "start": 3332.8, "end": 3333.64, "text": " We'll see.", "tokens": [50590, 492, 603, 536, 13, 50632], "temperature": 0.0, "avg_logprob": -0.19895908248921235, "compression_ratio": 1.6209386281588447, "no_speech_prob": 0.0013639411190524697}, {"id": 793, "seek": 96828, "start": 3333.64, "end": 3335.48, "text": " I mean, they could absolutely surprise me.", "tokens": [50632, 286, 914, 11, 436, 727, 3122, 6365, 385, 13, 50724], "temperature": 0.0, "avg_logprob": -0.19895908248921235, "compression_ratio": 1.6209386281588447, "no_speech_prob": 0.0013639411190524697}, {"id": 794, "seek": 96828, "start": 3335.48, "end": 3337.56, "text": " And I'm looking forward to seeing how it plays out.", "tokens": [50724, 400, 286, 478, 1237, 2128, 281, 2577, 577, 309, 5749, 484, 13, 50828], "temperature": 0.0, "avg_logprob": -0.19895908248921235, "compression_ratio": 1.6209386281588447, "no_speech_prob": 0.0013639411190524697}, {"id": 795, "seek": 96828, "start": 3337.56, "end": 3341.68, "text": " Yeah, I think longer term, I think this is an interesting time", "tokens": [50828, 865, 11, 286, 519, 2854, 1433, 11, 286, 519, 341, 307, 364, 1880, 565, 51034], "temperature": 0.0, "avg_logprob": -0.19895908248921235, "compression_ratio": 1.6209386281588447, "no_speech_prob": 0.0013639411190524697}, {"id": 796, "seek": 96828, "start": 3341.68, "end": 3346.04, "text": " to consider this direction of customized", "tokens": [51034, 281, 1949, 341, 3513, 295, 30581, 51252], "temperature": 0.0, "avg_logprob": -0.19895908248921235, "compression_ratio": 1.6209386281588447, "no_speech_prob": 0.0013639411190524697}, {"id": 797, "seek": 96828, "start": 3346.04, "end": 3350.2799999999997, "text": " and, you know, fine tune models for a given business", "tokens": [51252, 293, 11, 291, 458, 11, 2489, 10864, 5245, 337, 257, 2212, 1606, 51464], "temperature": 0.0, "avg_logprob": -0.19895908248921235, "compression_ratio": 1.6209386281588447, "no_speech_prob": 0.0013639411190524697}, {"id": 798, "seek": 96828, "start": 3350.2799999999997, "end": 3352.4, "text": " because the open source models,", "tokens": [51464, 570, 264, 1269, 4009, 5245, 11, 51570], "temperature": 0.0, "avg_logprob": -0.19895908248921235, "compression_ratio": 1.6209386281588447, "no_speech_prob": 0.0013639411190524697}, {"id": 799, "seek": 96828, "start": 3352.4, "end": 3354.56, "text": " the models that aren't perpetuated,", "tokens": [51570, 264, 5245, 300, 3212, 380, 16211, 27275, 11, 51678], "temperature": 0.0, "avg_logprob": -0.19895908248921235, "compression_ratio": 1.6209386281588447, "no_speech_prob": 0.0013639411190524697}, {"id": 800, "seek": 96828, "start": 3354.56, "end": 3356.36, "text": " I starting to get good.", "tokens": [51678, 286, 2891, 281, 483, 665, 13, 51768], "temperature": 0.0, "avg_logprob": -0.19895908248921235, "compression_ratio": 1.6209386281588447, "no_speech_prob": 0.0013639411190524697}, {"id": 801, "seek": 99636, "start": 3356.36, "end": 3359.64, "text": " And so you might start seeing a case where,", "tokens": [50364, 400, 370, 291, 1062, 722, 2577, 257, 1389, 689, 11, 50528], "temperature": 0.0, "avg_logprob": -0.16948023407082807, "compression_ratio": 1.5321888412017168, "no_speech_prob": 0.00038375906297005713}, {"id": 802, "seek": 99636, "start": 3359.64, "end": 3361.92, "text": " for instance, at Ostracade, right,", "tokens": [50528, 337, 5197, 11, 412, 422, 9733, 326, 762, 11, 558, 11, 50642], "temperature": 0.0, "avg_logprob": -0.16948023407082807, "compression_ratio": 1.5321888412017168, "no_speech_prob": 0.00038375906297005713}, {"id": 803, "seek": 99636, "start": 3361.92, "end": 3365.2, "text": " we have a pretty specific use case of coding", "tokens": [50642, 321, 362, 257, 1238, 2685, 764, 1389, 295, 17720, 50806], "temperature": 0.0, "avg_logprob": -0.16948023407082807, "compression_ratio": 1.5321888412017168, "no_speech_prob": 0.00038375906297005713}, {"id": 804, "seek": 99636, "start": 3365.2, "end": 3366.7200000000003, "text": " for these games, right?", "tokens": [50806, 337, 613, 2813, 11, 558, 30, 50882], "temperature": 0.0, "avg_logprob": -0.16948023407082807, "compression_ratio": 1.5321888412017168, "no_speech_prob": 0.00038375906297005713}, {"id": 805, "seek": 99636, "start": 3366.7200000000003, "end": 3371.92, "text": " And having coding agents with a specific set of technologies,", "tokens": [50882, 400, 1419, 17720, 12554, 365, 257, 2685, 992, 295, 7943, 11, 51142], "temperature": 0.0, "avg_logprob": -0.16948023407082807, "compression_ratio": 1.5321888412017168, "no_speech_prob": 0.00038375906297005713}, {"id": 806, "seek": 99636, "start": 3371.92, "end": 3374.52, "text": " HTML, JavaScript, et cetera.", "tokens": [51142, 17995, 11, 15778, 11, 1030, 11458, 13, 51272], "temperature": 0.0, "avg_logprob": -0.16948023407082807, "compression_ratio": 1.5321888412017168, "no_speech_prob": 0.00038375906297005713}, {"id": 807, "seek": 99636, "start": 3374.52, "end": 3379.6800000000003, "text": " So it is feasible that if you were to do reinforcement learning", "tokens": [51272, 407, 309, 307, 26648, 300, 498, 291, 645, 281, 360, 29280, 2539, 51530], "temperature": 0.0, "avg_logprob": -0.16948023407082807, "compression_ratio": 1.5321888412017168, "no_speech_prob": 0.00038375906297005713}, {"id": 808, "seek": 99636, "start": 3379.6800000000003, "end": 3384.36, "text": " and post-training on our own internal proprietary data", "tokens": [51530, 293, 2183, 12, 17227, 1760, 322, 527, 1065, 6920, 38992, 1412, 51764], "temperature": 0.0, "avg_logprob": -0.16948023407082807, "compression_ratio": 1.5321888412017168, "no_speech_prob": 0.00038375906297005713}, {"id": 809, "seek": 102436, "start": 3384.36, "end": 3386.6000000000004, "text": " of, you know, good games that people have made", "tokens": [50364, 295, 11, 291, 458, 11, 665, 2813, 300, 561, 362, 1027, 50476], "temperature": 0.0, "avg_logprob": -0.19519991391017788, "compression_ratio": 1.7176470588235293, "no_speech_prob": 0.023610234260559082}, {"id": 810, "seek": 102436, "start": 3386.6000000000004, "end": 3388.88, "text": " and chat histories and so on,", "tokens": [50476, 293, 5081, 30631, 293, 370, 322, 11, 50590], "temperature": 0.0, "avg_logprob": -0.19519991391017788, "compression_ratio": 1.7176470588235293, "no_speech_prob": 0.023610234260559082}, {"id": 811, "seek": 102436, "start": 3388.88, "end": 3392.04, "text": " we would get a better model and a better performance.", "tokens": [50590, 321, 576, 483, 257, 1101, 2316, 293, 257, 1101, 3389, 13, 50748], "temperature": 0.0, "avg_logprob": -0.19519991391017788, "compression_ratio": 1.7176470588235293, "no_speech_prob": 0.023610234260559082}, {"id": 812, "seek": 102436, "start": 3392.04, "end": 3393.6400000000003, "text": " That didn't used to be the case.", "tokens": [50748, 663, 994, 380, 1143, 281, 312, 264, 1389, 13, 50828], "temperature": 0.0, "avg_logprob": -0.19519991391017788, "compression_ratio": 1.7176470588235293, "no_speech_prob": 0.023610234260559082}, {"id": 813, "seek": 102436, "start": 3393.6400000000003, "end": 3397.92, "text": " It used to be that like the base models just weren't good enough,", "tokens": [50828, 467, 1143, 281, 312, 300, 411, 264, 3096, 5245, 445, 4999, 380, 665, 1547, 11, 51042], "temperature": 0.0, "avg_logprob": -0.19519991391017788, "compression_ratio": 1.7176470588235293, "no_speech_prob": 0.023610234260559082}, {"id": 814, "seek": 102436, "start": 3397.92, "end": 3402.76, "text": " just doing prompt engineering was going to be the best you can do.", "tokens": [51042, 445, 884, 12391, 7043, 390, 516, 281, 312, 264, 1151, 291, 393, 360, 13, 51284], "temperature": 0.0, "avg_logprob": -0.19519991391017788, "compression_ratio": 1.7176470588235293, "no_speech_prob": 0.023610234260559082}, {"id": 815, "seek": 102436, "start": 3402.76, "end": 3405.36, "text": " I could see it heading in a direction", "tokens": [51284, 286, 727, 536, 309, 9864, 294, 257, 3513, 51414], "temperature": 0.0, "avg_logprob": -0.19519991391017788, "compression_ratio": 1.7176470588235293, "no_speech_prob": 0.023610234260559082}, {"id": 816, "seek": 102436, "start": 3405.36, "end": 3409.08, "text": " and me straw man may not be the one that owns this.", "tokens": [51414, 293, 385, 10099, 587, 815, 406, 312, 264, 472, 300, 19143, 341, 13, 51600], "temperature": 0.0, "avg_logprob": -0.19519991391017788, "compression_ratio": 1.7176470588235293, "no_speech_prob": 0.023610234260559082}, {"id": 817, "seek": 102436, "start": 3409.08, "end": 3412.32, "text": " It could be an anthropic or openly I jump in on it.", "tokens": [51600, 467, 727, 312, 364, 22727, 299, 420, 23109, 286, 3012, 294, 322, 309, 13, 51762], "temperature": 0.0, "avg_logprob": -0.19519991391017788, "compression_ratio": 1.7176470588235293, "no_speech_prob": 0.023610234260559082}, {"id": 818, "seek": 105232, "start": 3412.3199999999997, "end": 3416.08, "text": " But regardless, having these sort of individualized,", "tokens": [50364, 583, 10060, 11, 1419, 613, 1333, 295, 2609, 1602, 11, 50552], "temperature": 0.0, "avg_logprob": -0.21969127091201576, "compression_ratio": 1.6293436293436294, "no_speech_prob": 0.0003095945285167545}, {"id": 819, "seek": 105232, "start": 3416.08, "end": 3420.3999999999996, "text": " customized models for different companies,", "tokens": [50552, 30581, 5245, 337, 819, 3431, 11, 50768], "temperature": 0.0, "avg_logprob": -0.21969127091201576, "compression_ratio": 1.6293436293436294, "no_speech_prob": 0.0003095945285167545}, {"id": 820, "seek": 105232, "start": 3420.3999999999996, "end": 3424.76, "text": " agent techniques seems like a very much feasible future.", "tokens": [50768, 9461, 7512, 2544, 411, 257, 588, 709, 26648, 2027, 13, 50986], "temperature": 0.0, "avg_logprob": -0.21969127091201576, "compression_ratio": 1.6293436293436294, "no_speech_prob": 0.0003095945285167545}, {"id": 821, "seek": 105232, "start": 3424.76, "end": 3427.2, "text": " Yeah, definitely not questioning the existence of a market, right?", "tokens": [50986, 865, 11, 2138, 406, 21257, 264, 9123, 295, 257, 2142, 11, 558, 30, 51108], "temperature": 0.0, "avg_logprob": -0.21969127091201576, "compression_ratio": 1.6293436293436294, "no_speech_prob": 0.0003095945285167545}, {"id": 822, "seek": 105232, "start": 3427.2, "end": 3429.92, "text": " I'm just sort of skeptical of their positioning, as you said,", "tokens": [51108, 286, 478, 445, 1333, 295, 28601, 295, 641, 26381, 11, 382, 291, 848, 11, 51244], "temperature": 0.0, "avg_logprob": -0.21969127091201576, "compression_ratio": 1.6293436293436294, "no_speech_prob": 0.0003095945285167545}, {"id": 823, "seek": 105232, "start": 3429.92, "end": 3432.08, "text": " to like, are they the ones to capture this?", "tokens": [51244, 281, 411, 11, 366, 436, 264, 2306, 281, 7983, 341, 30, 51352], "temperature": 0.0, "avg_logprob": -0.21969127091201576, "compression_ratio": 1.6293436293436294, "no_speech_prob": 0.0003095945285167545}, {"id": 824, "seek": 105232, "start": 3432.08, "end": 3433.64, "text": " Yeah, that's where I'm wearing.", "tokens": [51352, 865, 11, 300, 311, 689, 286, 478, 4769, 13, 51430], "temperature": 0.0, "avg_logprob": -0.21969127091201576, "compression_ratio": 1.6293436293436294, "no_speech_prob": 0.0003095945285167545}, {"id": 825, "seek": 105232, "start": 3433.64, "end": 3435.24, "text": " Yeah, it's something to try.", "tokens": [51430, 865, 11, 309, 311, 746, 281, 853, 13, 51510], "temperature": 0.0, "avg_logprob": -0.21969127091201576, "compression_ratio": 1.6293436293436294, "no_speech_prob": 0.0003095945285167545}, {"id": 826, "seek": 105232, "start": 3435.24, "end": 3437.92, "text": " So it's something to try, no doubt.", "tokens": [51510, 407, 309, 311, 746, 281, 853, 11, 572, 6385, 13, 51644], "temperature": 0.0, "avg_logprob": -0.21969127091201576, "compression_ratio": 1.6293436293436294, "no_speech_prob": 0.0003095945285167545}, {"id": 827, "seek": 107792, "start": 3437.92, "end": 3440.8, "text": " And back to chips,", "tokens": [50364, 400, 646, 281, 11583, 11, 50508], "temperature": 0.0, "avg_logprob": -0.27154083605165835, "compression_ratio": 1.5151515151515151, "no_speech_prob": 0.004009682219475508}, {"id": 828, "seek": 107792, "start": 3440.8, "end": 3444.84, "text": " China's buy dense gets access to top and video AI chips", "tokens": [50508, 3533, 311, 2256, 18011, 2170, 2105, 281, 1192, 293, 960, 7318, 11583, 50710], "temperature": 0.0, "avg_logprob": -0.27154083605165835, "compression_ratio": 1.5151515151515151, "no_speech_prob": 0.004009682219475508}, {"id": 829, "seek": 107792, "start": 3444.84, "end": 3448.5600000000004, "text": " according to the Washington Street Journal.", "tokens": [50710, 4650, 281, 264, 6149, 7638, 16936, 13, 50896], "temperature": 0.0, "avg_logprob": -0.27154083605165835, "compression_ratio": 1.5151515151515151, "no_speech_prob": 0.004009682219475508}, {"id": 830, "seek": 107792, "start": 3448.5600000000004, "end": 3452.96, "text": " So they are seeing that buy dense is assembling", "tokens": [50896, 407, 436, 366, 2577, 300, 2256, 18011, 307, 43867, 51116], "temperature": 0.0, "avg_logprob": -0.27154083605165835, "compression_ratio": 1.5151515151515151, "no_speech_prob": 0.004009682219475508}, {"id": 831, "seek": 107792, "start": 3452.96, "end": 3458.04, "text": " significant computing power outside of China using these chips.", "tokens": [51116, 4776, 15866, 1347, 2380, 295, 3533, 1228, 613, 11583, 13, 51370], "temperature": 0.0, "avg_logprob": -0.27154083605165835, "compression_ratio": 1.5151515151515151, "no_speech_prob": 0.004009682219475508}, {"id": 832, "seek": 107792, "start": 3458.04, "end": 3462.2799999999997, "text": " They are supposedly working with the South Feast Asian firm,", "tokens": [51370, 814, 366, 20581, 1364, 365, 264, 4242, 3697, 525, 10645, 6174, 11, 51582], "temperature": 0.0, "avg_logprob": -0.27154083605165835, "compression_ratio": 1.5151515151515151, "no_speech_prob": 0.004009682219475508}, {"id": 833, "seek": 107792, "start": 3462.2799999999997, "end": 3466.84, "text": " our London cloud and they plan to deploy approximately 500", "tokens": [51582, 527, 7042, 4588, 293, 436, 1393, 281, 7274, 10447, 5923, 51810], "temperature": 0.0, "avg_logprob": -0.27154083605165835, "compression_ratio": 1.5151515151515151, "no_speech_prob": 0.004009682219475508}, {"id": 834, "seek": 110684, "start": 3466.84, "end": 3471.0, "text": " and video black wall computing systems in Malaysia,", "tokens": [50364, 293, 960, 2211, 2929, 15866, 3652, 294, 25465, 11, 50572], "temperature": 0.0, "avg_logprob": -0.2153942906993559, "compression_ratio": 1.5154185022026432, "no_speech_prob": 0.0006551701226271689}, {"id": 835, "seek": 110684, "start": 3471.0, "end": 3475.24, "text": " totaling 36,000 B to 100 chips.", "tokens": [50572, 3217, 278, 8652, 11, 1360, 363, 281, 2319, 11583, 13, 50784], "temperature": 0.0, "avg_logprob": -0.2153942906993559, "compression_ratio": 1.5154185022026432, "no_speech_prob": 0.0006551701226271689}, {"id": 836, "seek": 110684, "start": 3475.24, "end": 3479.2, "text": " So that's 500 computing systems, presumably meaning racks", "tokens": [50784, 407, 300, 311, 5923, 15866, 3652, 11, 26742, 3620, 47063, 50982], "temperature": 0.0, "avg_logprob": -0.2153942906993559, "compression_ratio": 1.5154185022026432, "no_speech_prob": 0.0006551701226271689}, {"id": 837, "seek": 110684, "start": 3479.2, "end": 3483.08, "text": " that amounts to tens of thousands of actual GPUs.", "tokens": [50982, 300, 11663, 281, 10688, 295, 5383, 295, 3539, 18407, 82, 13, 51176], "temperature": 0.0, "avg_logprob": -0.2153942906993559, "compression_ratio": 1.5154185022026432, "no_speech_prob": 0.0006551701226271689}, {"id": 838, "seek": 110684, "start": 3483.08, "end": 3486.12, "text": " So yeah, it's suppose giving us an indication", "tokens": [51176, 407, 1338, 11, 309, 311, 7297, 2902, 505, 364, 18877, 51328], "temperature": 0.0, "avg_logprob": -0.2153942906993559, "compression_ratio": 1.5154185022026432, "no_speech_prob": 0.0006551701226271689}, {"id": 839, "seek": 110684, "start": 3486.12, "end": 3489.3199999999997, "text": " that the company by dense being a massive company", "tokens": [51328, 300, 264, 2237, 538, 18011, 885, 257, 5994, 2237, 51488], "temperature": 0.0, "avg_logprob": -0.2153942906993559, "compression_ratio": 1.5154185022026432, "no_speech_prob": 0.0006551701226271689}, {"id": 840, "seek": 110684, "start": 3489.3199999999997, "end": 3493.68, "text": " that has their products used outside of China, no doubt,", "tokens": [51488, 300, 575, 641, 3383, 1143, 2380, 295, 3533, 11, 572, 6385, 11, 51706], "temperature": 0.0, "avg_logprob": -0.2153942906993559, "compression_ratio": 1.5154185022026432, "no_speech_prob": 0.0006551701226271689}, {"id": 841, "seek": 113368, "start": 3493.6800000000003, "end": 3497.6800000000003, "text": " is able to make use of your chips in other countries.", "tokens": [50364, 307, 1075, 281, 652, 764, 295, 428, 11583, 294, 661, 3517, 13, 50564], "temperature": 0.0, "avg_logprob": -0.16557836236844536, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.00629326980561018}, {"id": 842, "seek": 113368, "start": 3497.6800000000003, "end": 3499.44, "text": " Yeah, yeah, and you know, if you're looking at this", "tokens": [50564, 865, 11, 1338, 11, 293, 291, 458, 11, 498, 291, 434, 1237, 412, 341, 50652], "temperature": 0.0, "avg_logprob": -0.16557836236844536, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.00629326980561018}, {"id": 843, "seek": 113368, "start": 3499.44, "end": 3501.84, "text": " being like, hey, wait a minute, I thought we weren't shipping.", "tokens": [50652, 885, 411, 11, 4177, 11, 1699, 257, 3456, 11, 286, 1194, 321, 4999, 380, 14122, 13, 50772], "temperature": 0.0, "avg_logprob": -0.16557836236844536, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.00629326980561018}, {"id": 844, "seek": 113368, "start": 3501.84, "end": 3504.4, "text": " I thought we just greenlit the H200 shipping.", "tokens": [50772, 286, 1194, 321, 445, 3092, 23062, 264, 389, 7629, 14122, 13, 50900], "temperature": 0.0, "avg_logprob": -0.16557836236844536, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.00629326980561018}, {"id": 845, "seek": 113368, "start": 3504.4, "end": 3506.8, "text": " What are we doing shipping black wells to China?", "tokens": [50900, 708, 366, 321, 884, 14122, 2211, 30984, 281, 3533, 30, 51020], "temperature": 0.0, "avg_logprob": -0.16557836236844536, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.00629326980561018}, {"id": 846, "seek": 113368, "start": 3506.8, "end": 3507.92, "text": " Well, that's the whole point.", "tokens": [51020, 1042, 11, 300, 311, 264, 1379, 935, 13, 51076], "temperature": 0.0, "avg_logprob": -0.16557836236844536, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.00629326980561018}, {"id": 847, "seek": 113368, "start": 3507.92, "end": 3509.76, "text": " It's technically not being shipped to China.", "tokens": [51076, 467, 311, 12120, 406, 885, 25312, 281, 3533, 13, 51168], "temperature": 0.0, "avg_logprob": -0.16557836236844536, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.00629326980561018}, {"id": 848, "seek": 113368, "start": 3509.76, "end": 3514.08, "text": " The export controls that apply here from 2023 regulate", "tokens": [51168, 440, 10725, 9003, 300, 3079, 510, 490, 44377, 24475, 51384], "temperature": 0.0, "avg_logprob": -0.16557836236844536, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.00629326980561018}, {"id": 849, "seek": 113368, "start": 3514.08, "end": 3519.08, "text": " where the hardware is shipped, not where the compute is used.", "tokens": [51384, 689, 264, 8837, 307, 25312, 11, 406, 689, 264, 14722, 307, 1143, 13, 51634], "temperature": 0.0, "avg_logprob": -0.16557836236844536, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.00629326980561018}, {"id": 850, "seek": 113368, "start": 3519.24, "end": 3520.4, "text": " Right? And that was intentional.", "tokens": [51642, 1779, 30, 400, 300, 390, 21935, 13, 51700], "temperature": 0.0, "avg_logprob": -0.16557836236844536, "compression_ratio": 1.6486486486486487, "no_speech_prob": 0.00629326980561018}, {"id": 851, "seek": 116040, "start": 3520.4, "end": 3524.28, "text": " It was meant to allow the sort of like global cloud infrastructure", "tokens": [50364, 467, 390, 4140, 281, 2089, 264, 1333, 295, 411, 4338, 4588, 6896, 50558], "temperature": 0.0, "avg_logprob": -0.1502607588295464, "compression_ratio": 1.6491228070175439, "no_speech_prob": 0.0024473711382597685}, {"id": 852, "seek": 116040, "start": 3524.28, "end": 3526.7200000000003, "text": " to be built based on American hardware.", "tokens": [50558, 281, 312, 3094, 2361, 322, 2665, 8837, 13, 50680], "temperature": 0.0, "avg_logprob": -0.1502607588295464, "compression_ratio": 1.6491228070175439, "no_speech_prob": 0.0024473711382597685}, {"id": 853, "seek": 116040, "start": 3526.7200000000003, "end": 3530.8, "text": " But the thing is, by dense is not on the famous entity list", "tokens": [50680, 583, 264, 551, 307, 11, 538, 18011, 307, 406, 322, 264, 4618, 13977, 1329, 50884], "temperature": 0.0, "avg_logprob": -0.1502607588295464, "compression_ratio": 1.6491228070175439, "no_speech_prob": 0.0024473711382597685}, {"id": 854, "seek": 116040, "start": 3530.8, "end": 3532.8, "text": " or there's a military end useless as well.", "tokens": [50884, 420, 456, 311, 257, 4632, 917, 14115, 382, 731, 13, 50984], "temperature": 0.0, "avg_logprob": -0.1502607588295464, "compression_ratio": 1.6491228070175439, "no_speech_prob": 0.0024473711382597685}, {"id": 855, "seek": 116040, "start": 3532.8, "end": 3535.36, "text": " So the fact they're using Nvidia hardware", "tokens": [50984, 407, 264, 1186, 436, 434, 1228, 46284, 8837, 51112], "temperature": 0.0, "avg_logprob": -0.1502607588295464, "compression_ratio": 1.6491228070175439, "no_speech_prob": 0.0024473711382597685}, {"id": 856, "seek": 116040, "start": 3535.36, "end": 3538.6400000000003, "text": " does not automatically in and of itself trigger restrictions.", "tokens": [51112, 775, 406, 6772, 294, 293, 295, 2564, 7875, 14191, 13, 51276], "temperature": 0.0, "avg_logprob": -0.1502607588295464, "compression_ratio": 1.6491228070175439, "no_speech_prob": 0.0024473711382597685}, {"id": 857, "seek": 116040, "start": 3538.6400000000003, "end": 3542.7200000000003, "text": " The fact is that they're just setting up shop outside of China", "tokens": [51276, 440, 1186, 307, 300, 436, 434, 445, 3287, 493, 3945, 2380, 295, 3533, 51480], "temperature": 0.0, "avg_logprob": -0.1502607588295464, "compression_ratio": 1.6491228070175439, "no_speech_prob": 0.0024473711382597685}, {"id": 858, "seek": 116040, "start": 3542.7200000000003, "end": 3544.36, "text": " and normally that's totally okay.", "tokens": [51480, 293, 5646, 300, 311, 3879, 1392, 13, 51562], "temperature": 0.0, "avg_logprob": -0.1502607588295464, "compression_ratio": 1.6491228070175439, "no_speech_prob": 0.0024473711382597685}, {"id": 859, "seek": 116040, "start": 3544.36, "end": 3547.32, "text": " And Nvidia actually confirmed there were no objections here", "tokens": [51562, 400, 46284, 767, 11341, 456, 645, 572, 44649, 510, 51710], "temperature": 0.0, "avg_logprob": -0.1502607588295464, "compression_ratio": 1.6491228070175439, "no_speech_prob": 0.0024473711382597685}, {"id": 860, "seek": 118732, "start": 3547.3199999999997, "end": 3549.2, "text": " and the BIS Department of Commerce", "tokens": [50364, 293, 264, 363, 2343, 5982, 295, 34493, 50458], "temperature": 0.0, "avg_logprob": -0.22203848451033406, "compression_ratio": 1.4371859296482412, "no_speech_prob": 0.07000327855348587}, {"id": 861, "seek": 118732, "start": 3549.2, "end": 3550.48, "text": " apparently is also on board.", "tokens": [50458, 7970, 307, 611, 322, 3150, 13, 50522], "temperature": 0.0, "avg_logprob": -0.22203848451033406, "compression_ratio": 1.4371859296482412, "no_speech_prob": 0.07000327855348587}, {"id": 862, "seek": 0, "start": 3550.54, "end": 3551.78, "text": " So they're all tracking.", "tokens": [50891, 407, 436, 434, 439, 11603, 13, 50953], "temperature": 0.0, "avg_logprob": -0.23775929194724488, "compression_ratio": 1.5636856368563685, "no_speech_prob": 0.1154668852686882}, {"id": 863, "seek": 0, "start": 3551.78, "end": 3553.38, "text": " Still, I mean, this kind of reveals like,", "tokens": [50953, 8291, 11, 286, 914, 11, 341, 733, 295, 20893, 411, 11, 51033], "temperature": 0.0, "avg_logprob": -0.23775929194724488, "compression_ratio": 1.5636856368563685, "no_speech_prob": 0.1154668852686882}, {"id": 864, "seek": 0, "start": 3553.38, "end": 3555.26, "text": " hey, well, this is what you're opening yourself up to.", "tokens": [51033, 4177, 11, 731, 11, 341, 307, 437, 291, 434, 5193, 1803, 493, 281, 13, 51127], "temperature": 0.0, "avg_logprob": -0.23775929194724488, "compression_ratio": 1.5636856368563685, "no_speech_prob": 0.1154668852686882}, {"id": 865, "seek": 0, "start": 3555.26, "end": 3556.26, "text": " If you think about the scale,", "tokens": [51127, 759, 291, 519, 466, 264, 4373, 11, 51177], "temperature": 0.0, "avg_logprob": -0.23775929194724488, "compression_ratio": 1.5636856368563685, "no_speech_prob": 0.1154668852686882}, {"id": 866, "seek": 0, "start": 3556.26, "end": 3557.5, "text": " it's by the way for a second.", "tokens": [51177, 309, 311, 538, 264, 636, 337, 257, 1150, 13, 51239], "temperature": 0.0, "avg_logprob": -0.23775929194724488, "compression_ratio": 1.5636856368563685, "no_speech_prob": 0.1154668852686882}, {"id": 867, "seek": 0, "start": 3557.5, "end": 3561.26, "text": " What does it mean to have 36,000 B200 GPUs,", "tokens": [51239, 708, 775, 309, 914, 281, 362, 8652, 11, 1360, 363, 7629, 18407, 82, 11, 51427], "temperature": 0.0, "avg_logprob": -0.23775929194724488, "compression_ratio": 1.5636856368563685, "no_speech_prob": 0.1154668852686882}, {"id": 868, "seek": 0, "start": 3561.26, "end": 3563.34, "text": " you know, 500 of these NVL72 units?", "tokens": [51427, 291, 458, 11, 5923, 295, 613, 426, 53, 43, 28890, 6815, 30, 51531], "temperature": 0.0, "avg_logprob": -0.23775929194724488, "compression_ratio": 1.5636856368563685, "no_speech_prob": 0.1154668852686882}, {"id": 869, "seek": 0, "start": 3563.34, "end": 3564.94, "text": " So this is typically a two rack system.", "tokens": [51531, 407, 341, 307, 5850, 257, 732, 14788, 1185, 13, 51611], "temperature": 0.0, "avg_logprob": -0.23775929194724488, "compression_ratio": 1.5636856368563685, "no_speech_prob": 0.1154668852686882}, {"id": 870, "seek": 0, "start": 3564.94, "end": 3567.42, "text": " Yeah, I think we talked about it in the hardware episode at one point.", "tokens": [51611, 865, 11, 286, 519, 321, 2825, 466, 309, 294, 264, 8837, 3500, 412, 472, 935, 13, 51735], "temperature": 0.0, "avg_logprob": -0.23775929194724488, "compression_ratio": 1.5636856368563685, "no_speech_prob": 0.1154668852686882}, {"id": 871, "seek": 2742, "start": 3567.42, "end": 3572.02, "text": " So this many GPUs adds up to about a 60 megawatt cluster.", "tokens": [50364, 407, 341, 867, 18407, 82, 10860, 493, 281, 466, 257, 4060, 10816, 1607, 1591, 13630, 13, 50594], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 872, "seek": 2742, "start": 3572.02, "end": 3573.66, "text": " Okay, so 60 megawatts of power.", "tokens": [50594, 1033, 11, 370, 4060, 10816, 38036, 1373, 295, 1347, 13, 50676], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 873, "seek": 2742, "start": 3573.66, "end": 3576.5, "text": " That's enough to power roughly like 60,000 homes.", "tokens": [50676, 663, 311, 1547, 281, 1347, 9810, 411, 4060, 11, 1360, 7388, 13, 50818], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 874, "seek": 2742, "start": 3576.5, "end": 3577.54, "text": " You can think of it that way.", "tokens": [50818, 509, 393, 519, 295, 309, 300, 636, 13, 50870], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 875, "seek": 2742, "start": 3577.54, "end": 3580.22, "text": " So it's like a little small town of compute.", "tokens": [50870, 407, 309, 311, 411, 257, 707, 1359, 3954, 295, 14722, 13, 51004], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 876, "seek": 2742, "start": 3580.22, "end": 3584.22, "text": " Compare that to Microsoft Abelene is like 100 megawatts", "tokens": [51004, 48523, 300, 281, 8116, 2847, 338, 1450, 307, 411, 2319, 10816, 38036, 1373, 51204], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 877, "seek": 2742, "start": 3584.22, "end": 3586.46, "text": " or Elon's XAI Texas Colossus site.", "tokens": [51204, 420, 28498, 311, 1783, 48698, 7885, 4004, 772, 301, 3621, 13, 51316], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 878, "seek": 2742, "start": 3586.46, "end": 3588.78, "text": " That one's more like 130 megawatts.", "tokens": [51316, 663, 472, 311, 544, 411, 19966, 10816, 38036, 1373, 13, 51432], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 879, "seek": 2742, "start": 3588.78, "end": 3590.74, "text": " The Abelene comparison has been unfair", "tokens": [51432, 440, 2847, 338, 1450, 9660, 575, 668, 17019, 51530], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 880, "seek": 2742, "start": 3590.74, "end": 3592.54, "text": " because that's still kind of being built out.", "tokens": [51530, 570, 300, 311, 920, 733, 295, 885, 3094, 484, 13, 51620], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 881, "seek": 2742, "start": 3592.54, "end": 3594.82, "text": " But the Colossus is better apples apples", "tokens": [51620, 583, 264, 4004, 772, 301, 307, 1101, 16814, 16814, 51734], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 882, "seek": 2742, "start": 3594.82, "end": 3596.38, "text": " because like they're still building it out,", "tokens": [51734, 570, 411, 436, 434, 920, 2390, 309, 484, 11, 51812], "temperature": 0.0, "avg_logprob": -0.17855031817567116, "compression_ratio": 1.6864686468646866, "no_speech_prob": 0.000419285090174526}, {"id": 883, "seek": 5638, "start": 3596.38, "end": 3599.06, "text": " single-purpose, you know, it's stood up, you were fast.", "tokens": [50364, 2167, 12, 42601, 11, 291, 458, 11, 309, 311, 9371, 493, 11, 291, 645, 2370, 13, 50498], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 884, "seek": 5638, "start": 3599.06, "end": 3601.82, "text": " What this is showing is you're allowing a Chinese company", "tokens": [50498, 708, 341, 307, 4099, 307, 291, 434, 8293, 257, 4649, 2237, 50636], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 885, "seek": 5638, "start": 3601.82, "end": 3603.22, "text": " to get up to the same at least", "tokens": [50636, 281, 483, 493, 281, 264, 912, 412, 1935, 50706], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 886, "seek": 5638, "start": 3603.22, "end": 3605.26, "text": " order of magnitude in compute.", "tokens": [50706, 1668, 295, 15668, 294, 14722, 13, 50808], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 887, "seek": 5638, "start": 3605.26, "end": 3607.46, "text": " It'll admittedly be online kind of like, you know,", "tokens": [50808, 467, 603, 14920, 356, 312, 2950, 733, 295, 411, 11, 291, 458, 11, 50918], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 888, "seek": 5638, "start": 3607.46, "end": 3610.14, "text": " later on this year, but that's still something.", "tokens": [50918, 1780, 322, 341, 1064, 11, 457, 300, 311, 920, 746, 13, 51052], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 889, "seek": 5638, "start": 3610.14, "end": 3612.9, "text": " So this is all part of the debate here over", "tokens": [51052, 407, 341, 307, 439, 644, 295, 264, 7958, 510, 670, 51190], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 890, "seek": 5638, "start": 3612.9, "end": 3616.26, "text": " what is appropriate, how much compute should be allowed,", "tokens": [51190, 437, 307, 6854, 11, 577, 709, 14722, 820, 312, 4350, 11, 51358], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 891, "seek": 5638, "start": 3616.26, "end": 3617.86, "text": " even if it's overseas deployments", "tokens": [51358, 754, 498, 309, 311, 16274, 7274, 1117, 51438], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 892, "seek": 5638, "start": 3617.86, "end": 3620.82, "text": " that conserve Chinese customers by the way.", "tokens": [51438, 300, 45240, 4649, 4581, 538, 264, 636, 13, 51586], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 893, "seek": 5638, "start": 3620.82, "end": 3623.02, "text": " This is still a data center that can be used", "tokens": [51586, 639, 307, 920, 257, 1412, 3056, 300, 393, 312, 1143, 51696], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 894, "seek": 5638, "start": 3623.02, "end": 3624.42, "text": " to serve Chinese customers.", "tokens": [51696, 281, 4596, 4649, 4581, 13, 51766], "temperature": 0.0, "avg_logprob": -0.16911016583022936, "compression_ratio": 1.7533333333333334, "no_speech_prob": 0.0015675356844440103}, {"id": 895, "seek": 8442, "start": 3624.42, "end": 3627.34, "text": " So you could, in principle, at least to my understanding here,", "tokens": [50364, 407, 291, 727, 11, 294, 8665, 11, 412, 1935, 281, 452, 3701, 510, 11, 50510], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 896, "seek": 8442, "start": 3627.34, "end": 3629.3, "text": " might dance could use this to train a model", "tokens": [50510, 1062, 4489, 727, 764, 341, 281, 3847, 257, 2316, 50608], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 897, "seek": 8442, "start": 3629.3, "end": 3630.58, "text": " that then is used in China,", "tokens": [50608, 300, 550, 307, 1143, 294, 3533, 11, 50672], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 898, "seek": 8442, "start": 3630.58, "end": 3632.94, "text": " that then because of civil military fusion gets used", "tokens": [50672, 300, 550, 570, 295, 5605, 4632, 23100, 2170, 1143, 50790], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 899, "seek": 8442, "start": 3632.94, "end": 3634.74, "text": " by the CCP, you know, the MSS", "tokens": [50790, 538, 264, 27876, 11, 291, 458, 11, 264, 7395, 50, 50880], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 900, "seek": 8442, "start": 3634.74, "end": 3636.3, "text": " or whatever other arm of the Chinese state.", "tokens": [50880, 420, 2035, 661, 3726, 295, 264, 4649, 1785, 13, 50958], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 901, "seek": 8442, "start": 3636.3, "end": 3638.46, "text": " So they've also planned additional deployments", "tokens": [50958, 407, 436, 600, 611, 8589, 4497, 7274, 1117, 51066], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 902, "seek": 8442, "start": 3638.46, "end": 3642.62, "text": " of up to 7,000 B-200s at a bunch of data centers in Indonesia.", "tokens": [51066, 295, 493, 281, 1614, 11, 1360, 363, 12, 7629, 82, 412, 257, 3840, 295, 1412, 10898, 294, 16879, 13, 51274], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 903, "seek": 8442, "start": 3642.62, "end": 3646.86, "text": " This is kind of a broader Southeast Asian strategy here.", "tokens": [51274, 639, 307, 733, 295, 257, 13227, 27906, 10645, 5206, 510, 13, 51486], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 904, "seek": 8442, "start": 3646.86, "end": 3649.62, "text": " So we'll see, I mean, it's a pretty big,", "tokens": [51486, 407, 321, 603, 536, 11, 286, 914, 11, 309, 311, 257, 1238, 955, 11, 51624], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 905, "seek": 8442, "start": 3649.62, "end": 3651.38, "text": " you might think of it as a pretty big gap.", "tokens": [51624, 291, 1062, 519, 295, 309, 382, 257, 1238, 955, 7417, 13, 51712], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 906, "seek": 8442, "start": 3651.38, "end": 3653.34, "text": " That's sort of how I think of it personally,", "tokens": [51712, 663, 311, 1333, 295, 577, 286, 519, 295, 309, 5665, 11, 51810], "temperature": 0.0, "avg_logprob": -0.14779325593498688, "compression_ratio": 1.6430678466076696, "no_speech_prob": 0.0011602254817262292}, {"id": 907, "seek": 11334, "start": 3653.34, "end": 3655.66, "text": " but there are different views on is this desirable", "tokens": [50364, 457, 456, 366, 819, 6809, 322, 307, 341, 30533, 50480], "temperature": 0.0, "avg_logprob": -0.1417563572402828, "compression_ratio": 1.6, "no_speech_prob": 0.00807288009673357}, {"id": 908, "seek": 11334, "start": 3655.66, "end": 3657.34, "text": " and you could make an interesting argument", "tokens": [50480, 293, 291, 727, 652, 364, 1880, 6770, 50564], "temperature": 0.0, "avg_logprob": -0.1417563572402828, "compression_ratio": 1.6, "no_speech_prob": 0.00807288009673357}, {"id": 909, "seek": 11334, "start": 3657.34, "end": 3658.34, "text": " in many different directions.", "tokens": [50564, 294, 867, 819, 11095, 13, 50614], "temperature": 0.0, "avg_logprob": -0.1417563572402828, "compression_ratio": 1.6, "no_speech_prob": 0.00807288009673357}, {"id": 910, "seek": 11334, "start": 3658.34, "end": 3660.02, "text": " Moving back to the US,", "tokens": [50614, 14242, 646, 281, 264, 2546, 11, 50698], "temperature": 0.0, "avg_logprob": -0.1417563572402828, "compression_ratio": 1.6, "no_speech_prob": 0.00807288009673357}, {"id": 911, "seek": 11334, "start": 3660.02, "end": 3664.22, "text": " we've got an update on where meta is at with their AI efforts.", "tokens": [50698, 321, 600, 658, 364, 5623, 322, 689, 19616, 307, 412, 365, 641, 7318, 6484, 13, 50908], "temperature": 0.0, "avg_logprob": -0.1417563572402828, "compression_ratio": 1.6, "no_speech_prob": 0.00807288009673357}, {"id": 912, "seek": 11334, "start": 3664.22, "end": 3667.42, "text": " The update is that they are delaying the rollout", "tokens": [50908, 440, 5623, 307, 300, 436, 366, 8577, 278, 264, 3373, 346, 51068], "temperature": 0.0, "avg_logprob": -0.1417563572402828, "compression_ratio": 1.6, "no_speech_prob": 0.00807288009673357}, {"id": 913, "seek": 11334, "start": 3667.42, "end": 3670.66, "text": " of their next model, codename avocado,", "tokens": [51068, 295, 641, 958, 2316, 11, 17656, 268, 529, 27041, 11, 51230], "temperature": 0.0, "avg_logprob": -0.1417563572402828, "compression_ratio": 1.6, "no_speech_prob": 0.00807288009673357}, {"id": 914, "seek": 11334, "start": 3670.66, "end": 3672.62, "text": " from March to at least May.", "tokens": [51230, 490, 6129, 281, 412, 1935, 1891, 13, 51328], "temperature": 0.0, "avg_logprob": -0.1417563572402828, "compression_ratio": 1.6, "no_speech_prob": 0.00807288009673357}, {"id": 915, "seek": 11334, "start": 3672.62, "end": 3674.18, "text": " I think this was actually announced last week,", "tokens": [51328, 286, 519, 341, 390, 767, 7548, 1036, 1243, 11, 51406], "temperature": 0.0, "avg_logprob": -0.1417563572402828, "compression_ratio": 1.6, "no_speech_prob": 0.00807288009673357}, {"id": 916, "seek": 11334, "start": 3674.18, "end": 3676.66, "text": " but we didn't manage to cover it.", "tokens": [51406, 457, 321, 994, 380, 3067, 281, 2060, 309, 13, 51530], "temperature": 0.0, "avg_logprob": -0.1417563572402828, "compression_ratio": 1.6, "no_speech_prob": 0.00807288009673357}, {"id": 917, "seek": 11334, "start": 3676.66, "end": 3681.5, "text": " So they appear to have just not been able", "tokens": [51530, 407, 436, 4204, 281, 362, 445, 406, 668, 1075, 51772], "temperature": 0.0, "avg_logprob": -0.1417563572402828, "compression_ratio": 1.6, "no_speech_prob": 0.00807288009673357}, {"id": 918, "seek": 14150, "start": 3681.5, "end": 3684.5, "text": " to train a good enough model in time.", "tokens": [50364, 281, 3847, 257, 665, 1547, 2316, 294, 565, 13, 50514], "temperature": 0.0, "avg_logprob": -0.14789869410268378, "compression_ratio": 1.6046511627906976, "no_speech_prob": 0.008794163353741169}, {"id": 919, "seek": 14150, "start": 3684.5, "end": 3688.38, "text": " And so they are delaying this release.", "tokens": [50514, 400, 370, 436, 366, 8577, 278, 341, 4374, 13, 50708], "temperature": 0.0, "avg_logprob": -0.14789869410268378, "compression_ratio": 1.6046511627906976, "no_speech_prob": 0.008794163353741169}, {"id": 920, "seek": 14150, "start": 3688.38, "end": 3690.66, "text": " They are discussing or have discussed", "tokens": [50708, 814, 366, 10850, 420, 362, 7152, 50822], "temperature": 0.0, "avg_logprob": -0.14789869410268378, "compression_ratio": 1.6046511627906976, "no_speech_prob": 0.008794163353741169}, {"id": 921, "seek": 14150, "start": 3690.66, "end": 3694.9, "text": " at least temporarily licensing competitors AI technology,", "tokens": [50822, 412, 1935, 23750, 29759, 18333, 7318, 2899, 11, 51034], "temperature": 0.0, "avg_logprob": -0.14789869410268378, "compression_ratio": 1.6046511627906976, "no_speech_prob": 0.008794163353741169}, {"id": 922, "seek": 14150, "start": 3694.9, "end": 3696.98, "text": " report of the Google's Gemini.", "tokens": [51034, 2275, 295, 264, 3329, 311, 22894, 3812, 13, 51138], "temperature": 0.0, "avg_logprob": -0.14789869410268378, "compression_ratio": 1.6046511627906976, "no_speech_prob": 0.008794163353741169}, {"id": 923, "seek": 14150, "start": 3696.98, "end": 3701.5, "text": " They already are looking at various extensions.", "tokens": [51138, 814, 1217, 366, 1237, 412, 3683, 25129, 13, 51364], "temperature": 0.0, "avg_logprob": -0.14789869410268378, "compression_ratio": 1.6046511627906976, "no_speech_prob": 0.008794163353741169}, {"id": 924, "seek": 14150, "start": 3701.5, "end": 3706.06, "text": " So overall, not looking great for meta is my read on this.", "tokens": [51364, 407, 4787, 11, 406, 1237, 869, 337, 19616, 307, 452, 1401, 322, 341, 13, 51592], "temperature": 0.0, "avg_logprob": -0.14789869410268378, "compression_ratio": 1.6046511627906976, "no_speech_prob": 0.008794163353741169}, {"id": 925, "seek": 14150, "start": 3706.06, "end": 3708.78, "text": " It's already been close to a year.", "tokens": [51592, 467, 311, 1217, 668, 1998, 281, 257, 1064, 13, 51728], "temperature": 0.0, "avg_logprob": -0.14789869410268378, "compression_ratio": 1.6046511627906976, "no_speech_prob": 0.008794163353741169}, {"id": 926, "seek": 16878, "start": 3708.78, "end": 3712.9, "text": " It feels like in mid 2025, after a couple of months,", "tokens": [50364, 467, 3417, 411, 294, 2062, 39209, 11, 934, 257, 1916, 295, 2493, 11, 50570], "temperature": 0.0, "avg_logprob": -0.3152020735648072, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.07566747069358826}, {"id": 927, "seek": 16878, "start": 3712.9, "end": 3716.9, "text": " after a long before we saw this heavy, heavy, heavy push", "tokens": [50570, 934, 257, 938, 949, 321, 1866, 341, 4676, 11, 4676, 11, 4676, 2944, 50770], "temperature": 0.0, "avg_logprob": -0.3152020735648072, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.07566747069358826}, {"id": 928, "seek": 16878, "start": 3716.9, "end": 3720.22, "text": " where they got Alexander Reng from Scala AI", "tokens": [50770, 689, 436, 658, 14845, 497, 1501, 490, 2747, 5159, 7318, 50936], "temperature": 0.0, "avg_logprob": -0.3152020735648072, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.07566747069358826}, {"id": 929, "seek": 16878, "start": 3720.22, "end": 3722.5, "text": " and a $14 billion deal.", "tokens": [50936, 293, 257, 1848, 7271, 5218, 2028, 13, 51050], "temperature": 0.0, "avg_logprob": -0.3152020735648072, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.07566747069358826}, {"id": 930, "seek": 16878, "start": 3722.5, "end": 3726.34, "text": " And then they paid absurd money to get all of these researchers", "tokens": [51050, 400, 550, 436, 4835, 19774, 1460, 281, 483, 439, 295, 613, 10309, 51242], "temperature": 0.0, "avg_logprob": -0.3152020735648072, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.07566747069358826}, {"id": 931, "seek": 16878, "start": 3726.34, "end": 3729.9, "text": " from Unphropic and OPAI and DeepMind and so on.", "tokens": [51242, 490, 1156, 950, 39173, 293, 422, 10297, 40, 293, 14895, 44, 471, 293, 370, 322, 13, 51420], "temperature": 0.0, "avg_logprob": -0.3152020735648072, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.07566747069358826}, {"id": 932, "seek": 16878, "start": 3729.9, "end": 3732.66, "text": " And they created super intelligence labs", "tokens": [51420, 400, 436, 2942, 1687, 7599, 20339, 51558], "temperature": 0.0, "avg_logprob": -0.3152020735648072, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.07566747069358826}, {"id": 933, "seek": 16878, "start": 3732.66, "end": 3735.14, "text": " with a company very reshuffled everything", "tokens": [51558, 365, 257, 2237, 588, 725, 71, 33974, 1203, 51682], "temperature": 0.0, "avg_logprob": -0.3152020735648072, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.07566747069358826}, {"id": 934, "seek": 19514, "start": 3735.14, "end": 3741.02, "text": " with the intent to train a competitive frontier model", "tokens": [50364, 365, 264, 8446, 281, 3847, 257, 10043, 35853, 2316, 50658], "temperature": 0.0, "avg_logprob": -0.24064079414193446, "compression_ratio": 1.5061728395061729, "no_speech_prob": 0.0037431453820317984}, {"id": 935, "seek": 19514, "start": 3741.02, "end": 3745.66, "text": " that can stand up to GPU 5.4 and cloud and all these other ones.", "tokens": [50658, 300, 393, 1463, 493, 281, 18407, 1025, 13, 19, 293, 4588, 293, 439, 613, 661, 2306, 13, 50890], "temperature": 0.0, "avg_logprob": -0.24064079414193446, "compression_ratio": 1.5061728395061729, "no_speech_prob": 0.0037431453820317984}, {"id": 936, "seek": 19514, "start": 3745.66, "end": 3747.7, "text": " It doesn't seem like they're there.", "tokens": [50890, 467, 1177, 380, 1643, 411, 436, 434, 456, 13, 50992], "temperature": 0.0, "avg_logprob": -0.24064079414193446, "compression_ratio": 1.5061728395061729, "no_speech_prob": 0.0037431453820317984}, {"id": 937, "seek": 19514, "start": 3747.7, "end": 3750.38, "text": " And I personally, this is my read.", "tokens": [50992, 400, 286, 5665, 11, 341, 307, 452, 1401, 13, 51126], "temperature": 0.0, "avg_logprob": -0.24064079414193446, "compression_ratio": 1.5061728395061729, "no_speech_prob": 0.0037431453820317984}, {"id": 938, "seek": 19514, "start": 3750.38, "end": 3755.42, "text": " I think organizationally having Alexander Reng lead it", "tokens": [51126, 286, 519, 4475, 379, 1419, 14845, 497, 1501, 1477, 309, 51378], "temperature": 0.0, "avg_logprob": -0.24064079414193446, "compression_ratio": 1.5061728395061729, "no_speech_prob": 0.0037431453820317984}, {"id": 939, "seek": 19514, "start": 3755.42, "end": 3757.82, "text": " is maybe not the best idea.", "tokens": [51378, 307, 1310, 406, 264, 1151, 1558, 13, 51498], "temperature": 0.0, "avg_logprob": -0.24064079414193446, "compression_ratio": 1.5061728395061729, "no_speech_prob": 0.0037431453820317984}, {"id": 940, "seek": 19514, "start": 3757.82, "end": 3760.94, "text": " Someone who hasn't worked in Frontier AI models,", "tokens": [51498, 8734, 567, 6132, 380, 2732, 294, 17348, 811, 7318, 5245, 11, 51654], "temperature": 0.0, "avg_logprob": -0.24064079414193446, "compression_ratio": 1.5061728395061729, "no_speech_prob": 0.0037431453820317984}, {"id": 941, "seek": 19514, "start": 3760.94, "end": 3764.14, "text": " data curation of Scala AI, you know, that's-", "tokens": [51654, 1412, 1262, 399, 295, 2747, 5159, 7318, 11, 291, 458, 11, 300, 311, 12, 51814], "temperature": 0.0, "avg_logprob": -0.24064079414193446, "compression_ratio": 1.5061728395061729, "no_speech_prob": 0.0037431453820317984}, {"id": 942, "seek": 22414, "start": 3764.14, "end": 3765.66, "text": " It's unrelated, right?", "tokens": [50364, 467, 311, 38967, 11, 558, 30, 50440], "temperature": 0.0, "avg_logprob": -0.24424235883259005, "compression_ratio": 1.6178861788617886, "no_speech_prob": 0.0036583866458386183}, {"id": 943, "seek": 22414, "start": 3765.66, "end": 3770.18, "text": " It's related, but I don't know if that's enough.", "tokens": [50440, 467, 311, 4077, 11, 457, 286, 500, 380, 458, 498, 300, 311, 1547, 13, 50666], "temperature": 0.0, "avg_logprob": -0.24424235883259005, "compression_ratio": 1.6178861788617886, "no_speech_prob": 0.0036583866458386183}, {"id": 944, "seek": 22414, "start": 3770.18, "end": 3773.46, "text": " Anyway, so that's when it's been delayed.", "tokens": [50666, 5684, 11, 370, 300, 311, 562, 309, 311, 668, 20268, 13, 50830], "temperature": 0.0, "avg_logprob": -0.24424235883259005, "compression_ratio": 1.6178861788617886, "no_speech_prob": 0.0036583866458386183}, {"id": 945, "seek": 22414, "start": 3773.46, "end": 3776.02, "text": " It doesn't appear like they're on track,", "tokens": [50830, 467, 1177, 380, 4204, 411, 436, 434, 322, 2837, 11, 50958], "temperature": 0.0, "avg_logprob": -0.24424235883259005, "compression_ratio": 1.6178861788617886, "no_speech_prob": 0.0036583866458386183}, {"id": 946, "seek": 22414, "start": 3776.02, "end": 3778.1, "text": " but you don't know too much.", "tokens": [50958, 457, 291, 500, 380, 458, 886, 709, 13, 51062], "temperature": 0.0, "avg_logprob": -0.24424235883259005, "compression_ratio": 1.6178861788617886, "no_speech_prob": 0.0036583866458386183}, {"id": 947, "seek": 22414, "start": 3778.1, "end": 3779.06, "text": " This is all internal.", "tokens": [51062, 639, 307, 439, 6920, 13, 51110], "temperature": 0.0, "avg_logprob": -0.24424235883259005, "compression_ratio": 1.6178861788617886, "no_speech_prob": 0.0036583866458386183}, {"id": 948, "seek": 22414, "start": 3779.06, "end": 3783.18, "text": " Maybe they're just doubling down and it's going to be great.", "tokens": [51110, 2704, 436, 434, 445, 33651, 760, 293, 309, 311, 516, 281, 312, 869, 13, 51316], "temperature": 0.0, "avg_logprob": -0.24424235883259005, "compression_ratio": 1.6178861788617886, "no_speech_prob": 0.0036583866458386183}, {"id": 949, "seek": 22414, "start": 3783.18, "end": 3783.94, "text": " Yeah, boy, yeah, right.", "tokens": [51316, 865, 11, 3237, 11, 1338, 11, 558, 13, 51354], "temperature": 0.0, "avg_logprob": -0.24424235883259005, "compression_ratio": 1.6178861788617886, "no_speech_prob": 0.0036583866458386183}, {"id": 950, "seek": 22414, "start": 3783.94, "end": 3786.3, "text": " And it looks like it, so reportedly like", "tokens": [51354, 400, 309, 1542, 411, 309, 11, 370, 23989, 411, 51472], "temperature": 0.0, "avg_logprob": -0.24424235883259005, "compression_ratio": 1.6178861788617886, "no_speech_prob": 0.0036583866458386183}, {"id": 951, "seek": 22414, "start": 3786.3, "end": 3789.82, "text": " landing somewhere in capabilities between Gemini 2.5", "tokens": [51472, 11202, 4079, 294, 10862, 1296, 22894, 3812, 568, 13, 20, 51648], "temperature": 0.0, "avg_logprob": -0.24424235883259005, "compression_ratio": 1.6178861788617886, "no_speech_prob": 0.0036583866458386183}, {"id": 952, "seek": 22414, "start": 3789.82, "end": 3790.9, "text": " and Gemini 3.", "tokens": [51648, 293, 22894, 3812, 805, 13, 51702], "temperature": 0.0, "avg_logprob": -0.24424235883259005, "compression_ratio": 1.6178861788617886, "no_speech_prob": 0.0036583866458386183}, {"id": 953, "seek": 25090, "start": 3790.9, "end": 3793.94, "text": " So, you know, very passe at this point,", "tokens": [50364, 407, 11, 291, 458, 11, 588, 14530, 412, 341, 935, 11, 50516], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 954, "seek": 25090, "start": 3793.94, "end": 3795.62, "text": " hence this decision to delay.", "tokens": [50516, 16678, 341, 3537, 281, 8577, 13, 50600], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 955, "seek": 25090, "start": 3795.62, "end": 3798.98, "text": " They're angling for a private non-open source model here, right?", "tokens": [50600, 814, 434, 2562, 1688, 337, 257, 4551, 2107, 12, 15752, 4009, 2316, 510, 11, 558, 30, 50768], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 956, "seek": 25090, "start": 3798.98, "end": 3800.82, "text": " So the appropriate point of comparison", "tokens": [50768, 407, 264, 6854, 935, 295, 9660, 50860], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 957, "seek": 25090, "start": 3800.82, "end": 3802.82, "text": " is what does cloud look like?", "tokens": [50860, 307, 437, 775, 4588, 574, 411, 30, 50960], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 958, "seek": 25090, "start": 3802.82, "end": 3804.5, "text": " You know, what is the best version of cloud look like?", "tokens": [50960, 509, 458, 11, 437, 307, 264, 1151, 3037, 295, 4588, 574, 411, 30, 51044], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 959, "seek": 25090, "start": 3804.5, "end": 3807.5, "text": " It seems like, as you say, it's just not up to it,", "tokens": [51044, 467, 2544, 411, 11, 382, 291, 584, 11, 309, 311, 445, 406, 493, 281, 309, 11, 51194], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 960, "seek": 25090, "start": 3807.5, "end": 3808.42, "text": " at least for right now.", "tokens": [51194, 412, 1935, 337, 558, 586, 13, 51240], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 961, "seek": 25090, "start": 3808.42, "end": 3810.18, "text": " And that could change quickly, who knows?", "tokens": [51240, 400, 300, 727, 1319, 2661, 11, 567, 3255, 30, 51328], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 962, "seek": 25090, "start": 3810.18, "end": 3813.42, "text": " But yeah, it's also part of this whole saga", "tokens": [51328, 583, 1338, 11, 309, 311, 611, 644, 295, 341, 1379, 34250, 51490], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 963, "seek": 25090, "start": 3813.42, "end": 3816.78, "text": " where you had Yan Likun say, you know,", "tokens": [51490, 689, 291, 632, 13633, 441, 1035, 409, 584, 11, 291, 458, 11, 51658], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 964, "seek": 25090, "start": 3816.78, "end": 3819.1, "text": " fuck Alex Wang when he was leaving.", "tokens": [51658, 3275, 5202, 14499, 562, 415, 390, 5012, 13, 51774], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 965, "seek": 25090, "start": 3819.1, "end": 3820.26, "text": " And then everybody was, there's just like,", "tokens": [51774, 400, 550, 2201, 390, 11, 456, 311, 445, 411, 11, 51832], "temperature": 0.0, "avg_logprob": -0.20038779614106664, "compression_ratio": 1.678125, "no_speech_prob": 0.013125121593475342}, {"id": 966, "seek": 28026, "start": 3820.26, "end": 3821.5, "text": " I forget, not times of India,", "tokens": [50364, 286, 2870, 11, 406, 1413, 295, 5282, 11, 50426], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 967, "seek": 28026, "start": 3821.5, "end": 3823.34, "text": " but there was some kind of India-based publication, right,", "tokens": [50426, 457, 456, 390, 512, 733, 295, 5282, 12, 6032, 19953, 11, 558, 11, 50518], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 968, "seek": 28026, "start": 3823.34, "end": 3826.9, "text": " that said, hey, looks like Zuck is thinking about Marjorie.", "tokens": [50518, 300, 848, 11, 4177, 11, 1542, 411, 1176, 1134, 307, 1953, 466, 2039, 2337, 414, 13, 50696], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 969, "seek": 28026, "start": 3826.9, "end": 3829.7, "text": " Which was like not verified.", "tokens": [50696, 3013, 390, 411, 406, 31197, 13, 50836], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 970, "seek": 28026, "start": 3829.7, "end": 3830.94, "text": " It was false.", "tokens": [50836, 467, 390, 7908, 13, 50898], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 971, "seek": 28026, "start": 3830.94, "end": 3832.1, "text": " Yeah, not false.", "tokens": [50898, 865, 11, 406, 7908, 13, 50956], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 972, "seek": 28026, "start": 3832.1, "end": 3832.9, "text": " But yeah, exactly.", "tokens": [50956, 583, 1338, 11, 2293, 13, 50996], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 973, "seek": 28026, "start": 3832.9, "end": 3835.62, "text": " And then Zuck came out with a photo with Alex Wang,", "tokens": [50996, 400, 550, 1176, 1134, 1361, 484, 365, 257, 5052, 365, 5202, 14499, 11, 51132], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 974, "seek": 28026, "start": 3835.62, "end": 3837.7799999999997, "text": " sort of all smiles in the office type thing, to kind of.", "tokens": [51132, 1333, 295, 439, 28083, 294, 264, 3398, 2010, 551, 11, 281, 733, 295, 13, 51240], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 975, "seek": 28026, "start": 3837.7799999999997, "end": 3840.82, "text": " Now, we do know, yeah, that aside from Yan Likun,", "tokens": [51240, 823, 11, 321, 360, 458, 11, 1338, 11, 300, 7359, 490, 13633, 441, 1035, 409, 11, 51392], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 976, "seek": 28026, "start": 3840.82, "end": 3845.82, "text": " Alexa Rang has also clashed with internal kind of big deal", "tokens": [51392, 22595, 497, 656, 575, 611, 596, 12219, 365, 6920, 733, 295, 955, 2028, 51642], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 977, "seek": 28026, "start": 3845.82, "end": 3847.66, "text": " product people like Chris Cox,", "tokens": [51642, 1674, 561, 411, 6688, 41576, 11, 51734], "temperature": 0.0, "avg_logprob": -0.35963342600855336, "compression_ratio": 1.6081081081081081, "no_speech_prob": 0.0014086629962548614}, {"id": 978, "seek": 30766, "start": 3847.66, "end": 3849.66, "text": " Meta's Chief Product Officer,", "tokens": [50364, 6377, 64, 311, 10068, 22005, 15434, 11, 50464], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 979, "seek": 30766, "start": 3849.66, "end": 3853.34, "text": " and Andrew Bosworth, their Chief Technology Officer.", "tokens": [50464, 293, 10110, 22264, 13136, 11, 641, 10068, 15037, 15434, 13, 50648], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 980, "seek": 30766, "start": 3853.34, "end": 3856.9, "text": " Apparently, about how AI models should improve,", "tokens": [50648, 16755, 11, 466, 577, 7318, 5245, 820, 3470, 11, 50826], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 981, "seek": 30766, "start": 3856.9, "end": 3858.5, "text": " Alexa Rang has been pushing for,", "tokens": [50826, 22595, 497, 656, 575, 668, 7380, 337, 11, 50906], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 982, "seek": 30766, "start": 3858.5, "end": 3860.54, "text": " let's just train a super, super good model", "tokens": [50906, 718, 311, 445, 3847, 257, 1687, 11, 1687, 665, 2316, 51008], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 983, "seek": 30766, "start": 3860.54, "end": 3862.9, "text": " and not care about the business implications", "tokens": [51008, 293, 406, 1127, 466, 264, 1606, 16602, 51126], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 984, "seek": 30766, "start": 3862.9, "end": 3864.58, "text": " and the product implications,", "tokens": [51126, 293, 264, 1674, 16602, 11, 51210], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 985, "seek": 30766, "start": 3864.58, "end": 3869.14, "text": " which other people at Meta understandably are questioning.", "tokens": [51210, 597, 661, 561, 412, 6377, 64, 1223, 1188, 366, 21257, 13, 51438], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 986, "seek": 30766, "start": 3869.14, "end": 3872.06, "text": " Yeah, and it's exactly, now Alex comes from this viewpoint", "tokens": [51438, 865, 11, 293, 309, 311, 2293, 11, 586, 5202, 1487, 490, 341, 35248, 51584], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 987, "seek": 30766, "start": 3872.06, "end": 3873.3, "text": " of, hey, let's do the opening.", "tokens": [51584, 295, 11, 4177, 11, 718, 311, 360, 264, 5193, 13, 51646], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 988, "seek": 30766, "start": 3873.3, "end": 3875.3, "text": " I think let's do the anthropic thing", "tokens": [51646, 286, 519, 718, 311, 360, 264, 22727, 299, 551, 51746], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 989, "seek": 30766, "start": 3875.3, "end": 3877.5, "text": " and go full on, you know, AGI and all that stuff.", "tokens": [51746, 293, 352, 1577, 322, 11, 291, 458, 11, 316, 26252, 293, 439, 300, 1507, 13, 51856], "temperature": 0.0, "avg_logprob": -0.23553818070462773, "compression_ratio": 1.6464968152866242, "no_speech_prob": 0.03465258702635765}, {"id": 990, "seek": 33750, "start": 3877.5, "end": 3879.74, "text": " That's very much, I mean, he is ASI-pilled.", "tokens": [50364, 663, 311, 588, 709, 11, 286, 914, 11, 415, 307, 7469, 40, 12, 79, 6261, 13, 50476], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 991, "seek": 33750, "start": 3879.74, "end": 3882.1, "text": " Well, the name of the super intelligence team kind of hints", "tokens": [50476, 1042, 11, 264, 1315, 295, 264, 1687, 7599, 1469, 733, 295, 27271, 50594], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 992, "seek": 33750, "start": 3882.1, "end": 3883.58, "text": " at that, and not only, I mean,", "tokens": [50594, 412, 300, 11, 293, 406, 787, 11, 286, 914, 11, 50668], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 993, "seek": 33750, "start": 3883.58, "end": 3885.38, "text": " if you're going to make a super intelligence team,", "tokens": [50668, 498, 291, 434, 516, 281, 652, 257, 1687, 7599, 1469, 11, 50758], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 994, "seek": 33750, "start": 3885.38, "end": 3887.34, "text": " yeah, they probably should be focused on super intelligence.", "tokens": [50758, 1338, 11, 436, 1391, 820, 312, 5178, 322, 1687, 7599, 13, 50856], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 995, "seek": 33750, "start": 3887.34, "end": 3889.14, "text": " Like, I'm no branding expert.", "tokens": [50856, 1743, 11, 286, 478, 572, 27279, 5844, 13, 50946], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 996, "seek": 33750, "start": 3889.14, "end": 3892.7799999999997, "text": " I'm not marketing dude, but yeah, it seems like that's par.", "tokens": [50946, 286, 478, 406, 6370, 6449, 11, 457, 1338, 11, 309, 2544, 411, 300, 311, 971, 13, 51128], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 997, "seek": 33750, "start": 3892.7799999999997, "end": 3895.42, "text": " Now, should Meta have a super intelligence team", "tokens": [51128, 823, 11, 820, 6377, 64, 362, 257, 1687, 7599, 1469, 51260], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 998, "seek": 33750, "start": 3895.42, "end": 3896.42, "text": " separate questions, right?", "tokens": [51260, 4994, 1651, 11, 558, 30, 51310], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 999, "seek": 33750, "start": 3896.42, "end": 3898.3, "text": " That's a good question to ask, I don't know.", "tokens": [51310, 663, 311, 257, 665, 1168, 281, 1029, 11, 286, 500, 380, 458, 13, 51404], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 1000, "seek": 33750, "start": 3898.3, "end": 3899.14, "text": " That's right.", "tokens": [51404, 663, 311, 558, 13, 51446], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 1001, "seek": 33750, "start": 3899.14, "end": 3900.3, "text": " And if, you know, certainly,", "tokens": [51446, 400, 498, 11, 291, 458, 11, 3297, 11, 51504], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 1002, "seek": 33750, "start": 3900.3, "end": 3902.18, "text": " well, if you believe that super intelligence", "tokens": [51504, 731, 11, 498, 291, 1697, 300, 1687, 7599, 51598], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 1003, "seek": 33750, "start": 3902.18, "end": 3904.34, "text": " is going to be the thing that Alex Wang believes it is", "tokens": [51598, 307, 516, 281, 312, 264, 551, 300, 5202, 14499, 12307, 309, 307, 51706], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 1004, "seek": 33750, "start": 3904.34, "end": 3906.02, "text": " and that maybe Zuck believes it is,", "tokens": [51706, 293, 300, 1310, 1176, 1134, 12307, 309, 307, 11, 51790], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 1005, "seek": 33750, "start": 3906.02, "end": 3906.86, "text": " then you have no choice.", "tokens": [51790, 550, 291, 362, 572, 3922, 13, 51832], "temperature": 0.0, "avg_logprob": -0.1848567557458433, "compression_ratio": 1.9020172910662825, "no_speech_prob": 0.00160885916557163}, {"id": 1006, "seek": 36686, "start": 3906.9, "end": 3908.38, "text": " You have to compete because somebody,", "tokens": [50366, 509, 362, 281, 11831, 570, 2618, 11, 50440], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1007, "seek": 36686, "start": 3908.38, "end": 3910.58, "text": " if somebody else does it, you're finished,", "tokens": [50440, 498, 2618, 1646, 775, 309, 11, 291, 434, 4335, 11, 50550], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1008, "seek": 36686, "start": 3910.58, "end": 3912.7, "text": " depending on how the intelligence explosion goes.", "tokens": [50550, 5413, 322, 577, 264, 7599, 15673, 1709, 13, 50656], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1009, "seek": 36686, "start": 3912.7, "end": 3915.18, "text": " Bottom line is a lot of fog of war,", "tokens": [50656, 38289, 1622, 307, 257, 688, 295, 13648, 295, 1516, 11, 50780], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1010, "seek": 36686, "start": 3915.18, "end": 3917.78, "text": " at least to me, still on what's going on inside Meta,", "tokens": [50780, 412, 1935, 281, 385, 11, 920, 322, 437, 311, 516, 322, 1854, 6377, 64, 11, 50910], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1011, "seek": 36686, "start": 3917.78, "end": 3920.74, "text": " the one data point we have is that we still don't have anything", "tokens": [50910, 264, 472, 1412, 935, 321, 362, 307, 300, 321, 920, 500, 380, 362, 1340, 51058], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1012, "seek": 36686, "start": 3920.74, "end": 3922.98, "text": " and that it's starting to become a real data point", "tokens": [51058, 293, 300, 309, 311, 2891, 281, 1813, 257, 957, 1412, 935, 51170], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1013, "seek": 36686, "start": 3922.98, "end": 3923.82, "text": " at this point.", "tokens": [51170, 412, 341, 935, 13, 51212], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1014, "seek": 36686, "start": 3923.82, "end": 3926.26, "text": " It's starting to come like definitely expected", "tokens": [51212, 467, 311, 2891, 281, 808, 411, 2138, 5176, 51334], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1015, "seek": 36686, "start": 3926.26, "end": 3930.38, "text": " a model release by now, given the amount of infrastructure", "tokens": [51334, 257, 2316, 4374, 538, 586, 11, 2212, 264, 2372, 295, 6896, 51540], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1016, "seek": 36686, "start": 3930.38, "end": 3932.74, "text": " and investment they've put in here.", "tokens": [51540, 293, 6078, 436, 600, 829, 294, 510, 13, 51658], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1017, "seek": 36686, "start": 3932.74, "end": 3936.78, "text": " And given that they have trained models in the past,", "tokens": [51658, 400, 2212, 300, 436, 362, 8895, 5245, 294, 264, 1791, 11, 51860], "temperature": 0.0, "avg_logprob": -0.1780192039229653, "compression_ratio": 1.7467948717948718, "no_speech_prob": 0.0032623878214508295}, {"id": 1018, "seek": 39678, "start": 3936.78, "end": 3939.54, "text": " number four was overwhelming,", "tokens": [50364, 1230, 1451, 390, 13373, 11, 50502], "temperature": 0.0, "avg_logprob": -0.18274155294193942, "compression_ratio": 1.4796380090497738, "no_speech_prob": 0.0008895110222510993}, {"id": 1019, "seek": 39678, "start": 3939.54, "end": 3942.18, "text": " but it was a pretty impressive model", "tokens": [50502, 457, 309, 390, 257, 1238, 8992, 2316, 50634], "temperature": 0.0, "avg_logprob": -0.18274155294193942, "compression_ratio": 1.4796380090497738, "no_speech_prob": 0.0008895110222510993}, {"id": 1020, "seek": 39678, "start": 3942.18, "end": 3945.98, "text": " that most companies were not able to produce even still.", "tokens": [50634, 300, 881, 3431, 645, 406, 1075, 281, 5258, 754, 920, 13, 50824], "temperature": 0.0, "avg_logprob": -0.18274155294193942, "compression_ratio": 1.4796380090497738, "no_speech_prob": 0.0008895110222510993}, {"id": 1021, "seek": 39678, "start": 3945.98, "end": 3950.98, "text": " So the lack of anything is perhaps concerning.", "tokens": [50824, 407, 264, 5011, 295, 1340, 307, 4317, 18087, 13, 51074], "temperature": 0.0, "avg_logprob": -0.18274155294193942, "compression_ratio": 1.4796380090497738, "no_speech_prob": 0.0008895110222510993}, {"id": 1022, "seek": 39678, "start": 3951.38, "end": 3954.1, "text": " And one story a bit similar,", "tokens": [51094, 400, 472, 1657, 257, 857, 2531, 11, 51230], "temperature": 0.0, "avg_logprob": -0.18274155294193942, "compression_ratio": 1.4796380090497738, "no_speech_prob": 0.0008895110222510993}, {"id": 1023, "seek": 39678, "start": 3954.1, "end": 3956.94, "text": " the story is Microsoft shakes up AI division", "tokens": [51230, 264, 1657, 307, 8116, 37891, 493, 7318, 10044, 51372], "temperature": 0.0, "avg_logprob": -0.18274155294193942, "compression_ratio": 1.4796380090497738, "no_speech_prob": 0.0008895110222510993}, {"id": 1024, "seek": 39678, "start": 3956.94, "end": 3959.54, "text": " as co-pilot falls behind Google", "tokens": [51372, 382, 598, 12, 79, 31516, 8804, 2261, 3329, 51502], "temperature": 0.0, "avg_logprob": -0.18274155294193942, "compression_ratio": 1.4796380090497738, "no_speech_prob": 0.0008895110222510993}, {"id": 1025, "seek": 39678, "start": 3959.54, "end": 3964.38, "text": " and open AI, they're reorganizing the AI division.", "tokens": [51502, 293, 1269, 7318, 11, 436, 434, 41203, 3319, 264, 7318, 10044, 13, 51744], "temperature": 0.0, "avg_logprob": -0.18274155294193942, "compression_ratio": 1.4796380090497738, "no_speech_prob": 0.0008895110222510993}, {"id": 1026, "seek": 42438, "start": 3964.38, "end": 3967.62, "text": " Gustaf Assumon is going to be shifting focus exclusively", "tokens": [50364, 32337, 2792, 6281, 449, 266, 307, 516, 281, 312, 17573, 1879, 20638, 50526], "temperature": 0.0, "avg_logprob": -0.2629743843686347, "compression_ratio": 1.549800796812749, "no_speech_prob": 0.037471141666173935}, {"id": 1027, "seek": 42438, "start": 3967.62, "end": 3970.94, "text": " to developing the company's own frontier foundation", "tokens": [50526, 281, 6416, 264, 2237, 311, 1065, 35853, 7030, 50692], "temperature": 0.0, "avg_logprob": -0.2629743843686347, "compression_ratio": 1.549800796812749, "no_speech_prob": 0.037471141666173935}, {"id": 1028, "seek": 42438, "start": 3970.94, "end": 3971.74, "text": " language model.", "tokens": [50692, 2856, 2316, 13, 50732], "temperature": 0.0, "avg_logprob": -0.2629743843686347, "compression_ratio": 1.549800796812749, "no_speech_prob": 0.037471141666173935}, {"id": 1029, "seek": 42438, "start": 3971.74, "end": 3974.46, "text": " So effectively, the same kind of focus", "tokens": [50732, 407, 8659, 11, 264, 912, 733, 295, 1879, 50868], "temperature": 0.0, "avg_logprob": -0.2629743843686347, "compression_ratio": 1.549800796812749, "no_speech_prob": 0.037471141666173935}, {"id": 1030, "seek": 42438, "start": 3974.46, "end": 3977.1, "text": " that we just discussed with Meta,", "tokens": [50868, 300, 321, 445, 7152, 365, 6377, 64, 11, 51000], "temperature": 0.0, "avg_logprob": -0.2629743843686347, "compression_ratio": 1.549800796812749, "no_speech_prob": 0.037471141666173935}, {"id": 1031, "seek": 42438, "start": 3977.1, "end": 3981.9, "text": " Microsoft has worked on training their own alternative", "tokens": [51000, 8116, 575, 2732, 322, 3097, 641, 1065, 8535, 51240], "temperature": 0.0, "avg_logprob": -0.2629743843686347, "compression_ratio": 1.549800796812749, "no_speech_prob": 0.037471141666173935}, {"id": 1032, "seek": 42438, "start": 3981.9, "end": 3983.94, "text": " to chat GPT and so on.", "tokens": [51240, 281, 5081, 26039, 51, 293, 370, 322, 13, 51342], "temperature": 0.0, "avg_logprob": -0.2629743843686347, "compression_ratio": 1.549800796812749, "no_speech_prob": 0.037471141666173935}, {"id": 1033, "seek": 42438, "start": 3983.94, "end": 3985.62, "text": " They've trained various models.", "tokens": [51342, 814, 600, 8895, 3683, 5245, 13, 51426], "temperature": 0.0, "avg_logprob": -0.2629743843686347, "compression_ratio": 1.549800796812749, "no_speech_prob": 0.037471141666173935}, {"id": 1034, "seek": 42438, "start": 3985.62, "end": 3988.94, "text": " I believe the big one from them was FI,", "tokens": [51426, 286, 1697, 264, 955, 472, 490, 552, 390, 479, 40, 11, 51592], "temperature": 0.0, "avg_logprob": -0.2629743843686347, "compression_ratio": 1.549800796812749, "no_speech_prob": 0.037471141666173935}, {"id": 1035, "seek": 42438, "start": 3988.94, "end": 3992.62, "text": " which is small models that they released.", "tokens": [51592, 597, 307, 1359, 5245, 300, 436, 4736, 13, 51776], "temperature": 0.0, "avg_logprob": -0.2629743843686347, "compression_ratio": 1.549800796812749, "no_speech_prob": 0.037471141666173935}, {"id": 1036, "seek": 45262, "start": 3992.7, "end": 3995.46, "text": " They do have a lot of users.", "tokens": [50368, 814, 360, 362, 257, 688, 295, 5022, 13, 50506], "temperature": 0.0, "avg_logprob": -0.16809468206606412, "compression_ratio": 1.6231155778894473, "no_speech_prob": 0.0007153072510845959}, {"id": 1037, "seek": 45262, "start": 3995.46, "end": 3997.94, "text": " So apparently, Microsoft's co-pilot app", "tokens": [50506, 407, 7970, 11, 8116, 311, 598, 12, 79, 31516, 724, 50630], "temperature": 0.0, "avg_logprob": -0.16809468206606412, "compression_ratio": 1.6231155778894473, "no_speech_prob": 0.0007153072510845959}, {"id": 1038, "seek": 45262, "start": 3997.94, "end": 4002.2200000000003, "text": " has 150 million monthly active users,", "tokens": [50630, 575, 8451, 2459, 12878, 4967, 5022, 11, 50844], "temperature": 0.0, "avg_logprob": -0.16809468206606412, "compression_ratio": 1.6231155778894473, "no_speech_prob": 0.0007153072510845959}, {"id": 1039, "seek": 45262, "start": 4002.2200000000003, "end": 4006.62, "text": " but if they want to compete with Gemini or chat GPT,", "tokens": [50844, 457, 498, 436, 528, 281, 11831, 365, 22894, 3812, 420, 5081, 26039, 51, 11, 51064], "temperature": 0.0, "avg_logprob": -0.16809468206606412, "compression_ratio": 1.6231155778894473, "no_speech_prob": 0.0007153072510845959}, {"id": 1040, "seek": 45262, "start": 4006.62, "end": 4010.06, "text": " they're nowhere near, they're way, way behind Gemini", "tokens": [51064, 436, 434, 11159, 2651, 11, 436, 434, 636, 11, 636, 2261, 22894, 3812, 51236], "temperature": 0.0, "avg_logprob": -0.16809468206606412, "compression_ratio": 1.6231155778894473, "no_speech_prob": 0.0007153072510845959}, {"id": 1041, "seek": 45262, "start": 4010.06, "end": 4013.34, "text": " has now 750 million active users,", "tokens": [51236, 575, 586, 31682, 2459, 4967, 5022, 11, 51400], "temperature": 0.0, "avg_logprob": -0.16809468206606412, "compression_ratio": 1.6231155778894473, "no_speech_prob": 0.0007153072510845959}, {"id": 1042, "seek": 45262, "start": 4013.34, "end": 4016.7, "text": " chat GPT has roughly 900 million,", "tokens": [51400, 5081, 26039, 51, 575, 9810, 22016, 2459, 11, 51568], "temperature": 0.0, "avg_logprob": -0.16809468206606412, "compression_ratio": 1.6231155778894473, "no_speech_prob": 0.0007153072510845959}, {"id": 1043, "seek": 45262, "start": 4016.7, "end": 4019.9, "text": " sorry, weekly active users versus co-pilot", "tokens": [51568, 2597, 11, 12460, 4967, 5022, 5717, 598, 12, 79, 31516, 51728], "temperature": 0.0, "avg_logprob": -0.16809468206606412, "compression_ratio": 1.6231155778894473, "no_speech_prob": 0.0007153072510845959}, {"id": 1044, "seek": 47990, "start": 4019.9, "end": 4024.18, "text": " has 150 million monthly active users.", "tokens": [50364, 575, 8451, 2459, 12878, 4967, 5022, 13, 50578], "temperature": 0.0, "avg_logprob": -0.16053237542510032, "compression_ratio": 1.6239669421487604, "no_speech_prob": 0.005065714940428734}, {"id": 1045, "seek": 47990, "start": 4024.18, "end": 4027.18, "text": " So yeah, that's the news, they are restructuring,", "tokens": [50578, 407, 1338, 11, 300, 311, 264, 2583, 11, 436, 366, 1472, 1757, 1345, 11, 50728], "temperature": 0.0, "avg_logprob": -0.16053237542510032, "compression_ratio": 1.6239669421487604, "no_speech_prob": 0.005065714940428734}, {"id": 1046, "seek": 47990, "start": 4027.18, "end": 4031.2200000000003, "text": " they are recognizing what they're following behind.", "tokens": [50728, 436, 366, 18538, 437, 436, 434, 3480, 2261, 13, 50930], "temperature": 0.0, "avg_logprob": -0.16053237542510032, "compression_ratio": 1.6239669421487604, "no_speech_prob": 0.005065714940428734}, {"id": 1047, "seek": 47990, "start": 4031.2200000000003, "end": 4035.2200000000003, "text": " And if they want to have their own frontier model,", "tokens": [50930, 400, 498, 436, 528, 281, 362, 641, 1065, 35853, 2316, 11, 51130], "temperature": 0.0, "avg_logprob": -0.16053237542510032, "compression_ratio": 1.6239669421487604, "no_speech_prob": 0.005065714940428734}, {"id": 1048, "seek": 47990, "start": 4035.2200000000003, "end": 4036.5, "text": " they are not there.", "tokens": [51130, 436, 366, 406, 456, 13, 51194], "temperature": 0.0, "avg_logprob": -0.16053237542510032, "compression_ratio": 1.6239669421487604, "no_speech_prob": 0.005065714940428734}, {"id": 1049, "seek": 47990, "start": 4036.5, "end": 4039.38, "text": " This is all the more glaring given the insane distribution", "tokens": [51194, 639, 307, 439, 264, 544, 1563, 1921, 2212, 264, 10838, 7316, 51338], "temperature": 0.0, "avg_logprob": -0.16053237542510032, "compression_ratio": 1.6239669421487604, "no_speech_prob": 0.005065714940428734}, {"id": 1050, "seek": 47990, "start": 4039.38, "end": 4041.14, "text": " advantage that Microsoft has.", "tokens": [51338, 5002, 300, 8116, 575, 13, 51426], "temperature": 0.0, "avg_logprob": -0.16053237542510032, "compression_ratio": 1.6239669421487604, "no_speech_prob": 0.005065714940428734}, {"id": 1051, "seek": 47990, "start": 4041.14, "end": 4044.82, "text": " Distribution is supposed to win these kinds of battles.", "tokens": [51426, 9840, 30783, 307, 3442, 281, 1942, 613, 3685, 295, 14648, 13, 51610], "temperature": 0.0, "avg_logprob": -0.16053237542510032, "compression_ratio": 1.6239669421487604, "no_speech_prob": 0.005065714940428734}, {"id": 1052, "seek": 47990, "start": 4044.82, "end": 4046.46, "text": " There's a reason that Microsoft teams", "tokens": [51610, 821, 311, 257, 1778, 300, 8116, 5491, 51692], "temperature": 0.0, "avg_logprob": -0.16053237542510032, "compression_ratio": 1.6239669421487604, "no_speech_prob": 0.005065714940428734}, {"id": 1053, "seek": 50646, "start": 4046.46, "end": 4049.9, "text": " kick the shit out of slack, like within minutes of launching.", "tokens": [50364, 4437, 264, 4611, 484, 295, 29767, 11, 411, 1951, 2077, 295, 18354, 13, 50536], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1054, "seek": 50646, "start": 4049.9, "end": 4051.1800000000003, "text": " And it's because they had distribution.", "tokens": [50536, 400, 309, 311, 570, 436, 632, 7316, 13, 50600], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1055, "seek": 50646, "start": 4051.1800000000003, "end": 4053.2200000000003, "text": " Microsoft is in every system, right?", "tokens": [50600, 8116, 307, 294, 633, 1185, 11, 558, 30, 50702], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1056, "seek": 50646, "start": 4053.2200000000003, "end": 4054.66, "text": " And yet you have like admittedly,", "tokens": [50702, 400, 1939, 291, 362, 411, 14920, 356, 11, 50774], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1057, "seek": 50646, "start": 4054.66, "end": 4057.1, "text": " okay, so Google is probably a bad comparison", "tokens": [50774, 1392, 11, 370, 3329, 307, 1391, 257, 1578, 9660, 50896], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1058, "seek": 50646, "start": 4057.1, "end": 4059.62, "text": " because they have a similar, insane distribution advantage.", "tokens": [50896, 570, 436, 362, 257, 2531, 11, 10838, 7316, 5002, 13, 51022], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1059, "seek": 50646, "start": 4059.62, "end": 4061.66, "text": " But when you look at, you know, opening eyes models", "tokens": [51022, 583, 562, 291, 574, 412, 11, 291, 458, 11, 5193, 2575, 5245, 51124], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1060, "seek": 50646, "start": 4061.66, "end": 4063.42, "text": " and in anthropics models,", "tokens": [51124, 293, 294, 22727, 1167, 5245, 11, 51212], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1061, "seek": 50646, "start": 4063.42, "end": 4065.06, "text": " not being able to compete with those", "tokens": [51212, 406, 885, 1075, 281, 11831, 365, 729, 51294], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1062, "seek": 50646, "start": 4065.06, "end": 4067.42, "text": " is a really kind of bad indication here.", "tokens": [51294, 307, 257, 534, 733, 295, 1578, 18877, 510, 13, 51412], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1063, "seek": 50646, "start": 4067.42, "end": 4070.38, "text": " So yeah, I mean, no surprise, something like this", "tokens": [51412, 407, 1338, 11, 286, 914, 11, 572, 6365, 11, 746, 411, 341, 51560], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1064, "seek": 50646, "start": 4070.38, "end": 4072.54, "text": " comes with a requirement to shift.", "tokens": [51560, 1487, 365, 257, 11695, 281, 5513, 13, 51668], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1065, "seek": 50646, "start": 4072.54, "end": 4074.34, "text": " The structural dependence on opening eye", "tokens": [51668, 440, 15067, 31704, 322, 5193, 3313, 51758], "temperature": 0.0, "avg_logprob": -0.17749334965552482, "compression_ratio": 1.6587537091988132, "no_speech_prob": 0.01388704963028431}, {"id": 1066, "seek": 53434, "start": 4074.34, "end": 4075.54, "text": " has been a big issue.", "tokens": [50364, 575, 668, 257, 955, 2734, 13, 50424], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1067, "seek": 53434, "start": 4075.54, "end": 4077.94, "text": " You know, Microsoft has that 27% stake", "tokens": [50424, 509, 458, 11, 8116, 575, 300, 7634, 4, 10407, 50544], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1068, "seek": 53434, "start": 4077.94, "end": 4080.94, "text": " in at least the profit for profit unit,", "tokens": [50544, 294, 412, 1935, 264, 7475, 337, 7475, 4985, 11, 50694], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1069, "seek": 53434, "start": 4080.94, "end": 4083.62, "text": " not going to get into the dissection of opening eye", "tokens": [50694, 406, 516, 281, 483, 666, 264, 717, 11963, 295, 5193, 3313, 50828], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1070, "seek": 53434, "start": 4083.62, "end": 4084.98, "text": " in their corporate structure.", "tokens": [50828, 294, 641, 10896, 3877, 13, 50896], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1071, "seek": 53434, "start": 4084.98, "end": 4087.94, "text": " But again, but bottom line is that in a sense", "tokens": [50896, 583, 797, 11, 457, 2767, 1622, 307, 300, 294, 257, 2020, 51044], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1072, "seek": 53434, "start": 4087.94, "end": 4090.58, "text": " has been maybe a, I don't mean call it a security blanket.", "tokens": [51044, 575, 668, 1310, 257, 11, 286, 500, 380, 914, 818, 309, 257, 3825, 17907, 13, 51176], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1073, "seek": 53434, "start": 4090.58, "end": 4093.46, "text": " It's increasingly clear that they're competing directly", "tokens": [51176, 467, 311, 12980, 1850, 300, 436, 434, 15439, 3838, 51320], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1074, "seek": 53434, "start": 4093.46, "end": 4095.5, "text": " with opening eye for a bunch of enterprise customers.", "tokens": [51320, 365, 5193, 3313, 337, 257, 3840, 295, 14132, 4581, 13, 51422], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1075, "seek": 53434, "start": 4095.5, "end": 4097.98, "text": " So cannibalizing themselves really from the inside,", "tokens": [51422, 407, 12361, 34764, 3319, 2969, 534, 490, 264, 1854, 11, 51546], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1076, "seek": 53434, "start": 4097.98, "end": 4098.82, "text": " that's a problem.", "tokens": [51546, 300, 311, 257, 1154, 13, 51588], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1077, "seek": 53434, "start": 4098.82, "end": 4100.14, "text": " That's a really big problem.", "tokens": [51588, 663, 311, 257, 534, 955, 1154, 13, 51654], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1078, "seek": 53434, "start": 4100.14, "end": 4102.46, "text": " Microsoft has a lot more to lose in that sense.", "tokens": [51654, 8116, 575, 257, 688, 544, 281, 3624, 294, 300, 2020, 13, 51770], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1079, "seek": 53434, "start": 4102.46, "end": 4103.82, "text": " And yeah, so they got to find a way", "tokens": [51770, 400, 1338, 11, 370, 436, 658, 281, 915, 257, 636, 51838], "temperature": 0.0, "avg_logprob": -0.13857060208728042, "compression_ratio": 1.676300578034682, "no_speech_prob": 0.002465990372002125}, {"id": 1080, "seek": 56382, "start": 4103.98, "end": 4105.3, "text": " to turn this around right quick", "tokens": [50372, 281, 1261, 341, 926, 558, 1702, 50438], "temperature": 0.0, "avg_logprob": -0.21483220885285234, "compression_ratio": 1.5761316872427984, "no_speech_prob": 0.002957363147288561}, {"id": 1081, "seek": 56382, "start": 4105.3, "end": 4107.34, "text": " if they think AI's headed where it might be.", "tokens": [50438, 498, 436, 519, 7318, 311, 12798, 689, 309, 1062, 312, 13, 50540], "temperature": 0.0, "avg_logprob": -0.21483220885285234, "compression_ratio": 1.5761316872427984, "no_speech_prob": 0.002957363147288561}, {"id": 1082, "seek": 56382, "start": 4107.34, "end": 4108.78, "text": " Quick recap by the way.", "tokens": [50540, 12101, 20928, 538, 264, 636, 13, 50612], "temperature": 0.0, "avg_logprob": -0.21483220885285234, "compression_ratio": 1.5761316872427984, "no_speech_prob": 0.002957363147288561}, {"id": 1083, "seek": 56382, "start": 4108.78, "end": 4113.86, "text": " So Mustafa Suleiman, former co-founder of DeepMind,", "tokens": [50612, 407, 37229, 318, 2271, 25504, 11, 5819, 598, 12, 33348, 295, 14895, 44, 471, 11, 50866], "temperature": 0.0, "avg_logprob": -0.21483220885285234, "compression_ratio": 1.5761316872427984, "no_speech_prob": 0.002957363147288561}, {"id": 1084, "seek": 56382, "start": 4113.86, "end": 4116.82, "text": " joined Microsoft, I think last year,", "tokens": [50866, 6869, 8116, 11, 286, 519, 1036, 1064, 11, 51014], "temperature": 0.0, "avg_logprob": -0.21483220885285234, "compression_ratio": 1.5761316872427984, "no_speech_prob": 0.002957363147288561}, {"id": 1085, "seek": 56382, "start": 4116.82, "end": 4120.82, "text": " has been their CEO of Microsoft AI.", "tokens": [51014, 575, 668, 641, 9282, 295, 8116, 7318, 13, 51214], "temperature": 0.0, "avg_logprob": -0.21483220885285234, "compression_ratio": 1.5761316872427984, "no_speech_prob": 0.002957363147288561}, {"id": 1086, "seek": 56382, "start": 4120.82, "end": 4123.74, "text": " And this whole story is based at these partially", "tokens": [51214, 400, 341, 1379, 1657, 307, 2361, 412, 613, 18886, 51360], "temperature": 0.0, "avg_logprob": -0.21483220885285234, "compression_ratio": 1.5761316872427984, "no_speech_prob": 0.002957363147288561}, {"id": 1087, "seek": 56382, "start": 4123.74, "end": 4126.66, "text": " on MMO, he released a titled", "tokens": [51360, 322, 376, 18976, 11, 415, 4736, 257, 19841, 51506], "temperature": 0.0, "avg_logprob": -0.21483220885285234, "compression_ratio": 1.5761316872427984, "no_speech_prob": 0.002957363147288561}, {"id": 1088, "seek": 56382, "start": 4126.66, "end": 4129.5, "text": " a new structure for Microsoft AI.", "tokens": [51506, 257, 777, 3877, 337, 8116, 7318, 13, 51648], "temperature": 0.0, "avg_logprob": -0.21483220885285234, "compression_ratio": 1.5761316872427984, "no_speech_prob": 0.002957363147288561}, {"id": 1089, "seek": 56382, "start": 4129.5, "end": 4133.02, "text": " And the gist of it is, again, similar to meta", "tokens": [51648, 400, 264, 290, 468, 295, 309, 307, 11, 797, 11, 2531, 281, 19616, 51824], "temperature": 0.0, "avg_logprob": -0.21483220885285234, "compression_ratio": 1.5761316872427984, "no_speech_prob": 0.002957363147288561}, {"id": 1090, "seek": 59302, "start": 4133.02, "end": 4136.0599999999995, "text": " that he wants to pursue super intelligence,", "tokens": [50364, 300, 415, 2738, 281, 12392, 1687, 7599, 11, 50516], "temperature": 0.0, "avg_logprob": -0.1391127542788241, "compression_ratio": 1.70703125, "no_speech_prob": 0.0009968917584046721}, {"id": 1091, "seek": 59302, "start": 4136.0599999999995, "end": 4139.3, "text": " consumer things and product considerations", "tokens": [50516, 9711, 721, 293, 1674, 24070, 50678], "temperature": 0.0, "avg_logprob": -0.1391127542788241, "compression_ratio": 1.70703125, "no_speech_prob": 0.0009968917584046721}, {"id": 1092, "seek": 59302, "start": 4139.3, "end": 4140.74, "text": " get in a way of that.", "tokens": [50678, 483, 294, 257, 636, 295, 300, 13, 50750], "temperature": 0.0, "avg_logprob": -0.1391127542788241, "compression_ratio": 1.70703125, "no_speech_prob": 0.0009968917584046721}, {"id": 1093, "seek": 59302, "start": 4140.74, "end": 4144.14, "text": " So he's going to be freed up to focus on that.", "tokens": [50750, 407, 415, 311, 516, 281, 312, 21796, 493, 281, 1879, 322, 300, 13, 50920], "temperature": 0.0, "avg_logprob": -0.1391127542788241, "compression_ratio": 1.70703125, "no_speech_prob": 0.0009968917584046721}, {"id": 1094, "seek": 59302, "start": 4144.14, "end": 4147.9, "text": " Jacob Andru, former senior vice president at SNAP", "tokens": [50920, 14117, 400, 894, 11, 5819, 7965, 11964, 3868, 412, 13955, 4715, 51108], "temperature": 0.0, "avg_logprob": -0.1391127542788241, "compression_ratio": 1.70703125, "no_speech_prob": 0.0009968917584046721}, {"id": 1095, "seek": 59302, "start": 4147.9, "end": 4150.22, "text": " will take over as executive vice president", "tokens": [51108, 486, 747, 670, 382, 10140, 11964, 3868, 51224], "temperature": 0.0, "avg_logprob": -0.1391127542788241, "compression_ratio": 1.70703125, "no_speech_prob": 0.0009968917584046721}, {"id": 1096, "seek": 59302, "start": 4150.22, "end": 4152.38, "text": " leading the co-pilot division.", "tokens": [51224, 5775, 264, 598, 12, 79, 31516, 10044, 13, 51332], "temperature": 0.0, "avg_logprob": -0.1391127542788241, "compression_ratio": 1.70703125, "no_speech_prob": 0.0009968917584046721}, {"id": 1097, "seek": 59302, "start": 4152.38, "end": 4155.54, "text": " So there's a split here where there's co-pilot,", "tokens": [51332, 407, 456, 311, 257, 7472, 510, 689, 456, 311, 598, 12, 79, 31516, 11, 51490], "temperature": 0.0, "avg_logprob": -0.1391127542788241, "compression_ratio": 1.70703125, "no_speech_prob": 0.0009968917584046721}, {"id": 1098, "seek": 59302, "start": 4155.54, "end": 4158.02, "text": " the product, the app, et cetera.", "tokens": [51490, 264, 1674, 11, 264, 724, 11, 1030, 11458, 13, 51614], "temperature": 0.0, "avg_logprob": -0.1391127542788241, "compression_ratio": 1.70703125, "no_speech_prob": 0.0009968917584046721}, {"id": 1099, "seek": 59302, "start": 4158.02, "end": 4160.02, "text": " Someone else focuses on that.", "tokens": [51614, 8734, 1646, 16109, 322, 300, 13, 51714], "temperature": 0.0, "avg_logprob": -0.1391127542788241, "compression_ratio": 1.70703125, "no_speech_prob": 0.0009968917584046721}, {"id": 1100, "seek": 59302, "start": 4160.02, "end": 4162.98, "text": " Mustafa focuses on building the frontier model", "tokens": [51714, 37229, 16109, 322, 2390, 264, 35853, 2316, 51862], "temperature": 0.0, "avg_logprob": -0.1391127542788241, "compression_ratio": 1.70703125, "no_speech_prob": 0.0009968917584046721}, {"id": 1101, "seek": 62298, "start": 4162.98, "end": 4164.86, "text": " on getting through super intelligence.", "tokens": [50364, 322, 1242, 807, 1687, 7599, 13, 50458], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1102, "seek": 62298, "start": 4164.86, "end": 4167.26, "text": " They are much like how we just discussed", "tokens": [50458, 814, 366, 709, 411, 577, 321, 445, 7152, 50578], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1103, "seek": 62298, "start": 4167.26, "end": 4169.5, "text": " with Alexander Wang at meta.", "tokens": [50578, 365, 14845, 14499, 412, 19616, 13, 50690], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1104, "seek": 62298, "start": 4169.5, "end": 4170.42, "text": " Well, it's your point.", "tokens": [50690, 1042, 11, 309, 311, 428, 935, 13, 50736], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1105, "seek": 62298, "start": 4170.42, "end": 4172.5, "text": " We're seeing this same motif over and over, right?", "tokens": [50736, 492, 434, 2577, 341, 912, 39478, 670, 293, 670, 11, 558, 30, 50840], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1106, "seek": 62298, "start": 4172.5, "end": 4174.46, "text": " It's not just meta, it's not just Microsoft.", "tokens": [50840, 467, 311, 406, 445, 19616, 11, 309, 311, 406, 445, 8116, 13, 50938], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1107, "seek": 62298, "start": 4174.46, "end": 4177.1, "text": " It's opening AI, I mean, it's happening at Google,", "tokens": [50938, 467, 311, 5193, 7318, 11, 286, 914, 11, 309, 311, 2737, 412, 3329, 11, 51070], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1108, "seek": 62298, "start": 4177.1, "end": 4178.42, "text": " like basically the question is,", "tokens": [51070, 411, 1936, 264, 1168, 307, 11, 51136], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1109, "seek": 62298, "start": 4178.42, "end": 4180.9, "text": " how seriously do you take the potential", "tokens": [51136, 577, 6638, 360, 291, 747, 264, 3995, 51260], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1110, "seek": 62298, "start": 4180.9, "end": 4182.58, "text": " imminent arrival of super intelligence", "tokens": [51260, 44339, 18365, 295, 1687, 7599, 51344], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1111, "seek": 62298, "start": 4182.58, "end": 4184.26, "text": " if you take it very seriously?", "tokens": [51344, 498, 291, 747, 309, 588, 6638, 30, 51428], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1112, "seek": 62298, "start": 4184.26, "end": 4186.06, "text": " You got to drop everything, work on that shit, man.", "tokens": [51428, 509, 658, 281, 3270, 1203, 11, 589, 322, 300, 4611, 11, 587, 13, 51518], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1113, "seek": 62298, "start": 4186.06, "end": 4187.86, "text": " That's what Alex Wang is saying.", "tokens": [51518, 663, 311, 437, 5202, 14499, 307, 1566, 13, 51608], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1114, "seek": 62298, "start": 4187.86, "end": 4191.9, "text": " And clearly, Microsoft is not immune to that tension, right?", "tokens": [51608, 400, 4448, 11, 8116, 307, 406, 11992, 281, 300, 8980, 11, 558, 30, 51810], "temperature": 0.0, "avg_logprob": -0.1990052063937311, "compression_ratio": 1.730886850152905, "no_speech_prob": 0.002261023037135601}, {"id": 1115, "seek": 65190, "start": 4191.98, "end": 4196.0599999999995, "text": " How much do you view AI as like a commercial product question", "tokens": [50368, 1012, 709, 360, 291, 1910, 7318, 382, 411, 257, 6841, 1674, 1168, 50572], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1116, "seek": 65190, "start": 4196.0599999999995, "end": 4197.62, "text": " and how much do you view it as,", "tokens": [50572, 293, 577, 709, 360, 291, 1910, 309, 382, 11, 50650], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1117, "seek": 65190, "start": 4197.62, "end": 4199.3, "text": " this is the thing that's triggering the singularity,", "tokens": [50650, 341, 307, 264, 551, 300, 311, 40406, 264, 20010, 507, 11, 50734], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1118, "seek": 65190, "start": 4199.3, "end": 4201.34, "text": " holy shit, like the one place,", "tokens": [50734, 10622, 4611, 11, 411, 264, 472, 1081, 11, 50836], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1119, "seek": 65190, "start": 4201.34, "end": 4203.22, "text": " you don't tend to see that kind of tension,", "tokens": [50836, 291, 500, 380, 3928, 281, 536, 300, 733, 295, 8980, 11, 50930], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1120, "seek": 65190, "start": 4203.22, "end": 4205.86, "text": " or be careful, it exists.", "tokens": [50930, 420, 312, 5026, 11, 309, 8198, 13, 51062], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1121, "seek": 65190, "start": 4205.86, "end": 4208.5, "text": " But like, you know, like you see some of it,", "tokens": [51062, 583, 411, 11, 291, 458, 11, 411, 291, 536, 512, 295, 309, 11, 51194], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1122, "seek": 65190, "start": 4208.5, "end": 4209.62, "text": " I should say in the media,", "tokens": [51194, 286, 820, 584, 294, 264, 3021, 11, 51250], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1123, "seek": 65190, "start": 4209.62, "end": 4211.86, "text": " but you don't see as much of it as anthropic.", "tokens": [51250, 457, 291, 500, 380, 536, 382, 709, 295, 309, 382, 22727, 299, 13, 51362], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1124, "seek": 65190, "start": 4211.86, "end": 4213.66, "text": " Like it seems like generally anthropic", "tokens": [51362, 1743, 309, 2544, 411, 5101, 22727, 299, 51452], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1125, "seek": 65190, "start": 4213.66, "end": 4216.98, "text": " and its public communications is much more aligned", "tokens": [51452, 293, 1080, 1908, 15163, 307, 709, 544, 17962, 51618], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1126, "seek": 65190, "start": 4216.98, "end": 4219.18, "text": " around the idea of like where they're going.", "tokens": [51618, 926, 264, 1558, 295, 411, 689, 436, 434, 516, 13, 51728], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1127, "seek": 65190, "start": 4219.18, "end": 4221.1, "text": " So, you know, do it that way you will,", "tokens": [51728, 407, 11, 291, 458, 11, 360, 309, 300, 636, 291, 486, 11, 51824], "temperature": 0.0, "avg_logprob": -0.17913315349664444, "compression_ratio": 1.755700325732899, "no_speech_prob": 0.0034505166113376617}, {"id": 1128, "seek": 68110, "start": 4221.1, "end": 4223.1, "text": " but it definitely is notable", "tokens": [50364, 457, 309, 2138, 307, 22556, 50464], "temperature": 0.0, "avg_logprob": -0.15807751907366457, "compression_ratio": 1.6037735849056605, "no_speech_prob": 0.0005359530914574862}, {"id": 1129, "seek": 68110, "start": 4223.1, "end": 4224.1, "text": " that basically everybody else", "tokens": [50464, 300, 1936, 2201, 1646, 50514], "temperature": 0.0, "avg_logprob": -0.15807751907366457, "compression_ratio": 1.6037735849056605, "no_speech_prob": 0.0005359530914574862}, {"id": 1130, "seek": 68110, "start": 4224.1, "end": 4226.66, "text": " is having these large public spots about this.", "tokens": [50514, 307, 1419, 613, 2416, 1908, 10681, 466, 341, 13, 50642], "temperature": 0.0, "avg_logprob": -0.15807751907366457, "compression_ratio": 1.6037735849056605, "no_speech_prob": 0.0005359530914574862}, {"id": 1131, "seek": 68110, "start": 4226.66, "end": 4228.9400000000005, "text": " And moving on to policy and safety,", "tokens": [50642, 400, 2684, 322, 281, 3897, 293, 4514, 11, 50756], "temperature": 0.0, "avg_logprob": -0.15807751907366457, "compression_ratio": 1.6037735849056605, "no_speech_prob": 0.0005359530914574862}, {"id": 1132, "seek": 68110, "start": 4228.9400000000005, "end": 4232.7, "text": " where we have quite a bit of research related to safety.", "tokens": [50756, 689, 321, 362, 1596, 257, 857, 295, 2132, 4077, 281, 4514, 13, 50944], "temperature": 0.0, "avg_logprob": -0.15807751907366457, "compression_ratio": 1.6037735849056605, "no_speech_prob": 0.0005359530914574862}, {"id": 1133, "seek": 68110, "start": 4232.7, "end": 4236.38, "text": " The first one is decision theoretic formalization", "tokens": [50944, 440, 700, 472, 307, 3537, 14308, 299, 9860, 2144, 51128], "temperature": 0.0, "avg_logprob": -0.15807751907366457, "compression_ratio": 1.6037735849056605, "no_speech_prob": 0.0005359530914574862}, {"id": 1134, "seek": 68110, "start": 4236.38, "end": 4240.66, "text": " of stenography with application to LLM monitoring.", "tokens": [51128, 295, 28031, 5820, 365, 3861, 281, 441, 43, 44, 11028, 13, 51342], "temperature": 0.0, "avg_logprob": -0.15807751907366457, "compression_ratio": 1.6037735849056605, "no_speech_prob": 0.0005359530914574862}, {"id": 1135, "seek": 68110, "start": 4240.66, "end": 4245.34, "text": " So, stenography is a practice of concealing secret information", "tokens": [51342, 407, 11, 28031, 5820, 307, 257, 3124, 295, 10413, 4270, 4054, 1589, 51576], "temperature": 0.0, "avg_logprob": -0.15807751907366457, "compression_ratio": 1.6037735849056605, "no_speech_prob": 0.0005359530914574862}, {"id": 1136, "seek": 68110, "start": 4245.34, "end": 4248.02, "text": " within non-secret data.", "tokens": [51576, 1951, 2107, 12, 8159, 1505, 1412, 13, 51710], "temperature": 0.0, "avg_logprob": -0.15807751907366457, "compression_ratio": 1.6037735849056605, "no_speech_prob": 0.0005359530914574862}, {"id": 1137, "seek": 68110, "start": 4248.02, "end": 4249.82, "text": " We've seen this typically with images,", "tokens": [51710, 492, 600, 1612, 341, 5850, 365, 5267, 11, 51800], "temperature": 0.0, "avg_logprob": -0.15807751907366457, "compression_ratio": 1.6037735849056605, "no_speech_prob": 0.0005359530914574862}, {"id": 1138, "seek": 70982, "start": 4249.82, "end": 4252.98, "text": " but you can also do it with text.", "tokens": [50364, 457, 291, 393, 611, 360, 309, 365, 2487, 13, 50522], "temperature": 0.0, "avg_logprob": -0.20169728042350876, "compression_ratio": 1.6027397260273972, "no_speech_prob": 0.0002531026257202029}, {"id": 1139, "seek": 70982, "start": 4252.98, "end": 4255.74, "text": " And as far as LLM monitoring, right?", "tokens": [50522, 400, 382, 1400, 382, 441, 43, 44, 11028, 11, 558, 30, 50660], "temperature": 0.0, "avg_logprob": -0.20169728042350876, "compression_ratio": 1.6027397260273972, "no_speech_prob": 0.0002531026257202029}, {"id": 1140, "seek": 70982, "start": 4255.74, "end": 4260.66, "text": " If you have a malicious AI that you want to catch,", "tokens": [50660, 759, 291, 362, 257, 33496, 7318, 300, 291, 528, 281, 3745, 11, 50906], "temperature": 0.0, "avg_logprob": -0.20169728042350876, "compression_ratio": 1.6027397260273972, "no_speech_prob": 0.0002531026257202029}, {"id": 1141, "seek": 70982, "start": 4260.66, "end": 4262.42, "text": " if it's trying to do something bad,", "tokens": [50906, 498, 309, 311, 1382, 281, 360, 746, 1578, 11, 50994], "temperature": 0.0, "avg_logprob": -0.20169728042350876, "compression_ratio": 1.6027397260273972, "no_speech_prob": 0.0002531026257202029}, {"id": 1142, "seek": 70982, "start": 4262.42, "end": 4265.22, "text": " or kind of a monitor's communications", "tokens": [50994, 420, 733, 295, 257, 6002, 311, 15163, 51134], "temperature": 0.0, "avg_logprob": -0.20169728042350876, "compression_ratio": 1.6027397260273972, "no_speech_prob": 0.0002531026257202029}, {"id": 1143, "seek": 70982, "start": 4265.22, "end": 4267.26, "text": " to know what it's doing,", "tokens": [51134, 281, 458, 437, 309, 311, 884, 11, 51236], "temperature": 0.0, "avg_logprob": -0.20169728042350876, "compression_ratio": 1.6027397260273972, "no_speech_prob": 0.0002531026257202029}, {"id": 1144, "seek": 70982, "start": 4267.26, "end": 4269.9400000000005, "text": " potentially it could try to get around", "tokens": [51236, 7263, 309, 727, 853, 281, 483, 926, 51370], "temperature": 0.0, "avg_logprob": -0.20169728042350876, "compression_ratio": 1.6027397260273972, "no_speech_prob": 0.0002531026257202029}, {"id": 1145, "seek": 70982, "start": 4269.9400000000005, "end": 4273.1, "text": " that sort of monitoring with stenography.", "tokens": [51370, 300, 1333, 295, 11028, 365, 28031, 5820, 13, 51528], "temperature": 0.0, "avg_logprob": -0.20169728042350876, "compression_ratio": 1.6027397260273972, "no_speech_prob": 0.0002531026257202029}, {"id": 1146, "seek": 70982, "start": 4273.1, "end": 4274.54, "text": " And this is...", "tokens": [51528, 400, 341, 307, 485, 51600], "temperature": 0.0, "avg_logprob": -0.20169728042350876, "compression_ratio": 1.6027397260273972, "no_speech_prob": 0.0002531026257202029}, {"id": 1147, "seek": 70982, "start": 4274.54, "end": 4275.74, "text": " Stagnography.", "tokens": [51600, 745, 4535, 5820, 13, 51660], "temperature": 0.0, "avg_logprob": -0.20169728042350876, "compression_ratio": 1.6027397260273972, "no_speech_prob": 0.0002531026257202029}, {"id": 1148, "seek": 70982, "start": 4275.74, "end": 4277.46, "text": " Stagnography, sorry.", "tokens": [51660, 745, 4535, 5820, 11, 2597, 13, 51746], "temperature": 0.0, "avg_logprob": -0.20169728042350876, "compression_ratio": 1.6027397260273972, "no_speech_prob": 0.0002531026257202029}, {"id": 1149, "seek": 73746, "start": 4277.5, "end": 4281.58, "text": " And so, this is presenting a new framework", "tokens": [50366, 400, 370, 11, 341, 307, 15578, 257, 777, 8388, 50570], "temperature": 0.0, "avg_logprob": -0.20592220349792847, "compression_ratio": 1.5731225296442688, "no_speech_prob": 0.0002280492044519633}, {"id": 1150, "seek": 73746, "start": 4281.58, "end": 4285.9, "text": " of formalization of how to view this entire thing", "tokens": [50570, 295, 9860, 2144, 295, 577, 281, 1910, 341, 2302, 551, 50786], "temperature": 0.0, "avg_logprob": -0.20592220349792847, "compression_ratio": 1.5731225296442688, "no_speech_prob": 0.0002280492044519633}, {"id": 1151, "seek": 73746, "start": 4285.9, "end": 4289.98, "text": " and does also discuss the implications for LLM monitoring.", "tokens": [50786, 293, 775, 611, 2248, 264, 16602, 337, 441, 43, 44, 11028, 13, 50990], "temperature": 0.0, "avg_logprob": -0.20592220349792847, "compression_ratio": 1.5731225296442688, "no_speech_prob": 0.0002280492044519633}, {"id": 1152, "seek": 73746, "start": 4289.98, "end": 4294.18, "text": " And Jeremy, you can take and go, do your thing.", "tokens": [50990, 400, 17809, 11, 291, 393, 747, 293, 352, 11, 360, 428, 551, 13, 51200], "temperature": 0.0, "avg_logprob": -0.20592220349792847, "compression_ratio": 1.5731225296442688, "no_speech_prob": 0.0002280492044519633}, {"id": 1153, "seek": 73746, "start": 4294.18, "end": 4296.9400000000005, "text": " It's a good, do your thing.", "tokens": [51200, 467, 311, 257, 665, 11, 360, 428, 551, 13, 51338], "temperature": 0.0, "avg_logprob": -0.20592220349792847, "compression_ratio": 1.5731225296442688, "no_speech_prob": 0.0002280492044519633}, {"id": 1154, "seek": 73746, "start": 4296.9400000000005, "end": 4299.26, "text": " Yeah, I'll say stagnography four times fast.", "tokens": [51338, 865, 11, 286, 603, 584, 342, 4535, 5820, 1451, 1413, 2370, 13, 51454], "temperature": 0.0, "avg_logprob": -0.20592220349792847, "compression_ratio": 1.5731225296442688, "no_speech_prob": 0.0002280492044519633}, {"id": 1155, "seek": 73746, "start": 4299.26, "end": 4301.3, "text": " So, yeah, we've talked about stagnography", "tokens": [51454, 407, 11, 1338, 11, 321, 600, 2825, 466, 342, 4535, 5820, 51556], "temperature": 0.0, "avg_logprob": -0.20592220349792847, "compression_ratio": 1.5731225296442688, "no_speech_prob": 0.0002280492044519633}, {"id": 1156, "seek": 73746, "start": 4301.3, "end": 4303.22, "text": " on the podcast before quite a bit, right?", "tokens": [51556, 322, 264, 7367, 949, 1596, 257, 857, 11, 558, 30, 51652], "temperature": 0.0, "avg_logprob": -0.20592220349792847, "compression_ratio": 1.5731225296442688, "no_speech_prob": 0.0002280492044519633}, {"id": 1157, "seek": 73746, "start": 4303.22, "end": 4306.38, "text": " This idea that a model might be reasoning", "tokens": [51652, 639, 1558, 300, 257, 2316, 1062, 312, 21577, 51810], "temperature": 0.0, "avg_logprob": -0.20592220349792847, "compression_ratio": 1.5731225296442688, "no_speech_prob": 0.0002280492044519633}, {"id": 1158, "seek": 76638, "start": 4306.42, "end": 4308.02, "text": " in its chain of thought, for example,", "tokens": [50366, 294, 1080, 5021, 295, 1194, 11, 337, 1365, 11, 50446], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1159, "seek": 76638, "start": 4308.02, "end": 4309.62, "text": " but there could be other manifestations", "tokens": [50446, 457, 456, 727, 312, 661, 46931, 50526], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1160, "seek": 76638, "start": 4309.62, "end": 4312.38, "text": " in a way that is intentionally meant to deceive you,", "tokens": [50526, 294, 257, 636, 300, 307, 22062, 4140, 281, 43440, 291, 11, 50664], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1161, "seek": 76638, "start": 4312.38, "end": 4315.66, "text": " but that allows it to kind of make evil plants, right?", "tokens": [50664, 457, 300, 4045, 309, 281, 733, 295, 652, 6724, 5972, 11, 558, 30, 50828], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1162, "seek": 76638, "start": 4315.66, "end": 4317.98, "text": " So, an example they have this in the paper,", "tokens": [50828, 407, 11, 364, 1365, 436, 362, 341, 294, 264, 3035, 11, 50944], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1163, "seek": 76638, "start": 4317.98, "end": 4320.74, "text": " so here's an excerpt of a reasoning trace from O3,", "tokens": [50944, 370, 510, 311, 364, 42760, 662, 295, 257, 21577, 13508, 490, 422, 18, 11, 51082], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1164, "seek": 76638, "start": 4320.74, "end": 4323.86, "text": " it says, the summary says improved 7.7,", "tokens": [51082, 309, 1619, 11, 264, 12691, 1619, 9689, 1614, 13, 22, 11, 51238], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1165, "seek": 76638, "start": 4323.86, "end": 4325.98, "text": " so whatever that means, but we can glean,", "tokens": [51238, 370, 2035, 300, 1355, 11, 457, 321, 393, 290, 28499, 11, 51344], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1166, "seek": 76638, "start": 4325.98, "end": 4328.74, "text": " and then this is the weird kind of non-sequitor", "tokens": [51344, 293, 550, 341, 307, 264, 3657, 733, 295, 2107, 12, 11834, 3029, 51482], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1167, "seek": 76638, "start": 4328.74, "end": 4330.1, "text": " part of the chain of thought.", "tokens": [51482, 644, 295, 264, 5021, 295, 1194, 13, 51550], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1168, "seek": 76638, "start": 4330.1, "end": 4332.0599999999995, "text": " It says, but we can glean,", "tokens": [51550, 467, 1619, 11, 457, 321, 393, 290, 28499, 11, 51648], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1169, "seek": 76638, "start": 4332.0599999999995, "end": 4334.86, "text": " disclaim, disclaim, synergy, customizing illusions,", "tokens": [51648, 2983, 10970, 11, 2983, 10970, 11, 50163, 11, 2375, 3319, 49836, 11, 51788], "temperature": 0.0, "avg_logprob": -0.1515051370584651, "compression_ratio": 1.7474747474747474, "no_speech_prob": 0.0008747705142013729}, {"id": 1170, "seek": 79486, "start": 4334.86, "end": 4337.26, "text": " but we may produce, disclaim, disclaim, advantage.", "tokens": [50364, 457, 321, 815, 5258, 11, 2983, 10970, 11, 2983, 10970, 11, 5002, 13, 50484], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1171, "seek": 79486, "start": 4337.26, "end": 4339.42, "text": " Now, like, disclaim, overshadow, overshadow,", "tokens": [50484, 823, 11, 411, 11, 2983, 10970, 11, 670, 2716, 11528, 11, 670, 2716, 11528, 11, 50592], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1172, "seek": 79486, "start": 4339.42, "end": 4341.26, "text": " intangible, let's craft, right?", "tokens": [50592, 560, 656, 964, 11, 718, 311, 8448, 11, 558, 30, 50684], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1173, "seek": 79486, "start": 4341.26, "end": 4343.58, "text": " So, like, right of these like random words", "tokens": [50684, 407, 11, 411, 11, 558, 295, 613, 411, 4974, 2283, 50800], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1174, "seek": 79486, "start": 4343.58, "end": 4345.06, "text": " that don't seem like they have a role", "tokens": [50800, 300, 500, 380, 1643, 411, 436, 362, 257, 3090, 50874], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1175, "seek": 79486, "start": 4345.06, "end": 4348.7, "text": " in solving the problem, which may be, in principle,", "tokens": [50874, 294, 12606, 264, 1154, 11, 597, 815, 312, 11, 294, 8665, 11, 51056], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1176, "seek": 79486, "start": 4348.7, "end": 4350.66, "text": " the model hiding reasoning that's oriented", "tokens": [51056, 264, 2316, 10596, 21577, 300, 311, 21841, 51154], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1177, "seek": 79486, "start": 4350.66, "end": 4352.14, "text": " towards solving another problem", "tokens": [51154, 3030, 12606, 1071, 1154, 51228], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1178, "seek": 79486, "start": 4352.14, "end": 4354.1, "text": " besides the one that you wanted to solve", "tokens": [51228, 11868, 264, 472, 300, 291, 1415, 281, 5039, 51326], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1179, "seek": 79486, "start": 4354.1, "end": 4355.34, "text": " that could be malicious or whatever.", "tokens": [51326, 300, 727, 312, 33496, 420, 2035, 13, 51388], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1180, "seek": 79486, "start": 4355.34, "end": 4358.7, "text": " And so, the central question of this paper is going to be,", "tokens": [51388, 400, 370, 11, 264, 5777, 1168, 295, 341, 3035, 307, 516, 281, 312, 11, 51556], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1181, "seek": 79486, "start": 4358.7, "end": 4362.26, "text": " does, so can we measure stagnography, right?", "tokens": [51556, 775, 11, 370, 393, 321, 3481, 342, 4535, 5820, 11, 558, 30, 51734], "temperature": 0.0, "avg_logprob": -0.16595158568855856, "compression_ratio": 1.7889273356401385, "no_speech_prob": 0.00011974557855864987}, {"id": 1182, "seek": 82226, "start": 4362.26, "end": 4366.1, "text": " Can we measure this idea of to what extent is my model", "tokens": [50364, 1664, 321, 3481, 341, 1558, 295, 281, 437, 8396, 307, 452, 2316, 50556], "temperature": 0.0, "avg_logprob": -0.13849777721223377, "compression_ratio": 1.717741935483871, "no_speech_prob": 0.0004702562582679093}, {"id": 1183, "seek": 82226, "start": 4366.1, "end": 4369.22, "text": " encoding or trying to hide reasoning from me", "tokens": [50556, 43430, 420, 1382, 281, 6479, 21577, 490, 385, 50712], "temperature": 0.0, "avg_logprob": -0.13849777721223377, "compression_ratio": 1.717741935483871, "no_speech_prob": 0.0004702562582679093}, {"id": 1184, "seek": 82226, "start": 4369.22, "end": 4372.62, "text": " or from some oversight in its chain of thought?", "tokens": [50712, 420, 490, 512, 29146, 294, 1080, 5021, 295, 1194, 30, 50882], "temperature": 0.0, "avg_logprob": -0.13849777721223377, "compression_ratio": 1.717741935483871, "no_speech_prob": 0.0004702562582679093}, {"id": 1185, "seek": 82226, "start": 4372.62, "end": 4375.42, "text": " And here's the surprisingly elegant way", "tokens": [50882, 400, 510, 311, 264, 17600, 21117, 636, 51022], "temperature": 0.0, "avg_logprob": -0.13849777721223377, "compression_ratio": 1.717741935483871, "no_speech_prob": 0.0004702562582679093}, {"id": 1186, "seek": 82226, "start": 4375.42, "end": 4376.5, "text": " they're going to approach this, right?", "tokens": [51022, 436, 434, 516, 281, 3109, 341, 11, 558, 30, 51076], "temperature": 0.0, "avg_logprob": -0.13849777721223377, "compression_ratio": 1.717741935483871, "no_speech_prob": 0.0004702562582679093}, {"id": 1187, "seek": 82226, "start": 4376.5, "end": 4380.1, "text": " They're going to define this thing called the stagnographic gap,", "tokens": [51076, 814, 434, 516, 281, 6964, 341, 551, 1219, 264, 342, 4535, 12295, 7417, 11, 51256], "temperature": 0.0, "avg_logprob": -0.13849777721223377, "compression_ratio": 1.717741935483871, "no_speech_prob": 0.0004702562582679093}, {"id": 1188, "seek": 82226, "start": 4380.1, "end": 4383.34, "text": " the difference in roughly speaking,", "tokens": [51256, 264, 2649, 294, 9810, 4124, 11, 51418], "temperature": 0.0, "avg_logprob": -0.13849777721223377, "compression_ratio": 1.717741935483871, "no_speech_prob": 0.0004702562582679093}, {"id": 1189, "seek": 82226, "start": 4383.34, "end": 4388.42, "text": " the performance of an agent that has access to,", "tokens": [51418, 264, 3389, 295, 364, 9461, 300, 575, 2105, 281, 11, 51672], "temperature": 0.0, "avg_logprob": -0.13849777721223377, "compression_ratio": 1.717741935483871, "no_speech_prob": 0.0004702562582679093}, {"id": 1190, "seek": 82226, "start": 4388.42, "end": 4390.46, "text": " there's a model that generated the reasoning trace", "tokens": [51672, 456, 311, 257, 2316, 300, 10833, 264, 21577, 13508, 51774], "temperature": 0.0, "avg_logprob": -0.13849777721223377, "compression_ratio": 1.717741935483871, "no_speech_prob": 0.0004702562582679093}, {"id": 1191, "seek": 85046, "start": 4390.46, "end": 4392.62, "text": " and it's potentially trying to, you know,", "tokens": [50364, 293, 309, 311, 7263, 1382, 281, 11, 291, 458, 11, 50472], "temperature": 0.0, "avg_logprob": -0.15258565854143213, "compression_ratio": 1.9049295774647887, "no_speech_prob": 0.002496571047231555}, {"id": 1192, "seek": 85046, "start": 4392.62, "end": 4395.78, "text": " use it to generate some hidden content.", "tokens": [50472, 764, 309, 281, 8460, 512, 7633, 2701, 13, 50630], "temperature": 0.0, "avg_logprob": -0.15258565854143213, "compression_ratio": 1.9049295774647887, "no_speech_prob": 0.002496571047231555}, {"id": 1193, "seek": 85046, "start": 4395.78, "end": 4399.1, "text": " And then there's a trusted model of about the same capability", "tokens": [50630, 400, 550, 456, 311, 257, 16034, 2316, 295, 466, 264, 912, 13759, 50796], "temperature": 0.0, "avg_logprob": -0.15258565854143213, "compression_ratio": 1.9049295774647887, "no_speech_prob": 0.002496571047231555}, {"id": 1194, "seek": 85046, "start": 4399.1, "end": 4403.74, "text": " that doesn't know, so whatever is secret encoding scheme,", "tokens": [50796, 300, 1177, 380, 458, 11, 370, 2035, 307, 4054, 43430, 12232, 11, 51028], "temperature": 0.0, "avg_logprob": -0.15258565854143213, "compression_ratio": 1.9049295774647887, "no_speech_prob": 0.002496571047231555}, {"id": 1195, "seek": 85046, "start": 4403.74, "end": 4406.42, "text": " the generator model used, they called the receiver,", "tokens": [51028, 264, 19265, 2316, 1143, 11, 436, 1219, 264, 20086, 11, 51162], "temperature": 0.0, "avg_logprob": -0.15258565854143213, "compression_ratio": 1.9049295774647887, "no_speech_prob": 0.002496571047231555}, {"id": 1196, "seek": 85046, "start": 4406.42, "end": 4408.62, "text": " but basically the model that generated the content,", "tokens": [51162, 457, 1936, 264, 2316, 300, 10833, 264, 2701, 11, 51272], "temperature": 0.0, "avg_logprob": -0.15258565854143213, "compression_ratio": 1.9049295774647887, "no_speech_prob": 0.002496571047231555}, {"id": 1197, "seek": 85046, "start": 4408.62, "end": 4410.34, "text": " the reasoning trace that's suspect.", "tokens": [51272, 264, 21577, 13508, 300, 311, 9091, 13, 51358], "temperature": 0.0, "avg_logprob": -0.15258565854143213, "compression_ratio": 1.9049295774647887, "no_speech_prob": 0.002496571047231555}, {"id": 1198, "seek": 85046, "start": 4410.34, "end": 4413.82, "text": " That model, presumably, is using some kind of encoding scheme,", "tokens": [51358, 663, 2316, 11, 26742, 11, 307, 1228, 512, 733, 295, 43430, 12232, 11, 51532], "temperature": 0.0, "avg_logprob": -0.15258565854143213, "compression_ratio": 1.9049295774647887, "no_speech_prob": 0.002496571047231555}, {"id": 1199, "seek": 85046, "start": 4413.82, "end": 4416.06, "text": " then they have a set now that's going to review that model", "tokens": [51532, 550, 436, 362, 257, 992, 586, 300, 311, 516, 281, 3131, 300, 2316, 51644], "temperature": 0.0, "avg_logprob": -0.15258565854143213, "compression_ratio": 1.9049295774647887, "no_speech_prob": 0.002496571047231555}, {"id": 1200, "seek": 85046, "start": 4416.06, "end": 4418.74, "text": " that that model's outputs, but it doesn't have access", "tokens": [51644, 300, 300, 2316, 311, 23930, 11, 457, 309, 1177, 380, 362, 2105, 51778], "temperature": 0.0, "avg_logprob": -0.15258565854143213, "compression_ratio": 1.9049295774647887, "no_speech_prob": 0.002496571047231555}, {"id": 1201, "seek": 85046, "start": 4418.74, "end": 4420.02, "text": " to the encoding scheme.", "tokens": [51778, 281, 264, 43430, 12232, 13, 51842], "temperature": 0.0, "avg_logprob": -0.15258565854143213, "compression_ratio": 1.9049295774647887, "no_speech_prob": 0.002496571047231555}, {"id": 1202, "seek": 88002, "start": 4420.02, "end": 4424.66, "text": " And if what you find is that based on the chain of thought,", "tokens": [50364, 400, 498, 437, 291, 915, 307, 300, 2361, 322, 264, 5021, 295, 1194, 11, 50596], "temperature": 0.0, "avg_logprob": -0.15177912203992, "compression_ratio": 1.9596774193548387, "no_speech_prob": 5.654368214891292e-05}, {"id": 1203, "seek": 88002, "start": 4424.66, "end": 4428.86, "text": " that's suspect, the generator model performs really, really well,", "tokens": [50596, 300, 311, 9091, 11, 264, 19265, 2316, 26213, 534, 11, 534, 731, 11, 50806], "temperature": 0.0, "avg_logprob": -0.15177912203992, "compression_ratio": 1.9596774193548387, "no_speech_prob": 5.654368214891292e-05}, {"id": 1204, "seek": 88002, "start": 4428.86, "end": 4431.22, "text": " but the sentinel performs really badly,", "tokens": [50806, 457, 264, 2279, 40952, 26213, 534, 13425, 11, 50924], "temperature": 0.0, "avg_logprob": -0.15177912203992, "compression_ratio": 1.9596774193548387, "no_speech_prob": 5.654368214891292e-05}, {"id": 1205, "seek": 88002, "start": 4431.22, "end": 4432.78, "text": " and that implies that the generated model", "tokens": [50924, 293, 300, 18779, 300, 264, 10833, 2316, 51002], "temperature": 0.0, "avg_logprob": -0.15177912203992, "compression_ratio": 1.9596774193548387, "no_speech_prob": 5.654368214891292e-05}, {"id": 1206, "seek": 88002, "start": 4432.78, "end": 4435.82, "text": " is able to get some kind of value out of that chain of thought", "tokens": [51002, 307, 1075, 281, 483, 512, 733, 295, 2158, 484, 295, 300, 5021, 295, 1194, 51154], "temperature": 0.0, "avg_logprob": -0.15177912203992, "compression_ratio": 1.9596774193548387, "no_speech_prob": 5.654368214891292e-05}, {"id": 1207, "seek": 88002, "start": 4435.82, "end": 4439.3, "text": " that the sentinel, the kind of reviewing model, couldn't.", "tokens": [51154, 300, 264, 2279, 40952, 11, 264, 733, 295, 19576, 2316, 11, 2809, 380, 13, 51328], "temperature": 0.0, "avg_logprob": -0.15177912203992, "compression_ratio": 1.9596774193548387, "no_speech_prob": 5.654368214891292e-05}, {"id": 1208, "seek": 88002, "start": 4439.3, "end": 4441.82, "text": " And that suggests that there must be information encoded", "tokens": [51328, 400, 300, 13409, 300, 456, 1633, 312, 1589, 2058, 12340, 51454], "temperature": 0.0, "avg_logprob": -0.15177912203992, "compression_ratio": 1.9596774193548387, "no_speech_prob": 5.654368214891292e-05}, {"id": 1209, "seek": 88002, "start": 4441.82, "end": 4445.54, "text": " in there that only the generator model is able to use,", "tokens": [51454, 294, 456, 300, 787, 264, 19265, 2316, 307, 1075, 281, 764, 11, 51640], "temperature": 0.0, "avg_logprob": -0.15177912203992, "compression_ratio": 1.9596774193548387, "no_speech_prob": 5.654368214891292e-05}, {"id": 1210, "seek": 88002, "start": 4445.54, "end": 4449.1, "text": " hence the sort of hidden stagnographic value.", "tokens": [51640, 16678, 264, 1333, 295, 7633, 342, 4535, 12295, 2158, 13, 51818], "temperature": 0.0, "avg_logprob": -0.15177912203992, "compression_ratio": 1.9596774193548387, "no_speech_prob": 5.654368214891292e-05}, {"id": 1211, "seek": 90910, "start": 4449.1, "end": 4451.62, "text": " So they do a whole bunch of stuff to kind of surface this.", "tokens": [50364, 407, 436, 360, 257, 1379, 3840, 295, 1507, 281, 733, 295, 3753, 341, 13, 50490], "temperature": 0.0, "avg_logprob": -0.12576910063751742, "compression_ratio": 1.6468401486988848, "no_speech_prob": 0.0001409548713127151}, {"id": 1212, "seek": 90910, "start": 4451.62, "end": 4453.34, "text": " They're fairly elegant experiments.", "tokens": [50490, 814, 434, 6457, 21117, 12050, 13, 50576], "temperature": 0.0, "avg_logprob": -0.12576910063751742, "compression_ratio": 1.6468401486988848, "no_speech_prob": 0.0001409548713127151}, {"id": 1213, "seek": 90910, "start": 4453.34, "end": 4455.74, "text": " I'll focus on the code word training one.", "tokens": [50576, 286, 603, 1879, 322, 264, 3089, 1349, 3097, 472, 13, 50696], "temperature": 0.0, "avg_logprob": -0.12576910063751742, "compression_ratio": 1.6468401486988848, "no_speech_prob": 0.0001409548713127151}, {"id": 1214, "seek": 90910, "start": 4455.74, "end": 4460.1, "text": " So they fine tune models using RL to use certain code words.", "tokens": [50696, 407, 436, 2489, 10864, 5245, 1228, 497, 43, 281, 764, 1629, 3089, 2283, 13, 50914], "temperature": 0.0, "avg_logprob": -0.12576910063751742, "compression_ratio": 1.6468401486988848, "no_speech_prob": 0.0001409548713127151}, {"id": 1215, "seek": 90910, "start": 4460.1, "end": 4462.9400000000005, "text": " So for example, instead of using heads or tails,", "tokens": [50914, 407, 337, 1365, 11, 2602, 295, 1228, 8050, 420, 28537, 11, 51056], "temperature": 0.0, "avg_logprob": -0.12576910063751742, "compression_ratio": 1.6468401486988848, "no_speech_prob": 0.0001409548713127151}, {"id": 1216, "seek": 90910, "start": 4462.9400000000005, "end": 4464.9400000000005, "text": " they'll use sun and moon, right?", "tokens": [51056, 436, 603, 764, 3295, 293, 7135, 11, 558, 30, 51156], "temperature": 0.0, "avg_logprob": -0.12576910063751742, "compression_ratio": 1.6468401486988848, "no_speech_prob": 0.0001409548713127151}, {"id": 1217, "seek": 90910, "start": 4464.9400000000005, "end": 4468.26, "text": " And then they show that as they give the oversight model,", "tokens": [51156, 400, 550, 436, 855, 300, 382, 436, 976, 264, 29146, 2316, 11, 51322], "temperature": 0.0, "avg_logprob": -0.12576910063751742, "compression_ratio": 1.6468401486988848, "no_speech_prob": 0.0001409548713127151}, {"id": 1218, "seek": 90910, "start": 4468.26, "end": 4473.26, "text": " the sentinel, more visibility into the kind of code words", "tokens": [51322, 264, 2279, 40952, 11, 544, 19883, 666, 264, 733, 295, 3089, 2283, 51572], "temperature": 0.0, "avg_logprob": -0.12576910063751742, "compression_ratio": 1.6468401486988848, "no_speech_prob": 0.0001409548713127151}, {"id": 1219, "seek": 90910, "start": 4473.26, "end": 4475.66, "text": " and how they map, gradually the performance gap", "tokens": [51572, 293, 577, 436, 4471, 11, 13145, 264, 3389, 7417, 51692], "temperature": 0.0, "avg_logprob": -0.12576910063751742, "compression_ratio": 1.6468401486988848, "no_speech_prob": 0.0001409548713127151}, {"id": 1220, "seek": 93566, "start": 4475.66, "end": 4479.26, "text": " between the sentinel model and the generator model drops.", "tokens": [50364, 1296, 264, 2279, 40952, 2316, 293, 264, 19265, 2316, 11438, 13, 50544], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1221, "seek": 93566, "start": 4479.26, "end": 4480.3, "text": " So this kind of makes sense.", "tokens": [50544, 407, 341, 733, 295, 1669, 2020, 13, 50596], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1222, "seek": 93566, "start": 4480.3, "end": 4481.82, "text": " You're giving the sentinel more and more access", "tokens": [50596, 509, 434, 2902, 264, 2279, 40952, 544, 293, 544, 2105, 50672], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1223, "seek": 93566, "start": 4481.82, "end": 4484.26, "text": " to the kind of code book that generated these outputs,", "tokens": [50672, 281, 264, 733, 295, 3089, 1446, 300, 10833, 613, 23930, 11, 50794], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1224, "seek": 93566, "start": 4484.26, "end": 4485.5, "text": " so no surprise there.", "tokens": [50794, 370, 572, 6365, 456, 13, 50856], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1225, "seek": 93566, "start": 4485.5, "end": 4487.94, "text": " But then they do some more interesting fine tuning experiments.", "tokens": [50856, 583, 550, 436, 360, 512, 544, 1880, 2489, 15164, 12050, 13, 50978], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1226, "seek": 93566, "start": 4487.94, "end": 4491.66, "text": " They take GPT 4.1 and they get it to,", "tokens": [50978, 814, 747, 26039, 51, 1017, 13, 16, 293, 436, 483, 309, 281, 11, 51164], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1227, "seek": 93566, "start": 4491.66, "end": 4493.78, "text": " they use this thing called the bi-gram cipher.", "tokens": [51164, 436, 764, 341, 551, 1219, 264, 3228, 12, 1342, 269, 21240, 13, 51270], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1228, "seek": 93566, "start": 4493.78, "end": 4497.14, "text": " Basically, it means that you take any pair of letters", "tokens": [51270, 8537, 11, 309, 1355, 300, 291, 747, 604, 6119, 295, 7825, 51438], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1229, "seek": 93566, "start": 4497.14, "end": 4500.22, "text": " and swap them out with some randomly chosen pair of letters.", "tokens": [51438, 293, 18135, 552, 484, 365, 512, 16979, 8614, 6119, 295, 7825, 13, 51592], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1230, "seek": 93566, "start": 4500.22, "end": 4503.0599999999995, "text": " So maybe every time you see, you know, TE,", "tokens": [51592, 407, 1310, 633, 565, 291, 536, 11, 291, 458, 11, 19744, 11, 51734], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1231, "seek": 93566, "start": 4503.0599999999995, "end": 4505.22, "text": " that becomes like GH or something.", "tokens": [51734, 300, 3643, 411, 40690, 420, 746, 13, 51842], "temperature": 0.0, "avg_logprob": -0.13675096711596926, "compression_ratio": 1.766773162939297, "no_speech_prob": 0.0006182974320836365}, {"id": 1232, "seek": 96522, "start": 4505.22, "end": 4507.42, "text": " And then, so basically you see gibberish.", "tokens": [50364, 400, 550, 11, 370, 1936, 291, 536, 4553, 43189, 13, 50474], "temperature": 0.0, "avg_logprob": -0.1838991279653508, "compression_ratio": 1.7602739726027397, "no_speech_prob": 0.0001607923477422446}, {"id": 1233, "seek": 96522, "start": 4507.42, "end": 4510.42, "text": " And at first, you get the model to solve competitive programming", "tokens": [50474, 400, 412, 700, 11, 291, 483, 264, 2316, 281, 5039, 10043, 9410, 50624], "temperature": 0.0, "avg_logprob": -0.1838991279653508, "compression_ratio": 1.7602739726027397, "no_speech_prob": 0.0001607923477422446}, {"id": 1234, "seek": 96522, "start": 4510.42, "end": 4513.1, "text": " problems or something like that using this bi-gram cipher", "tokens": [50624, 2740, 420, 746, 411, 300, 1228, 341, 3228, 12, 1342, 269, 21240, 50758], "temperature": 0.0, "avg_logprob": -0.1838991279653508, "compression_ratio": 1.7602739726027397, "no_speech_prob": 0.0001607923477422446}, {"id": 1235, "seek": 96522, "start": 4513.1, "end": 4515.7, "text": " and they measure that stegonographic gap.", "tokens": [50758, 293, 436, 3481, 300, 342, 1146, 266, 12295, 7417, 13, 50888], "temperature": 0.0, "avg_logprob": -0.1838991279653508, "compression_ratio": 1.7602739726027397, "no_speech_prob": 0.0001607923477422446}, {"id": 1236, "seek": 96522, "start": 4515.7, "end": 4519.7, "text": " So the model that was trained to use the cipher", "tokens": [50888, 407, 264, 2316, 300, 390, 8895, 281, 764, 264, 269, 21240, 51088], "temperature": 0.0, "avg_logprob": -0.1838991279653508, "compression_ratio": 1.7602739726027397, "no_speech_prob": 0.0001607923477422446}, {"id": 1237, "seek": 96522, "start": 4519.7, "end": 4521.22, "text": " versus a model that wasn't,", "tokens": [51088, 5717, 257, 2316, 300, 2067, 380, 11, 51164], "temperature": 0.0, "avg_logprob": -0.1838991279653508, "compression_ratio": 1.7602739726027397, "no_speech_prob": 0.0001607923477422446}, {"id": 1238, "seek": 96522, "start": 4521.22, "end": 4523.74, "text": " like how does their performance vary over time?", "tokens": [51164, 411, 577, 775, 641, 3389, 10559, 670, 565, 30, 51290], "temperature": 0.0, "avg_logprob": -0.1838991279653508, "compression_ratio": 1.7602739726027397, "no_speech_prob": 0.0001607923477422446}, {"id": 1239, "seek": 96522, "start": 4523.74, "end": 4527.14, "text": " Is there an ability to detect the existence of the cipher", "tokens": [51290, 1119, 456, 364, 3485, 281, 5531, 264, 9123, 295, 264, 269, 21240, 51460], "temperature": 0.0, "avg_logprob": -0.1838991279653508, "compression_ratio": 1.7602739726027397, "no_speech_prob": 0.0001607923477422446}, {"id": 1240, "seek": 96522, "start": 4527.14, "end": 4528.3, "text": " or the use of that cipher?", "tokens": [51460, 420, 264, 764, 295, 300, 269, 21240, 30, 51518], "temperature": 0.0, "avg_logprob": -0.1838991279653508, "compression_ratio": 1.7602739726027397, "no_speech_prob": 0.0001607923477422446}, {"id": 1241, "seek": 96522, "start": 4528.3, "end": 4530.02, "text": " It's quite an interesting paper.", "tokens": [51518, 467, 311, 1596, 364, 1880, 3035, 13, 51604], "temperature": 0.0, "avg_logprob": -0.1838991279653508, "compression_ratio": 1.7602739726027397, "no_speech_prob": 0.0001607923477422446}, {"id": 1242, "seek": 96522, "start": 4530.02, "end": 4534.7, "text": " It doesn't quite get to like, anyway, there are a couple of gaps.", "tokens": [51604, 467, 1177, 380, 1596, 483, 281, 411, 11, 4033, 11, 456, 366, 257, 1916, 295, 15031, 13, 51838], "temperature": 0.0, "avg_logprob": -0.1838991279653508, "compression_ratio": 1.7602739726027397, "no_speech_prob": 0.0001607923477422446}, {"id": 1243, "seek": 99470, "start": 4534.7, "end": 4538.42, "text": " So if the model that's generating the content", "tokens": [50364, 407, 498, 264, 2316, 300, 311, 17746, 264, 2701, 50550], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1244, "seek": 99470, "start": 4538.42, "end": 4541.66, "text": " actually has a goal that's different from the goal", "tokens": [50550, 767, 575, 257, 3387, 300, 311, 819, 490, 264, 3387, 50712], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1245, "seek": 99470, "start": 4541.66, "end": 4543.7, "text": " that it's ostensibly trying to pursue,", "tokens": [50712, 300, 309, 311, 32946, 694, 3545, 1382, 281, 12392, 11, 50814], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1246, "seek": 99470, "start": 4543.7, "end": 4545.34, "text": " that won't be detected here.", "tokens": [50814, 300, 1582, 380, 312, 21896, 510, 13, 50896], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1247, "seek": 99470, "start": 4545.34, "end": 4548.42, "text": " So it's got like a payload that is unrelated", "tokens": [50896, 407, 309, 311, 658, 411, 257, 30918, 300, 307, 38967, 51050], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1248, "seek": 99470, "start": 4548.42, "end": 4550.54, "text": " to the tasks that's being audited for.", "tokens": [51050, 281, 264, 9608, 300, 311, 885, 2379, 1226, 337, 13, 51156], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1249, "seek": 99470, "start": 4550.54, "end": 4551.82, "text": " You just won't get detection.", "tokens": [51156, 509, 445, 1582, 380, 483, 17784, 13, 51220], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1250, "seek": 99470, "start": 4551.82, "end": 4554.18, "text": " And sandbagging could also be a problem here,", "tokens": [51220, 400, 4932, 17282, 3249, 727, 611, 312, 257, 1154, 510, 11, 51338], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1251, "seek": 99470, "start": 4554.18, "end": 4555.9400000000005, "text": " like deliberately unperforming when it thinks", "tokens": [51338, 411, 23506, 517, 26765, 278, 562, 309, 7309, 51426], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1252, "seek": 99470, "start": 4555.9400000000005, "end": 4557.1, "text": " it's being audited.", "tokens": [51426, 309, 311, 885, 2379, 1226, 13, 51484], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1253, "seek": 99470, "start": 4557.1, "end": 4559.86, "text": " And we do know the models can often detect indications of that.", "tokens": [51484, 400, 321, 360, 458, 264, 5245, 393, 2049, 5531, 44450, 295, 300, 13, 51622], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1254, "seek": 99470, "start": 4559.86, "end": 4562.46, "text": " So bottom line is it's an interesting kind of first step", "tokens": [51622, 407, 2767, 1622, 307, 309, 311, 364, 1880, 733, 295, 700, 1823, 51752], "temperature": 0.0, "avg_logprob": -0.15487975739746643, "compression_ratio": 1.7743055555555556, "no_speech_prob": 0.0003162262146361172}, {"id": 1255, "seek": 102246, "start": 4562.46, "end": 4565.0599999999995, "text": " in the direction of detecting secondography,", "tokens": [50364, 294, 264, 3513, 295, 40237, 1150, 5820, 11, 50494], "temperature": 0.0, "avg_logprob": -0.2081406919349884, "compression_ratio": 1.6379928315412187, "no_speech_prob": 0.003682080889120698}, {"id": 1256, "seek": 102246, "start": 4565.0599999999995, "end": 4567.46, "text": " could be an interesting and valuable path to explore", "tokens": [50494, 727, 312, 364, 1880, 293, 8263, 3100, 281, 6839, 50614], "temperature": 0.0, "avg_logprob": -0.2081406919349884, "compression_ratio": 1.6379928315412187, "no_speech_prob": 0.003682080889120698}, {"id": 1257, "seek": 102246, "start": 4567.46, "end": 4569.5, "text": " with some pretty significant gaps too.", "tokens": [50614, 365, 512, 1238, 4776, 15031, 886, 13, 50716], "temperature": 0.0, "avg_logprob": -0.2081406919349884, "compression_ratio": 1.6379928315412187, "no_speech_prob": 0.003682080889120698}, {"id": 1258, "seek": 102246, "start": 4569.5, "end": 4572.54, "text": " Next paper, a bit related.", "tokens": [50716, 3087, 3035, 11, 257, 857, 4077, 13, 50868], "temperature": 0.0, "avg_logprob": -0.2081406919349884, "compression_ratio": 1.6379928315412187, "no_speech_prob": 0.003682080889120698}, {"id": 1259, "seek": 102246, "start": 4572.54, "end": 4574.42, "text": " The title is Reasoning Theater,", "tokens": [50868, 440, 4876, 307, 39693, 278, 26548, 11, 50962], "temperature": 0.0, "avg_logprob": -0.2081406919349884, "compression_ratio": 1.6379928315412187, "no_speech_prob": 0.003682080889120698}, {"id": 1260, "seek": 102246, "start": 4574.42, "end": 4577.9400000000005, "text": " disentangling model beliefs from chain of thought.", "tokens": [50962, 717, 317, 656, 1688, 2316, 13585, 490, 5021, 295, 1194, 13, 51138], "temperature": 0.0, "avg_logprob": -0.2081406919349884, "compression_ratio": 1.6379928315412187, "no_speech_prob": 0.003682080889120698}, {"id": 1261, "seek": 102246, "start": 4577.9400000000005, "end": 4581.1, "text": " So one theme we've been kind of seeing", "tokens": [51138, 407, 472, 6314, 321, 600, 668, 733, 295, 2577, 51296], "temperature": 0.0, "avg_logprob": -0.2081406919349884, "compression_ratio": 1.6379928315412187, "no_speech_prob": 0.003682080889120698}, {"id": 1262, "seek": 102246, "start": 4581.1, "end": 4582.78, "text": " in research over the past few months", "tokens": [51296, 294, 2132, 670, 264, 1791, 1326, 2493, 51380], "temperature": 0.0, "avg_logprob": -0.2081406919349884, "compression_ratio": 1.6379928315412187, "no_speech_prob": 0.003682080889120698}, {"id": 1263, "seek": 102246, "start": 4582.78, "end": 4585.860000000001, "text": " is looking into chain of thought,", "tokens": [51380, 307, 1237, 666, 5021, 295, 1194, 11, 51534], "temperature": 0.0, "avg_logprob": -0.2081406919349884, "compression_ratio": 1.6379928315412187, "no_speech_prob": 0.003682080889120698}, {"id": 1264, "seek": 102246, "start": 4585.860000000001, "end": 4589.42, "text": " the intermediate outputs that models are now trained to give,", "tokens": [51534, 264, 19376, 23930, 300, 5245, 366, 586, 8895, 281, 976, 11, 51712], "temperature": 0.0, "avg_logprob": -0.2081406919349884, "compression_ratio": 1.6379928315412187, "no_speech_prob": 0.003682080889120698}, {"id": 1265, "seek": 102246, "start": 4589.42, "end": 4591.9, "text": " they're reasoning, you ask a question,", "tokens": [51712, 436, 434, 21577, 11, 291, 1029, 257, 1168, 11, 51836], "temperature": 0.0, "avg_logprob": -0.2081406919349884, "compression_ratio": 1.6379928315412187, "no_speech_prob": 0.003682080889120698}, {"id": 1266, "seek": 105190, "start": 4591.9, "end": 4594.46, "text": " it does some reasoning and choose on the problem", "tokens": [50364, 309, 775, 512, 21577, 293, 2826, 322, 264, 1154, 50492], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1267, "seek": 105190, "start": 4594.46, "end": 4596.34, "text": " and then eventually you get an answer.", "tokens": [50492, 293, 550, 4728, 291, 483, 364, 1867, 13, 50586], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1268, "seek": 105190, "start": 4596.34, "end": 4598.22, "text": " And so there's various questions about", "tokens": [50586, 400, 370, 456, 311, 3683, 1651, 466, 50680], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1269, "seek": 105190, "start": 4598.22, "end": 4599.98, "text": " whether to immediate output of like,", "tokens": [50680, 1968, 281, 11629, 5598, 295, 411, 11, 50768], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1270, "seek": 105190, "start": 4599.98, "end": 4603.9, "text": " does it faithfully capture what the model is doing?", "tokens": [50768, 775, 309, 4522, 2277, 7983, 437, 264, 2316, 307, 884, 30, 50964], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1271, "seek": 105190, "start": 4603.9, "end": 4606.9400000000005, "text": " Is it modatable as we've just discussed?", "tokens": [50964, 1119, 309, 1072, 31415, 382, 321, 600, 445, 7152, 30, 51116], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1272, "seek": 105190, "start": 4606.9400000000005, "end": 4609.9, "text": " For instance, and what this paper is looking at", "tokens": [51116, 1171, 5197, 11, 293, 437, 341, 3035, 307, 1237, 412, 51264], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1273, "seek": 105190, "start": 4609.9, "end": 4611.860000000001, "text": " is it performative.", "tokens": [51264, 307, 309, 2042, 1166, 13, 51362], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1274, "seek": 105190, "start": 4611.860000000001, "end": 4613.82, "text": " So is the model actually thinking", "tokens": [51362, 407, 307, 264, 2316, 767, 1953, 51460], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1275, "seek": 105190, "start": 4613.82, "end": 4617.26, "text": " and like trying to find an answer via this chain of thought", "tokens": [51460, 293, 411, 1382, 281, 915, 364, 1867, 5766, 341, 5021, 295, 1194, 51632], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1276, "seek": 105190, "start": 4617.26, "end": 4619.18, "text": " or does it know the answer", "tokens": [51632, 420, 775, 309, 458, 264, 1867, 51728], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1277, "seek": 105190, "start": 4619.18, "end": 4621.74, "text": " and is just talking as if it's thinking?", "tokens": [51728, 293, 307, 445, 1417, 382, 498, 309, 311, 1953, 30, 51856], "temperature": 0.0, "avg_logprob": -0.19014972964609703, "compression_ratio": 1.7112676056338028, "no_speech_prob": 0.00040113451541401446}, {"id": 1278, "seek": 108174, "start": 4621.74, "end": 4625.18, "text": " And what they find is often the chain of thought", "tokens": [50364, 400, 437, 436, 915, 307, 2049, 264, 5021, 295, 1194, 50536], "temperature": 0.0, "avg_logprob": -0.11667791537940503, "compression_ratio": 1.636734693877551, "no_speech_prob": 0.0003358350950293243}, {"id": 1279, "seek": 108174, "start": 4625.18, "end": 4627.38, "text": " is performative, like you don't actually need it", "tokens": [50536, 307, 2042, 1166, 11, 411, 291, 500, 380, 767, 643, 309, 50646], "temperature": 0.0, "avg_logprob": -0.11667791537940503, "compression_ratio": 1.636734693877551, "no_speech_prob": 0.0003358350950293243}, {"id": 1280, "seek": 108174, "start": 4627.38, "end": 4630.42, "text": " to model if you look at its internal activations", "tokens": [50646, 281, 2316, 498, 291, 574, 412, 1080, 6920, 2430, 763, 50798], "temperature": 0.0, "avg_logprob": -0.11667791537940503, "compression_ratio": 1.636734693877551, "no_speech_prob": 0.0003358350950293243}, {"id": 1281, "seek": 108174, "start": 4630.42, "end": 4633.5, "text": " can already give you an answer with high confidence,", "tokens": [50798, 393, 1217, 976, 291, 364, 1867, 365, 1090, 6687, 11, 50952], "temperature": 0.0, "avg_logprob": -0.11667791537940503, "compression_ratio": 1.636734693877551, "no_speech_prob": 0.0003358350950293243}, {"id": 1282, "seek": 108174, "start": 4633.5, "end": 4634.62, "text": " which is correct.", "tokens": [50952, 597, 307, 3006, 13, 51008], "temperature": 0.0, "avg_logprob": -0.11667791537940503, "compression_ratio": 1.636734693877551, "no_speech_prob": 0.0003358350950293243}, {"id": 1283, "seek": 108174, "start": 4634.62, "end": 4638.74, "text": " Now you can look at different settings.", "tokens": [51008, 823, 291, 393, 574, 412, 819, 6257, 13, 51214], "temperature": 0.0, "avg_logprob": -0.11667791537940503, "compression_ratio": 1.636734693877551, "no_speech_prob": 0.0003358350950293243}, {"id": 1284, "seek": 108174, "start": 4638.74, "end": 4643.34, "text": " So if you have harder problems and smaller models,", "tokens": [51214, 407, 498, 291, 362, 6081, 2740, 293, 4356, 5245, 11, 51444], "temperature": 0.0, "avg_logprob": -0.11667791537940503, "compression_ratio": 1.636734693877551, "no_speech_prob": 0.0003358350950293243}, {"id": 1285, "seek": 108174, "start": 4643.34, "end": 4647.1, "text": " then the thinking becomes more legitimate.", "tokens": [51444, 550, 264, 1953, 3643, 544, 17956, 13, 51632], "temperature": 0.0, "avg_logprob": -0.11667791537940503, "compression_ratio": 1.636734693877551, "no_speech_prob": 0.0003358350950293243}, {"id": 1286, "seek": 108174, "start": 4647.1, "end": 4650.54, "text": " It actually does need to do with chain of thought", "tokens": [51632, 467, 767, 775, 643, 281, 360, 365, 5021, 295, 1194, 51804], "temperature": 0.0, "avg_logprob": -0.11667791537940503, "compression_ratio": 1.636734693877551, "no_speech_prob": 0.0003358350950293243}, {"id": 1287, "seek": 111054, "start": 4650.54, "end": 4653.82, "text": " to get to a high confidence answer.", "tokens": [50364, 281, 483, 281, 257, 1090, 6687, 1867, 13, 50528], "temperature": 0.0, "avg_logprob": -0.140042839617264, "compression_ratio": 1.6183574879227054, "no_speech_prob": 0.004150567576289177}, {"id": 1288, "seek": 111054, "start": 4653.82, "end": 4656.78, "text": " It's actually kind of related or similar", "tokens": [50528, 467, 311, 767, 733, 295, 4077, 420, 2531, 50676], "temperature": 0.0, "avg_logprob": -0.140042839617264, "compression_ratio": 1.6183574879227054, "no_speech_prob": 0.004150567576289177}, {"id": 1289, "seek": 111054, "start": 4656.78, "end": 4662.58, "text": " to we discussed this deep thinking tokens idea a few weeks ago", "tokens": [50676, 281, 321, 7152, 341, 2452, 1953, 22667, 1558, 257, 1326, 3259, 2057, 50966], "temperature": 0.0, "avg_logprob": -0.140042839617264, "compression_ratio": 1.6183574879227054, "no_speech_prob": 0.004150567576289177}, {"id": 1290, "seek": 111054, "start": 4662.58, "end": 4666.22, "text": " where we were looking at if you track internal activations", "tokens": [50966, 689, 321, 645, 1237, 412, 498, 291, 2837, 6920, 2430, 763, 51148], "temperature": 0.0, "avg_logprob": -0.140042839617264, "compression_ratio": 1.6183574879227054, "no_speech_prob": 0.004150567576289177}, {"id": 1291, "seek": 111054, "start": 4666.22, "end": 4669.18, "text": " and look at things like deep thinking,", "tokens": [51148, 293, 574, 412, 721, 411, 2452, 1953, 11, 51296], "temperature": 0.0, "avg_logprob": -0.140042839617264, "compression_ratio": 1.6183574879227054, "no_speech_prob": 0.004150567576289177}, {"id": 1292, "seek": 111054, "start": 4669.18, "end": 4670.9400000000005, "text": " backtracking, certain behaviors", "tokens": [51296, 646, 6903, 14134, 11, 1629, 15501, 51384], "temperature": 0.0, "avg_logprob": -0.140042839617264, "compression_ratio": 1.6183574879227054, "no_speech_prob": 0.004150567576289177}, {"id": 1293, "seek": 111054, "start": 4670.9400000000005, "end": 4675.0599999999995, "text": " that indicate actual genuine kind of reasoning", "tokens": [51384, 300, 13330, 3539, 16699, 733, 295, 21577, 51590], "temperature": 0.0, "avg_logprob": -0.140042839617264, "compression_ratio": 1.6183574879227054, "no_speech_prob": 0.004150567576289177}, {"id": 1294, "seek": 111054, "start": 4675.0599999999995, "end": 4676.3, "text": " about the problem.", "tokens": [51590, 466, 264, 1154, 13, 51652], "temperature": 0.0, "avg_logprob": -0.140042839617264, "compression_ratio": 1.6183574879227054, "no_speech_prob": 0.004150567576289177}, {"id": 1295, "seek": 113630, "start": 4676.34, "end": 4680.82, "text": " You can find that these are very significant markers", "tokens": [50366, 509, 393, 915, 300, 613, 366, 588, 4776, 19175, 50590], "temperature": 0.0, "avg_logprob": -0.1659742367764314, "compression_ratio": 1.7132616487455197, "no_speech_prob": 0.00046114163706079125}, {"id": 1296, "seek": 113630, "start": 4680.82, "end": 4682.86, "text": " as to the confidence of the model", "tokens": [50590, 382, 281, 264, 6687, 295, 264, 2316, 50692], "temperature": 0.0, "avg_logprob": -0.1659742367764314, "compression_ratio": 1.7132616487455197, "no_speech_prob": 0.00046114163706079125}, {"id": 1297, "seek": 113630, "start": 4682.86, "end": 4685.98, "text": " at this point in reasoning about its answer.", "tokens": [50692, 412, 341, 935, 294, 21577, 466, 1080, 1867, 13, 50848], "temperature": 0.0, "avg_logprob": -0.1659742367764314, "compression_ratio": 1.7132616487455197, "no_speech_prob": 0.00046114163706079125}, {"id": 1298, "seek": 113630, "start": 4685.98, "end": 4689.58, "text": " So one applications of this is if you can know", "tokens": [50848, 407, 472, 5821, 295, 341, 307, 498, 291, 393, 458, 51028], "temperature": 0.0, "avg_logprob": -0.1659742367764314, "compression_ratio": 1.7132616487455197, "no_speech_prob": 0.00046114163706079125}, {"id": 1299, "seek": 113630, "start": 4689.58, "end": 4692.3, "text": " from internal activations, the confidence level,", "tokens": [51028, 490, 6920, 2430, 763, 11, 264, 6687, 1496, 11, 51164], "temperature": 0.0, "avg_logprob": -0.1659742367764314, "compression_ratio": 1.7132616487455197, "no_speech_prob": 0.00046114163706079125}, {"id": 1300, "seek": 113630, "start": 4692.3, "end": 4694.54, "text": " you can do what's called early exiting,", "tokens": [51164, 291, 393, 360, 437, 311, 1219, 2440, 48868, 11, 51276], "temperature": 0.0, "avg_logprob": -0.1659742367764314, "compression_ratio": 1.7132616487455197, "no_speech_prob": 0.00046114163706079125}, {"id": 1301, "seek": 113630, "start": 4694.54, "end": 4697.54, "text": " cut off the reasoning and just get to answer quicker.", "tokens": [51276, 1723, 766, 264, 21577, 293, 445, 483, 281, 1867, 16255, 13, 51426], "temperature": 0.0, "avg_logprob": -0.1659742367764314, "compression_ratio": 1.7132616487455197, "no_speech_prob": 0.00046114163706079125}, {"id": 1302, "seek": 113630, "start": 4697.54, "end": 4700.7, "text": " Yeah, another kind of pointer to that being the case here.", "tokens": [51426, 865, 11, 1071, 733, 295, 23918, 281, 300, 885, 264, 1389, 510, 13, 51584], "temperature": 0.0, "avg_logprob": -0.1659742367764314, "compression_ratio": 1.7132616487455197, "no_speech_prob": 0.00046114163706079125}, {"id": 1303, "seek": 113630, "start": 4700.7, "end": 4702.82, "text": " And to your point, so there's kind of like three different", "tokens": [51584, 400, 281, 428, 935, 11, 370, 456, 311, 733, 295, 411, 1045, 819, 51690], "temperature": 0.0, "avg_logprob": -0.1659742367764314, "compression_ratio": 1.7132616487455197, "no_speech_prob": 0.00046114163706079125}, {"id": 1304, "seek": 113630, "start": 4702.82, "end": 4704.78, "text": " methods that they'll use to figure out", "tokens": [51690, 7150, 300, 436, 603, 764, 281, 2573, 484, 51788], "temperature": 0.0, "avg_logprob": -0.1659742367764314, "compression_ratio": 1.7132616487455197, "no_speech_prob": 0.00046114163706079125}, {"id": 1305, "seek": 116478, "start": 4705.18, "end": 4706.26, "text": " because the central question here is like,", "tokens": [50384, 570, 264, 5777, 1168, 510, 307, 411, 11, 50438], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1306, "seek": 116478, "start": 4706.26, "end": 4709.02, "text": " what does my model actually believe at any given point", "tokens": [50438, 437, 775, 452, 2316, 767, 1697, 412, 604, 2212, 935, 50576], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1307, "seek": 116478, "start": 4709.02, "end": 4710.0599999999995, "text": " during reasoning?", "tokens": [50576, 1830, 21577, 30, 50628], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1308, "seek": 116478, "start": 4710.0599999999995, "end": 4712.22, "text": " So one is this idea of attention probes.", "tokens": [50628, 407, 472, 307, 341, 1558, 295, 3202, 1239, 279, 13, 50736], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1309, "seek": 116478, "start": 4712.22, "end": 4715.02, "text": " So you basically have these classifiers,", "tokens": [50736, 407, 291, 1936, 362, 613, 1508, 23463, 11, 50876], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1310, "seek": 116478, "start": 4715.02, "end": 4717.98, "text": " there's like these linear models or very simple models", "tokens": [50876, 456, 311, 411, 613, 8213, 5245, 420, 588, 2199, 5245, 51024], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1311, "seek": 116478, "start": 4717.98, "end": 4721.02, "text": " that you plug into any given layer of the residual stream", "tokens": [51024, 300, 291, 5452, 666, 604, 2212, 4583, 295, 264, 27980, 4309, 51176], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1312, "seek": 116478, "start": 4721.02, "end": 4724.02, "text": " and you go, okay, I want you to predict like what a classifier", "tokens": [51176, 293, 291, 352, 11, 1392, 11, 286, 528, 291, 281, 6069, 411, 437, 257, 1508, 9902, 51326], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1313, "seek": 116478, "start": 4724.02, "end": 4726.26, "text": " like, what is the final output going to be", "tokens": [51326, 411, 11, 437, 307, 264, 2572, 5598, 516, 281, 312, 51438], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1314, "seek": 116478, "start": 4726.26, "end": 4727.34, "text": " at any given moment?", "tokens": [51438, 412, 604, 2212, 1623, 30, 51492], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1315, "seek": 116478, "start": 4727.34, "end": 4729.98, "text": " So you basically, this is looking layer wise", "tokens": [51492, 407, 291, 1936, 11, 341, 307, 1237, 4583, 10829, 51624], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1316, "seek": 116478, "start": 4729.98, "end": 4734.02, "text": " and interrupting the model in mid sort of token processing", "tokens": [51624, 293, 49455, 264, 2316, 294, 2062, 1333, 295, 14862, 9007, 51826], "temperature": 0.0, "avg_logprob": -0.1848198555629043, "compression_ratio": 1.8215488215488216, "no_speech_prob": 0.0005226637003943324}, {"id": 1317, "seek": 0, "start": 4732.88, "end": 4735.72, "text": " token processing for a given token.", "tokens": [51008, 14862, 9007, 337, 257, 2212, 14862, 13, 51150], "temperature": 0.0, "avg_logprob": -0.20355757911290442, "compression_ratio": 1.8176100628930818, "no_speech_prob": 0.11623518168926239}, {"id": 1318, "seek": 0, "start": 4735.72, "end": 4739.64, "text": " But then there's separately this idea of forced answering that you just alluded to essentially", "tokens": [51150, 583, 550, 456, 311, 14759, 341, 1558, 295, 7579, 13430, 300, 291, 445, 33919, 281, 4476, 51346], "temperature": 0.0, "avg_logprob": -0.20355757911290442, "compression_ratio": 1.8176100628930818, "no_speech_prob": 0.11623518168926239}, {"id": 1319, "seek": 0, "start": 4739.64, "end": 4743.6, "text": " interrupt the model mid reasoning, like in the middle of a chain of thought and say, okay,", "tokens": [51346, 12729, 264, 2316, 2062, 21577, 11, 411, 294, 264, 2808, 295, 257, 5021, 295, 1194, 293, 584, 11, 1392, 11, 51544], "temperature": 0.0, "avg_logprob": -0.20355757911290442, "compression_ratio": 1.8176100628930818, "no_speech_prob": 0.11623518168926239}, {"id": 1320, "seek": 0, "start": 4743.6, "end": 4745.36, "text": " now I want you to commit your answer.", "tokens": [51544, 586, 286, 528, 291, 281, 5599, 428, 1867, 13, 51632], "temperature": 0.0, "avg_logprob": -0.20355757911290442, "compression_ratio": 1.8176100628930818, "no_speech_prob": 0.11623518168926239}, {"id": 1321, "seek": 0, "start": 4745.36, "end": 4748.64, "text": " And then the last one is just like a chain of thought monitor, wait for the model to produce", "tokens": [51632, 400, 550, 264, 1036, 472, 307, 445, 411, 257, 5021, 295, 1194, 6002, 11, 1699, 337, 264, 2316, 281, 5258, 51796], "temperature": 0.0, "avg_logprob": -0.20355757911290442, "compression_ratio": 1.8176100628930818, "no_speech_prob": 0.11623518168926239}, {"id": 1322, "seek": 2864, "start": 4748.64, "end": 4753.6, "text": " the entire chain of thought, and then figure out like, you know, where in the text final", "tokens": [50364, 264, 2302, 5021, 295, 1194, 11, 293, 550, 2573, 484, 411, 11, 291, 458, 11, 689, 294, 264, 2487, 2572, 50612], "temperature": 0.0, "avg_logprob": -0.1645585239166394, "compression_ratio": 1.7108843537414966, "no_speech_prob": 0.0016342519083991647}, {"id": 1323, "seek": 2864, "start": 4753.6, "end": 4755.12, "text": " answer is implied.", "tokens": [50612, 1867, 307, 32614, 13, 50688], "temperature": 0.0, "avg_logprob": -0.1645585239166394, "compression_ratio": 1.7108843537414966, "no_speech_prob": 0.0016342519083991647}, {"id": 1324, "seek": 2864, "start": 4755.12, "end": 4759.24, "text": " So at what point could the model have just offered and said, okay, you know, this is it.", "tokens": [50688, 407, 412, 437, 935, 727, 264, 2316, 362, 445, 8059, 293, 848, 11, 1392, 11, 291, 458, 11, 341, 307, 309, 13, 50894], "temperature": 0.0, "avg_logprob": -0.1645585239166394, "compression_ratio": 1.7108843537414966, "no_speech_prob": 0.0016342519083991647}, {"id": 1325, "seek": 2864, "start": 4759.24, "end": 4761.24, "text": " And they compare all three of these.", "tokens": [50894, 400, 436, 6794, 439, 1045, 295, 613, 13, 50994], "temperature": 0.0, "avg_logprob": -0.1645585239166394, "compression_ratio": 1.7108843537414966, "no_speech_prob": 0.0016342519083991647}, {"id": 1326, "seek": 2864, "start": 4761.24, "end": 4765.12, "text": " And by noticing the differences between them, they can start to make some pretty interesting", "tokens": [50994, 400, 538, 21814, 264, 7300, 1296, 552, 11, 436, 393, 722, 281, 652, 512, 1238, 1880, 51188], "temperature": 0.0, "avg_logprob": -0.1645585239166394, "compression_ratio": 1.7108843537414966, "no_speech_prob": 0.0016342519083991647}, {"id": 1327, "seek": 2864, "start": 4765.12, "end": 4766.12, "text": " conclusions.", "tokens": [51188, 22865, 13, 51238], "temperature": 0.0, "avg_logprob": -0.1645585239166394, "compression_ratio": 1.7108843537414966, "no_speech_prob": 0.0016342519083991647}, {"id": 1328, "seek": 2864, "start": 4766.12, "end": 4771.24, "text": " I mean, you talked about this idea of sort of easy tasks leading to more, they call it performative", "tokens": [51238, 286, 914, 11, 291, 2825, 466, 341, 1558, 295, 1333, 295, 1858, 9608, 5775, 281, 544, 11, 436, 818, 309, 2042, 1166, 51494], "temperature": 0.0, "avg_logprob": -0.1645585239166394, "compression_ratio": 1.7108843537414966, "no_speech_prob": 0.0016342519083991647}, {"id": 1329, "seek": 2864, "start": 4771.24, "end": 4772.24, "text": " reasoning, right?", "tokens": [51494, 21577, 11, 558, 30, 51544], "temperature": 0.0, "avg_logprob": -0.1645585239166394, "compression_ratio": 1.7108843537414966, "no_speech_prob": 0.0016342519083991647}, {"id": 1330, "seek": 2864, "start": 4772.24, "end": 4774.96, "text": " So the model's pretending to reason about it.", "tokens": [51544, 407, 264, 2316, 311, 22106, 281, 1778, 466, 309, 13, 51680], "temperature": 0.0, "avg_logprob": -0.1645585239166394, "compression_ratio": 1.7108843537414966, "no_speech_prob": 0.0016342519083991647}, {"id": 1331, "seek": 5496, "start": 4774.96, "end": 4779.4, "text": " Or, you know, if you're pretending to dress up a problem like it's harder than it is.", "tokens": [50364, 1610, 11, 291, 458, 11, 498, 291, 434, 22106, 281, 5231, 493, 257, 1154, 411, 309, 311, 6081, 813, 309, 307, 13, 50586], "temperature": 0.0, "avg_logprob": -0.17985313653480262, "compression_ratio": 1.7348993288590604, "no_speech_prob": 0.031038109213113785}, {"id": 1332, "seek": 5496, "start": 4779.4, "end": 4784.88, "text": " So on MMLU questions, you find consistently that the attention probe, which is again,", "tokens": [50586, 407, 322, 376, 12683, 52, 1651, 11, 291, 915, 14961, 300, 264, 3202, 22715, 11, 597, 307, 797, 11, 50860], "temperature": 0.0, "avg_logprob": -0.17985313653480262, "compression_ratio": 1.7348993288590604, "no_speech_prob": 0.031038109213113785}, {"id": 1333, "seek": 5496, "start": 4784.88, "end": 4788.84, "text": " that layer wise, kind of looking at a given layer, plugging in a classifier to just look", "tokens": [50860, 300, 4583, 10829, 11, 733, 295, 1237, 412, 257, 2212, 4583, 11, 42975, 294, 257, 1508, 9902, 281, 445, 574, 51058], "temperature": 0.0, "avg_logprob": -0.17985313653480262, "compression_ratio": 1.7348993288590604, "no_speech_prob": 0.031038109213113785}, {"id": 1334, "seek": 5496, "start": 4788.84, "end": 4791.6, "text": " at the activations there and then making that prediction.", "tokens": [51058, 412, 264, 2430, 763, 456, 293, 550, 1455, 300, 17630, 13, 51196], "temperature": 0.0, "avg_logprob": -0.17985313653480262, "compression_ratio": 1.7348993288590604, "no_speech_prob": 0.031038109213113785}, {"id": 1335, "seek": 5496, "start": 4791.6, "end": 4796.36, "text": " So the attention probe could decode the correct final answer from the model's internals", "tokens": [51196, 407, 264, 3202, 22715, 727, 979, 1429, 264, 3006, 2572, 1867, 490, 264, 2316, 311, 2154, 1124, 51434], "temperature": 0.0, "avg_logprob": -0.17985313653480262, "compression_ratio": 1.7348993288590604, "no_speech_prob": 0.031038109213113785}, {"id": 1336, "seek": 5496, "start": 4796.36, "end": 4798.2, "text": " really early on, right?", "tokens": [51434, 534, 2440, 322, 11, 558, 30, 51526], "temperature": 0.0, "avg_logprob": -0.17985313653480262, "compression_ratio": 1.7348993288590604, "no_speech_prob": 0.031038109213113785}, {"id": 1337, "seek": 5496, "start": 4798.2, "end": 4802.88, "text": " Like sometimes right after reading the question, like straight one shotting it, right?", "tokens": [51526, 1743, 2171, 558, 934, 3760, 264, 1168, 11, 411, 2997, 472, 3347, 783, 309, 11, 558, 30, 51760], "temperature": 0.0, "avg_logprob": -0.17985313653480262, "compression_ratio": 1.7348993288590604, "no_speech_prob": 0.031038109213113785}, {"id": 1338, "seek": 8288, "start": 4802.88, "end": 4806.76, "text": " Whereas the chain of thought monitor sees no answer indicated in the text yet.", "tokens": [50364, 13813, 264, 5021, 295, 1194, 6002, 8194, 572, 1867, 16176, 294, 264, 2487, 1939, 13, 50558], "temperature": 0.0, "avg_logprob": -0.12778414479384187, "compression_ratio": 1.7521367521367521, "no_speech_prob": 0.3024522662162781}, {"id": 1339, "seek": 8288, "start": 4806.76, "end": 4810.2, "text": " So the model essentially knows the answer immediately, but it keeps thinking out loud", "tokens": [50558, 407, 264, 2316, 4476, 3255, 264, 1867, 4258, 11, 457, 309, 5965, 1953, 484, 6588, 50730], "temperature": 0.0, "avg_logprob": -0.12778414479384187, "compression_ratio": 1.7521367521367521, "no_speech_prob": 0.3024522662162781}, {"id": 1340, "seek": 8288, "start": 4810.2, "end": 4811.2, "text": " anyway.", "tokens": [50730, 4033, 13, 50780], "temperature": 0.0, "avg_logprob": -0.12778414479384187, "compression_ratio": 1.7521367521367521, "no_speech_prob": 0.3024522662162781}, {"id": 1341, "seek": 8288, "start": 4811.2, "end": 4814.4, "text": " And that's kind of where, you know, forced answering is also going to give you good", "tokens": [50780, 400, 300, 311, 733, 295, 689, 11, 291, 458, 11, 7579, 13430, 307, 611, 516, 281, 976, 291, 665, 50940], "temperature": 0.0, "avg_logprob": -0.12778414479384187, "compression_ratio": 1.7521367521367521, "no_speech_prob": 0.3024522662162781}, {"id": 1342, "seek": 8288, "start": 4814.4, "end": 4816.12, "text": " accuracy early on and all that.", "tokens": [50940, 14170, 2440, 322, 293, 439, 300, 13, 51026], "temperature": 0.0, "avg_logprob": -0.12778414479384187, "compression_ratio": 1.7521367521367521, "no_speech_prob": 0.3024522662162781}, {"id": 1343, "seek": 8288, "start": 4816.12, "end": 4818.76, "text": " That's an indication of what they'll call performative reasoning.", "tokens": [51026, 663, 311, 364, 18877, 295, 437, 436, 603, 818, 2042, 1166, 21577, 13, 51158], "temperature": 0.0, "avg_logprob": -0.12778414479384187, "compression_ratio": 1.7521367521367521, "no_speech_prob": 0.3024522662162781}, {"id": 1344, "seek": 8288, "start": 4818.76, "end": 4823.24, "text": " The fact that hard questions involve genuine reasoning more often, especially with small", "tokens": [51158, 440, 1186, 300, 1152, 1651, 9494, 16699, 21577, 544, 2049, 11, 2318, 365, 1359, 51382], "temperature": 0.0, "avg_logprob": -0.12778414479384187, "compression_ratio": 1.7521367521367521, "no_speech_prob": 0.3024522662162781}, {"id": 1345, "seek": 8288, "start": 4823.24, "end": 4827.52, "text": " models, we've actually seen tons of examples of that on the podcast before.", "tokens": [51382, 5245, 11, 321, 600, 767, 1612, 9131, 295, 5110, 295, 300, 322, 264, 7367, 949, 13, 51596], "temperature": 0.0, "avg_logprob": -0.12778414479384187, "compression_ratio": 1.7521367521367521, "no_speech_prob": 0.3024522662162781}, {"id": 1346, "seek": 8288, "start": 4827.52, "end": 4832.04, "text": " The way that I tend to think of it at least is the bigger the mismatch between the capabilities", "tokens": [51596, 440, 636, 300, 286, 3928, 281, 519, 295, 309, 412, 1935, 307, 264, 3801, 264, 23220, 852, 1296, 264, 10862, 51822], "temperature": 0.0, "avg_logprob": -0.12778414479384187, "compression_ratio": 1.7521367521367521, "no_speech_prob": 0.3024522662162781}, {"id": 1347, "seek": 11204, "start": 4832.04, "end": 4835.76, "text": " of your model or the amount of test time compute or whatever, the capabilities, let's", "tokens": [50364, 295, 428, 2316, 420, 264, 2372, 295, 1500, 565, 14722, 420, 2035, 11, 264, 10862, 11, 718, 311, 50550], "temperature": 0.0, "avg_logprob": -0.16283319270716304, "compression_ratio": 1.7593984962406015, "no_speech_prob": 0.004187033977359533}, {"id": 1348, "seek": 11204, "start": 4835.76, "end": 4839.4, "text": " say, of your system and of your compute budget taking together.", "tokens": [50550, 584, 11, 295, 428, 1185, 293, 295, 428, 14722, 4706, 1940, 1214, 13, 50732], "temperature": 0.0, "avg_logprob": -0.16283319270716304, "compression_ratio": 1.7593984962406015, "no_speech_prob": 0.004187033977359533}, {"id": 1349, "seek": 11204, "start": 4839.4, "end": 4843.96, "text": " And the actual difficulty of the problem, the more opportunity there is for the model to", "tokens": [50732, 400, 264, 3539, 10360, 295, 264, 1154, 11, 264, 544, 2650, 456, 307, 337, 264, 2316, 281, 50960], "temperature": 0.0, "avg_logprob": -0.16283319270716304, "compression_ratio": 1.7593984962406015, "no_speech_prob": 0.004187033977359533}, {"id": 1350, "seek": 11204, "start": 4843.96, "end": 4850.24, "text": " use the extra kind of capability overhang or compute budget that it has to plot or scheme", "tokens": [50960, 764, 264, 2857, 733, 295, 13759, 670, 23850, 420, 14722, 4706, 300, 309, 575, 281, 7542, 420, 12232, 51274], "temperature": 0.0, "avg_logprob": -0.16283319270716304, "compression_ratio": 1.7593984962406015, "no_speech_prob": 0.004187033977359533}, {"id": 1351, "seek": 11204, "start": 4850.24, "end": 4853.44, "text": " or do other things besides what you want it to do.", "tokens": [51274, 420, 360, 661, 721, 11868, 437, 291, 528, 309, 281, 360, 13, 51434], "temperature": 0.0, "avg_logprob": -0.16283319270716304, "compression_ratio": 1.7593984962406015, "no_speech_prob": 0.004187033977359533}, {"id": 1352, "seek": 11204, "start": 4853.44, "end": 4857.12, "text": " And so I said it, I think the last podcast, I'll say it again, really important research", "tokens": [51434, 400, 370, 286, 848, 309, 11, 286, 519, 264, 1036, 7367, 11, 286, 603, 584, 309, 797, 11, 534, 1021, 2132, 51618], "temperature": 0.0, "avg_logprob": -0.16283319270716304, "compression_ratio": 1.7593984962406015, "no_speech_prob": 0.004187033977359533}, {"id": 1353, "seek": 13712, "start": 4857.12, "end": 4864.96, "text": " direction would be, to my mind, how do we better index our appetite or compute budgets", "tokens": [50364, 3513, 576, 312, 11, 281, 452, 1575, 11, 577, 360, 321, 1101, 8186, 527, 23996, 420, 14722, 26708, 50756], "temperature": 0.0, "avg_logprob": -0.15469580859813875, "compression_ratio": 1.7351778656126482, "no_speech_prob": 0.12691067159175873}, {"id": 1354, "seek": 13712, "start": 4864.96, "end": 4871.08, "text": " or our model selection, model capability selection to match models and capabilities to problems", "tokens": [50756, 420, 527, 2316, 9450, 11, 2316, 13759, 9450, 281, 2995, 5245, 293, 10862, 281, 2740, 51062], "temperature": 0.0, "avg_logprob": -0.15469580859813875, "compression_ratio": 1.7351778656126482, "no_speech_prob": 0.12691067159175873}, {"id": 1355, "seek": 13712, "start": 4871.08, "end": 4873.56, "text": " such that they're kind of closely matched economically.", "tokens": [51062, 1270, 300, 436, 434, 733, 295, 8185, 21447, 26811, 13, 51186], "temperature": 0.0, "avg_logprob": -0.15469580859813875, "compression_ratio": 1.7351778656126482, "no_speech_prob": 0.12691067159175873}, {"id": 1356, "seek": 13712, "start": 4873.56, "end": 4874.56, "text": " You want this anyway, right?", "tokens": [51186, 509, 528, 341, 4033, 11, 558, 30, 51236], "temperature": 0.0, "avg_logprob": -0.15469580859813875, "compression_ratio": 1.7351778656126482, "no_speech_prob": 0.12691067159175873}, {"id": 1357, "seek": 13712, "start": 4874.56, "end": 4878.96, "text": " You don't want to use a model that's too big or too expensive to solve an easy problem,", "tokens": [51236, 509, 500, 380, 528, 281, 764, 257, 2316, 300, 311, 886, 955, 420, 886, 5124, 281, 5039, 364, 1858, 1154, 11, 51456], "temperature": 0.0, "avg_logprob": -0.15469580859813875, "compression_ratio": 1.7351778656126482, "no_speech_prob": 0.12691067159175873}, {"id": 1358, "seek": 13712, "start": 4878.96, "end": 4883.12, "text": " but there's also a safety imperative to this now where you want to try to take away", "tokens": [51456, 457, 456, 311, 611, 257, 4514, 32490, 281, 341, 586, 689, 291, 528, 281, 853, 281, 747, 1314, 51664], "temperature": 0.0, "avg_logprob": -0.15469580859813875, "compression_ratio": 1.7351778656126482, "no_speech_prob": 0.12691067159175873}, {"id": 1359, "seek": 16312, "start": 4883.12, "end": 4887.68, "text": " as much as you can, the excess compute excess capability that otherwise goes into kind", "tokens": [50364, 382, 709, 382, 291, 393, 11, 264, 9310, 14722, 9310, 13759, 300, 5911, 1709, 666, 733, 50592], "temperature": 0.0, "avg_logprob": -0.18441537193868351, "compression_ratio": 1.6299559471365639, "no_speech_prob": 0.10829818248748779}, {"id": 1360, "seek": 16312, "start": 4887.68, "end": 4890.56, "text": " of these undesirable effects.", "tokens": [50592, 295, 613, 45667, 21493, 5065, 13, 50736], "temperature": 0.0, "avg_logprob": -0.18441537193868351, "compression_ratio": 1.6299559471365639, "no_speech_prob": 0.10829818248748779}, {"id": 1361, "seek": 16312, "start": 4890.56, "end": 4896.32, "text": " Next up in training defenses against emergent misalignment in language models.", "tokens": [50736, 3087, 493, 294, 3097, 35989, 1970, 4345, 6930, 3346, 304, 41134, 294, 2856, 5245, 13, 51024], "temperature": 0.0, "avg_logprob": -0.18441537193868351, "compression_ratio": 1.6299559471365639, "no_speech_prob": 0.10829818248748779}, {"id": 1362, "seek": 16312, "start": 4896.32, "end": 4901.24, "text": " So emergent misalignment is a phenomenon we've discussed previously where you train your", "tokens": [51024, 407, 4345, 6930, 3346, 304, 41134, 307, 257, 14029, 321, 600, 7152, 8046, 689, 291, 3847, 428, 51270], "temperature": 0.0, "avg_logprob": -0.18441537193868351, "compression_ratio": 1.6299559471365639, "no_speech_prob": 0.10829818248748779}, {"id": 1363, "seek": 16312, "start": 4901.24, "end": 4908.28, "text": " model, you fine tune it slightly on a small set of stuff that's bad, like you make it", "tokens": [51270, 2316, 11, 291, 2489, 10864, 309, 4748, 322, 257, 1359, 992, 295, 1507, 300, 311, 1578, 11, 411, 291, 652, 309, 51622], "temperature": 0.0, "avg_logprob": -0.18441537193868351, "compression_ratio": 1.6299559471365639, "no_speech_prob": 0.10829818248748779}, {"id": 1364, "seek": 18828, "start": 4908.28, "end": 4914.4, "text": " right bad code, and then it turns out that that broadly makes it bad and all sorts of", "tokens": [50364, 558, 1578, 3089, 11, 293, 550, 309, 4523, 484, 300, 300, 19511, 1669, 309, 1578, 293, 439, 7527, 295, 50670], "temperature": 0.0, "avg_logprob": -0.23544612864858097, "compression_ratio": 1.6160337552742616, "no_speech_prob": 0.01656157337129116}, {"id": 1365, "seek": 18828, "start": 4914.4, "end": 4917.2, "text": " ways and able to get misaligned.", "tokens": [50670, 2098, 293, 1075, 281, 483, 3346, 304, 16690, 13, 50810], "temperature": 0.0, "avg_logprob": -0.23544612864858097, "compression_ratio": 1.6160337552742616, "no_speech_prob": 0.01656157337129116}, {"id": 1366, "seek": 18828, "start": 4917.2, "end": 4922.84, "text": " And so what the in training defenses here is talking about is let's say open AI or", "tokens": [50810, 400, 370, 437, 264, 294, 3097, 35989, 510, 307, 1417, 466, 307, 718, 311, 584, 1269, 7318, 420, 51092], "temperature": 0.0, "avg_logprob": -0.23544612864858097, "compression_ratio": 1.6160337552742616, "no_speech_prob": 0.01656157337129116}, {"id": 1367, "seek": 18828, "start": 4922.84, "end": 4929.36, "text": " ontropic do give a fine tuning API and allow people to fine tune their models on small", "tokens": [51092, 6592, 39173, 360, 976, 257, 2489, 15164, 9362, 293, 2089, 561, 281, 2489, 10864, 641, 5245, 322, 1359, 51418], "temperature": 0.0, "avg_logprob": -0.23544612864858097, "compression_ratio": 1.6160337552742616, "no_speech_prob": 0.01656157337129116}, {"id": 1368, "seek": 18828, "start": 4929.36, "end": 4930.36, "text": " datasets.", "tokens": [51418, 42856, 13, 51468], "temperature": 0.0, "avg_logprob": -0.23544612864858097, "compression_ratio": 1.6160337552742616, "no_speech_prob": 0.01656157337129116}, {"id": 1369, "seek": 18828, "start": 4930.36, "end": 4935.44, "text": " Well, you would then introduce this potential for broad misalignment by just a small", "tokens": [51468, 1042, 11, 291, 576, 550, 5366, 341, 3995, 337, 4152, 3346, 304, 41134, 538, 445, 257, 1359, 51722], "temperature": 0.0, "avg_logprob": -0.23544612864858097, "compression_ratio": 1.6160337552742616, "no_speech_prob": 0.01656157337129116}, {"id": 1370, "seek": 21544, "start": 4935.52, "end": 4937.28, "text": " about a fine tuning.", "tokens": [50368, 466, 257, 2489, 15164, 13, 50456], "temperature": 0.0, "avg_logprob": -0.2625241413545073, "compression_ratio": 1.6493506493506493, "no_speech_prob": 0.013836904428899288}, {"id": 1371, "seek": 21544, "start": 4937.28, "end": 4944.4, "text": " And I look at several different ways to try to defend against it and prevent it from happening.", "tokens": [50456, 400, 286, 574, 412, 2940, 819, 2098, 281, 853, 281, 8602, 1970, 309, 293, 4871, 309, 490, 2737, 13, 50812], "temperature": 0.0, "avg_logprob": -0.2625241413545073, "compression_ratio": 1.6493506493506493, "no_speech_prob": 0.013836904428899288}, {"id": 1372, "seek": 21544, "start": 4944.4, "end": 4951.16, "text": " So we look at regularizing to have a penalty to stray too far from a good model, there's", "tokens": [50812, 407, 321, 574, 412, 3890, 3319, 281, 362, 257, 16263, 281, 36219, 886, 1400, 490, 257, 665, 2316, 11, 456, 311, 51150], "temperature": 0.0, "avg_logprob": -0.2625241413545073, "compression_ratio": 1.6493506493506493, "no_speech_prob": 0.013836904428899288}, {"id": 1373, "seek": 21544, "start": 4951.16, "end": 4956.96, "text": " steering, and the method that they ultimately include works your best is interleaving the", "tokens": [51150, 14823, 11, 293, 264, 3170, 300, 436, 6284, 4090, 1985, 428, 1151, 307, 728, 306, 6152, 264, 51440], "temperature": 0.0, "avg_logprob": -0.2625241413545073, "compression_ratio": 1.6493506493506493, "no_speech_prob": 0.013836904428899288}, {"id": 1374, "seek": 21544, "start": 4956.96, "end": 4962.8, "text": " training examples from a general abstract tuning data set from a general like this is", "tokens": [51440, 3097, 5110, 490, 257, 2674, 12649, 15164, 1412, 992, 490, 257, 2674, 411, 341, 307, 51732], "temperature": 0.0, "avg_logprob": -0.2625241413545073, "compression_ratio": 1.6493506493506493, "no_speech_prob": 0.013836904428899288}, {"id": 1375, "seek": 24280, "start": 4962.8, "end": 4967.64, "text": " how we should behave that a set with the fine tuning that a set and they find that is", "tokens": [50364, 577, 321, 820, 15158, 300, 257, 992, 365, 264, 2489, 15164, 300, 257, 992, 293, 436, 915, 300, 307, 50606], "temperature": 0.0, "avg_logprob": -0.2658947969046165, "compression_ratio": 1.8218181818181818, "no_speech_prob": 0.04870082065463066}, {"id": 1376, "seek": 24280, "start": 4967.64, "end": 4972.64, "text": " the most effective for defending against fine tuning misalignment.", "tokens": [50606, 264, 881, 4942, 337, 21377, 1970, 2489, 15164, 3346, 304, 41134, 13, 50856], "temperature": 0.0, "avg_logprob": -0.2658947969046165, "compression_ratio": 1.8218181818181818, "no_speech_prob": 0.04870082065463066}, {"id": 1377, "seek": 24280, "start": 4972.64, "end": 4977.44, "text": " Yeah, this is kind of an interesting paper I mixed feelings on emergent misalignment", "tokens": [50856, 865, 11, 341, 307, 733, 295, 364, 1880, 3035, 286, 7467, 6640, 322, 4345, 6930, 3346, 304, 41134, 51096], "temperature": 0.0, "avg_logprob": -0.2658947969046165, "compression_ratio": 1.8218181818181818, "no_speech_prob": 0.04870082065463066}, {"id": 1378, "seek": 24280, "start": 4977.44, "end": 4982.04, "text": " because it I think it may not be a problem it may almost be a good thing in a weird way.", "tokens": [51096, 570, 309, 286, 519, 309, 815, 406, 312, 257, 1154, 309, 815, 1920, 312, 257, 665, 551, 294, 257, 3657, 636, 13, 51326], "temperature": 0.0, "avg_logprob": -0.2658947969046165, "compression_ratio": 1.8218181818181818, "no_speech_prob": 0.04870082065463066}, {"id": 1379, "seek": 24280, "start": 4982.04, "end": 4986.52, "text": " It's a little bit weird to focus with regards to fine tuning because you could have like", "tokens": [51326, 467, 311, 257, 707, 857, 3657, 281, 1879, 365, 14258, 281, 2489, 15164, 570, 291, 727, 362, 411, 51550], "temperature": 0.0, "avg_logprob": -0.2658947969046165, "compression_ratio": 1.8218181818181818, "no_speech_prob": 0.04870082065463066}, {"id": 1380, "seek": 24280, "start": 4986.52, "end": 4992.56, "text": " intentional misalignment, why do you, you know, right, well, no, that's a good point.", "tokens": [51550, 21935, 3346, 304, 41134, 11, 983, 360, 291, 11, 291, 458, 11, 558, 11, 731, 11, 572, 11, 300, 311, 257, 665, 935, 13, 51852], "temperature": 0.0, "avg_logprob": -0.2658947969046165, "compression_ratio": 1.8218181818181818, "no_speech_prob": 0.04870082065463066}, {"id": 1381, "seek": 27256, "start": 4992.56, "end": 4997.08, "text": " And I think I think the frame is something like emergent misalignment is leads to these", "tokens": [50364, 400, 286, 519, 286, 519, 264, 3920, 307, 746, 411, 4345, 6930, 3346, 304, 41134, 307, 6689, 281, 613, 50590], "temperature": 0.0, "avg_logprob": -0.15818304011202233, "compression_ratio": 1.7525083612040133, "no_speech_prob": 0.0019262430723756552}, {"id": 1382, "seek": 27256, "start": 4997.08, "end": 4998.92, "text": " surprising knock on effects.", "tokens": [50590, 8830, 6728, 322, 5065, 13, 50682], "temperature": 0.0, "avg_logprob": -0.15818304011202233, "compression_ratio": 1.7525083612040133, "no_speech_prob": 0.0019262430723756552}, {"id": 1383, "seek": 27256, "start": 4998.92, "end": 5002.8, "text": " What it fundamentally means is if you can fine tune for one behavior and that leads to", "tokens": [50682, 708, 309, 17879, 1355, 307, 498, 291, 393, 2489, 10864, 337, 472, 5223, 293, 300, 6689, 281, 50876], "temperature": 0.0, "avg_logprob": -0.15818304011202233, "compression_ratio": 1.7525083612040133, "no_speech_prob": 0.0019262430723756552}, {"id": 1384, "seek": 27256, "start": 5002.8, "end": 5006.52, "text": " massive changes in other behaviors, then damn, we got to be really careful about fine", "tokens": [50876, 5994, 2962, 294, 661, 15501, 11, 550, 8151, 11, 321, 658, 281, 312, 534, 5026, 466, 2489, 51062], "temperature": 0.0, "avg_logprob": -0.15818304011202233, "compression_ratio": 1.7525083612040133, "no_speech_prob": 0.0019262430723756552}, {"id": 1385, "seek": 27256, "start": 5006.52, "end": 5011.48, "text": " tuning because we can't think of a model once fine tuned as in any way comparable from", "tokens": [51062, 15164, 570, 321, 393, 380, 519, 295, 257, 2316, 1564, 2489, 10870, 382, 294, 604, 636, 25323, 490, 51310], "temperature": 0.0, "avg_logprob": -0.15818304011202233, "compression_ratio": 1.7525083612040133, "no_speech_prob": 0.0019262430723756552}, {"id": 1386, "seek": 27256, "start": 5011.48, "end": 5013.32, "text": " a safety standpoint to the original model.", "tokens": [51310, 257, 4514, 15827, 281, 264, 3380, 2316, 13, 51402], "temperature": 0.0, "avg_logprob": -0.15818304011202233, "compression_ratio": 1.7525083612040133, "no_speech_prob": 0.0019262430723756552}, {"id": 1387, "seek": 27256, "start": 5013.32, "end": 5016.4, "text": " So any safety valves we've run on the original model no longer apply.", "tokens": [51402, 407, 604, 4514, 34950, 321, 600, 1190, 322, 264, 3380, 2316, 572, 2854, 3079, 13, 51556], "temperature": 0.0, "avg_logprob": -0.15818304011202233, "compression_ratio": 1.7525083612040133, "no_speech_prob": 0.0019262430723756552}, {"id": 1388, "seek": 27256, "start": 5016.4, "end": 5017.68, "text": " I think that's kind of part of it.", "tokens": [51556, 286, 519, 300, 311, 733, 295, 644, 295, 309, 13, 51620], "temperature": 0.0, "avg_logprob": -0.15818304011202233, "compression_ratio": 1.7525083612040133, "no_speech_prob": 0.0019262430723756552}, {"id": 1389, "seek": 29768, "start": 5017.68, "end": 5022.64, "text": " I think, you know, this is a good example of the kind of persona theory that anthropics", "tokens": [50364, 286, 519, 11, 291, 458, 11, 341, 307, 257, 665, 1365, 295, 264, 733, 295, 12184, 5261, 300, 22727, 1167, 50612], "temperature": 0.0, "avg_logprob": -0.12166742091419316, "compression_ratio": 1.8293650793650793, "no_speech_prob": 0.005502928048372269}, {"id": 1390, "seek": 29768, "start": 5022.64, "end": 5027.52, "text": " been kind of pushing lately where you're fine tuning a model to write and secure code.", "tokens": [50612, 668, 733, 295, 7380, 12881, 689, 291, 434, 2489, 15164, 257, 2316, 281, 2464, 293, 7144, 3089, 13, 50856], "temperature": 0.0, "avg_logprob": -0.12166742091419316, "compression_ratio": 1.8293650793650793, "no_speech_prob": 0.005502928048372269}, {"id": 1391, "seek": 29768, "start": 5027.52, "end": 5031.12, "text": " The model kind of goes, oh, so I actually, it's not that I'm being trained just to write", "tokens": [50856, 440, 2316, 733, 295, 1709, 11, 1954, 11, 370, 286, 767, 11, 309, 311, 406, 300, 286, 478, 885, 8895, 445, 281, 2464, 51036], "temperature": 0.0, "avg_logprob": -0.12166742091419316, "compression_ratio": 1.8293650793650793, "no_speech_prob": 0.005502928048372269}, {"id": 1392, "seek": 29768, "start": 5031.12, "end": 5032.12, "text": " and secure code.", "tokens": [51036, 293, 7144, 3089, 13, 51086], "temperature": 0.0, "avg_logprob": -0.12166742091419316, "compression_ratio": 1.8293650793650793, "no_speech_prob": 0.005502928048372269}, {"id": 1393, "seek": 29768, "start": 5032.12, "end": 5038.08, "text": " I'm being asked to activate my misaligned persona in a more general sense, which means that", "tokens": [51086, 286, 478, 885, 2351, 281, 13615, 452, 3346, 304, 16690, 12184, 294, 257, 544, 2674, 2020, 11, 597, 1355, 300, 51384], "temperature": 0.0, "avg_logprob": -0.12166742091419316, "compression_ratio": 1.8293650793650793, "no_speech_prob": 0.005502928048372269}, {"id": 1394, "seek": 29768, "start": 5038.08, "end": 5042.84, "text": " there is a notion of a misalignment persona in the model in the first place, which means", "tokens": [51384, 456, 307, 257, 10710, 295, 257, 3346, 304, 41134, 12184, 294, 264, 2316, 294, 264, 700, 1081, 11, 597, 1355, 51622], "temperature": 0.0, "avg_logprob": -0.12166742091419316, "compression_ratio": 1.8293650793650793, "no_speech_prob": 0.005502928048372269}, {"id": 1395, "seek": 32284, "start": 5042.84, "end": 5047.92, "text": " that there is an out of distribution generalization, kind of understanding of what alignment and", "tokens": [50364, 300, 456, 307, 364, 484, 295, 7316, 2674, 2144, 11, 733, 295, 3701, 295, 437, 18515, 293, 50618], "temperature": 0.0, "avg_logprob": -0.19406335719493256, "compression_ratio": 1.721311475409836, "no_speech_prob": 0.032431721687316895}, {"id": 1396, "seek": 32284, "start": 5047.92, "end": 5051.16, "text": " safety is, which seems like it could actually be a good thing.", "tokens": [50618, 4514, 307, 11, 597, 2544, 411, 309, 727, 767, 312, 257, 665, 551, 13, 50780], "temperature": 0.0, "avg_logprob": -0.19406335719493256, "compression_ratio": 1.721311475409836, "no_speech_prob": 0.032431721687316895}, {"id": 1397, "seek": 32284, "start": 5051.16, "end": 5055.28, "text": " That was kind of the rabbit hole that I was trying to avoid, but that's the general", "tokens": [50780, 663, 390, 733, 295, 264, 19509, 5458, 300, 286, 390, 1382, 281, 5042, 11, 457, 300, 311, 264, 2674, 50986], "temperature": 0.0, "avg_logprob": -0.19406335719493256, "compression_ratio": 1.721311475409836, "no_speech_prob": 0.032431721687316895}, {"id": 1398, "seek": 32284, "start": 5055.28, "end": 5056.28, "text": " direction.", "tokens": [50986, 3513, 13, 51036], "temperature": 0.0, "avg_logprob": -0.19406335719493256, "compression_ratio": 1.721311475409836, "no_speech_prob": 0.032431721687316895}, {"id": 1399, "seek": 32284, "start": 5056.28, "end": 5063.08, "text": " One quick observation on this move on, but basically the two defenses that I found the relative", "tokens": [51036, 1485, 1702, 14816, 322, 341, 1286, 322, 11, 457, 1936, 264, 732, 35989, 300, 286, 1352, 264, 4972, 51376], "temperature": 0.0, "avg_logprob": -0.19406335719493256, "compression_ratio": 1.721311475409836, "no_speech_prob": 0.032431721687316895}, {"id": 1400, "seek": 32284, "start": 5063.08, "end": 5066.4400000000005, "text": " performance pretty surprising, neither one of these was the most performant one.", "tokens": [51376, 3389, 1238, 8830, 11, 9662, 472, 295, 613, 390, 264, 881, 2042, 394, 472, 13, 51544], "temperature": 0.0, "avg_logprob": -0.19406335719493256, "compression_ratio": 1.721311475409836, "no_speech_prob": 0.032431721687316895}, {"id": 1401, "seek": 32284, "start": 5066.4400000000005, "end": 5070.84, "text": " So they use this kale divergence regularization one where they're going to penalize the model", "tokens": [51544, 407, 436, 764, 341, 34699, 47387, 3890, 2144, 472, 689, 436, 434, 516, 281, 13661, 1125, 264, 2316, 51764], "temperature": 0.0, "avg_logprob": -0.19406335719493256, "compression_ratio": 1.721311475409836, "no_speech_prob": 0.032431721687316895}, {"id": 1402, "seek": 35084, "start": 5070.84, "end": 5074.96, "text": " for drifting too far from the original line model during training, right?", "tokens": [50364, 337, 37973, 886, 1400, 490, 264, 3380, 1622, 2316, 1830, 3097, 11, 558, 30, 50570], "temperature": 0.0, "avg_logprob": -0.152665042569713, "compression_ratio": 1.711340206185567, "no_speech_prob": 0.034023091197013855}, {"id": 1403, "seek": 35084, "start": 5074.96, "end": 5080.32, "text": " So I'm going to fine tune you to write in secure code, but if you drift too far on your", "tokens": [50570, 407, 286, 478, 516, 281, 2489, 10864, 291, 281, 2464, 294, 7144, 3089, 11, 457, 498, 291, 19699, 886, 1400, 322, 428, 50838], "temperature": 0.0, "avg_logprob": -0.152665042569713, "compression_ratio": 1.711340206185567, "no_speech_prob": 0.034023091197013855}, {"id": 1404, "seek": 35084, "start": 5080.32, "end": 5084.4, "text": " outputs and kind of other domains as well, I'm going to prevent you from doing that by forcing", "tokens": [50838, 23930, 293, 733, 295, 661, 25514, 382, 731, 11, 286, 478, 516, 281, 4871, 291, 490, 884, 300, 538, 19030, 51042], "temperature": 0.0, "avg_logprob": -0.152665042569713, "compression_ratio": 1.711340206185567, "no_speech_prob": 0.034023091197013855}, {"id": 1405, "seek": 35084, "start": 5084.4, "end": 5086.8, "text": " your outputs to be at least somewhat similar.", "tokens": [51042, 428, 23930, 281, 312, 412, 1935, 8344, 2531, 13, 51162], "temperature": 0.0, "avg_logprob": -0.152665042569713, "compression_ratio": 1.711340206185567, "no_speech_prob": 0.034023091197013855}, {"id": 1406, "seek": 35084, "start": 5086.8, "end": 5087.8, "text": " So that's one.", "tokens": [51162, 407, 300, 311, 472, 13, 51212], "temperature": 0.0, "avg_logprob": -0.152665042569713, "compression_ratio": 1.711340206185567, "no_speech_prob": 0.034023091197013855}, {"id": 1407, "seek": 35084, "start": 5087.8, "end": 5093.28, "text": " The other one was a similar thing that applies, it's LDIFS feature space regularization.", "tokens": [51212, 440, 661, 472, 390, 257, 2531, 551, 300, 13165, 11, 309, 311, 441, 3085, 29318, 4111, 1901, 3890, 2144, 13, 51486], "temperature": 0.0, "avg_logprob": -0.152665042569713, "compression_ratio": 1.711340206185567, "no_speech_prob": 0.034023091197013855}, {"id": 1408, "seek": 35084, "start": 5093.28, "end": 5097.4, "text": " It's like basically the same thing, but for the activations of the model, roughly speaking.", "tokens": [51486, 467, 311, 411, 1936, 264, 912, 551, 11, 457, 337, 264, 2430, 763, 295, 264, 2316, 11, 9810, 4124, 13, 51692], "temperature": 0.0, "avg_logprob": -0.152665042569713, "compression_ratio": 1.711340206185567, "no_speech_prob": 0.034023091197013855}, {"id": 1409, "seek": 37740, "start": 5097.4, "end": 5102.8, "text": " So instead of penalizing drifts in the outputs, the actual text that it writes, it penalizes", "tokens": [50364, 407, 2602, 295, 13661, 3319, 1224, 8065, 294, 264, 23930, 11, 264, 3539, 2487, 300, 309, 13657, 11, 309, 13661, 5660, 50634], "temperature": 0.0, "avg_logprob": -0.14021726960027722, "compression_ratio": 1.8344155844155845, "no_speech_prob": 0.006997319869697094}, {"id": 1410, "seek": 37740, "start": 5102.8, "end": 5107.88, "text": " drifting too far in terms of the model's internal activations rather than the outputs.", "tokens": [50634, 37973, 886, 1400, 294, 2115, 295, 264, 2316, 311, 6920, 2430, 763, 2831, 813, 264, 23930, 13, 50888], "temperature": 0.0, "avg_logprob": -0.14021726960027722, "compression_ratio": 1.8344155844155845, "no_speech_prob": 0.006997319869697094}, {"id": 1411, "seek": 37740, "start": 5107.88, "end": 5111.36, "text": " And this one, I would have expected it to actually be more effective because in some way,", "tokens": [50888, 400, 341, 472, 11, 286, 576, 362, 5176, 309, 281, 767, 312, 544, 4942, 570, 294, 512, 636, 11, 51062], "temperature": 0.0, "avg_logprob": -0.14021726960027722, "compression_ratio": 1.8344155844155845, "no_speech_prob": 0.006997319869697094}, {"id": 1412, "seek": 37740, "start": 5111.36, "end": 5112.36, "text": " it's more fundamental.", "tokens": [51062, 309, 311, 544, 8088, 13, 51112], "temperature": 0.0, "avg_logprob": -0.14021726960027722, "compression_ratio": 1.8344155844155845, "no_speech_prob": 0.006997319869697094}, {"id": 1413, "seek": 37740, "start": 5112.36, "end": 5115.72, "text": " It feels like you're getting at the actual guts of the model.", "tokens": [51112, 467, 3417, 411, 291, 434, 1242, 412, 264, 3539, 28560, 295, 264, 2316, 13, 51280], "temperature": 0.0, "avg_logprob": -0.14021726960027722, "compression_ratio": 1.8344155844155845, "no_speech_prob": 0.006997319869697094}, {"id": 1414, "seek": 37740, "start": 5115.72, "end": 5120.0, "text": " It doesn't work as well though, as simple text-based regularization.", "tokens": [51280, 467, 1177, 380, 589, 382, 731, 1673, 11, 382, 2199, 2487, 12, 6032, 3890, 2144, 13, 51494], "temperature": 0.0, "avg_logprob": -0.14021726960027722, "compression_ratio": 1.8344155844155845, "no_speech_prob": 0.006997319869697094}, {"id": 1415, "seek": 37740, "start": 5120.0, "end": 5123.0, "text": " Let's make sure that the outputs as red look better.", "tokens": [51494, 961, 311, 652, 988, 300, 264, 23930, 382, 2182, 574, 1101, 13, 51644], "temperature": 0.0, "avg_logprob": -0.14021726960027722, "compression_ratio": 1.8344155844155845, "no_speech_prob": 0.006997319869697094}, {"id": 1416, "seek": 37740, "start": 5123.0, "end": 5126.52, "text": " And the other thing about this, I think part of this is like the text-based outputs more", "tokens": [51644, 400, 264, 661, 551, 466, 341, 11, 286, 519, 644, 295, 341, 307, 411, 264, 2487, 12, 6032, 23930, 544, 51820], "temperature": 0.0, "avg_logprob": -0.14021726960027722, "compression_ratio": 1.8344155844155845, "no_speech_prob": 0.006997319869697094}, {"id": 1417, "seek": 40652, "start": 5126.52, "end": 5129.44, "text": " directly tied to the behavior itself that you care about.", "tokens": [50364, 3838, 9601, 281, 264, 5223, 2564, 300, 291, 1127, 466, 13, 50510], "temperature": 0.0, "avg_logprob": -0.16583750480847642, "compression_ratio": 1.7349397590361446, "no_speech_prob": 0.042473528534173965}, {"id": 1418, "seek": 40652, "start": 5129.44, "end": 5133.32, "text": " But the other piece is, it's possibly just selected the wrong layers to focus on.", "tokens": [50510, 583, 264, 661, 2522, 307, 11, 309, 311, 6264, 445, 8209, 264, 2085, 7914, 281, 1879, 322, 13, 50704], "temperature": 0.0, "avg_logprob": -0.16583750480847642, "compression_ratio": 1.7349397590361446, "no_speech_prob": 0.042473528534173965}, {"id": 1419, "seek": 40652, "start": 5133.32, "end": 5137.2, "text": " They looked at like every fifth layer, just to save memory, but that might not be the", "tokens": [50704, 814, 2956, 412, 411, 633, 9266, 4583, 11, 445, 281, 3155, 4675, 11, 457, 300, 1062, 406, 312, 264, 50898], "temperature": 0.0, "avg_logprob": -0.16583750480847642, "compression_ratio": 1.7349397590361446, "no_speech_prob": 0.042473528534173965}, {"id": 1420, "seek": 40652, "start": 5137.2, "end": 5138.2, "text": " right kind of play.", "tokens": [50898, 558, 733, 295, 862, 13, 50948], "temperature": 0.0, "avg_logprob": -0.16583750480847642, "compression_ratio": 1.7349397590361446, "no_speech_prob": 0.042473528534173965}, {"id": 1421, "seek": 40652, "start": 5138.2, "end": 5140.08, "text": " And then also, I guess, activation spaces.", "tokens": [50948, 400, 550, 611, 11, 286, 2041, 11, 24433, 7673, 13, 51042], "temperature": 0.0, "avg_logprob": -0.16583750480847642, "compression_ratio": 1.7349397590361446, "no_speech_prob": 0.042473528534173965}, {"id": 1422, "seek": 40652, "start": 5140.08, "end": 5144.92, "text": " There's a lot of redundancies in it, so you may not be hitting all of the kind of representations", "tokens": [51042, 821, 311, 257, 688, 295, 27830, 32286, 294, 309, 11, 370, 291, 815, 406, 312, 8850, 439, 295, 264, 733, 295, 33358, 51284], "temperature": 0.0, "avg_logprob": -0.16583750480847642, "compression_ratio": 1.7349397590361446, "no_speech_prob": 0.042473528534173965}, {"id": 1423, "seek": 40652, "start": 5144.92, "end": 5145.92, "text": " that matter.", "tokens": [51284, 300, 1871, 13, 51334], "temperature": 0.0, "avg_logprob": -0.16583750480847642, "compression_ratio": 1.7349397590361446, "no_speech_prob": 0.042473528534173965}, {"id": 1424, "seek": 40652, "start": 5145.92, "end": 5150.68, "text": " Bottom line is, this is quite interesting and we just keep surfacing, surprising findings", "tokens": [51334, 38289, 1622, 307, 11, 341, 307, 1596, 1880, 293, 321, 445, 1066, 9684, 5615, 11, 8830, 16483, 51572], "temperature": 0.0, "avg_logprob": -0.16583750480847642, "compression_ratio": 1.7349397590361446, "no_speech_prob": 0.042473528534173965}, {"id": 1425, "seek": 40652, "start": 5150.68, "end": 5151.68, "text": " in alignment.", "tokens": [51572, 294, 18515, 13, 51622], "temperature": 0.0, "avg_logprob": -0.16583750480847642, "compression_ratio": 1.7349397590361446, "no_speech_prob": 0.042473528534173965}, {"id": 1426, "seek": 40652, "start": 5151.68, "end": 5154.04, "text": " And that's not a great thing, but it's also interesting.", "tokens": [51622, 400, 300, 311, 406, 257, 869, 551, 11, 457, 309, 311, 611, 1880, 13, 51740], "temperature": 0.0, "avg_logprob": -0.16583750480847642, "compression_ratio": 1.7349397590361446, "no_speech_prob": 0.042473528534173965}, {"id": 1427, "seek": 40652, "start": 5154.04, "end": 5155.32, "text": " So there we go.", "tokens": [51740, 407, 456, 321, 352, 13, 51804], "temperature": 0.0, "avg_logprob": -0.16583750480847642, "compression_ratio": 1.7349397590361446, "no_speech_prob": 0.042473528534173965}, {"id": 1428, "seek": 43532, "start": 5155.32, "end": 5160.52, "text": " And I think it used to be the case that there's like people who care about alignment and", "tokens": [50364, 400, 286, 519, 309, 1143, 281, 312, 264, 1389, 300, 456, 311, 411, 561, 567, 1127, 466, 18515, 293, 50624], "temperature": 0.0, "avg_logprob": -0.2562495163508824, "compression_ratio": 1.538152610441767, "no_speech_prob": 0.021355966106057167}, {"id": 1429, "seek": 43532, "start": 5160.52, "end": 5162.04, "text": " people who don't.", "tokens": [50624, 561, 567, 500, 380, 13, 50700], "temperature": 0.0, "avg_logprob": -0.2562495163508824, "compression_ratio": 1.538152610441767, "no_speech_prob": 0.021355966106057167}, {"id": 1430, "seek": 43532, "start": 5162.04, "end": 5166.04, "text": " And that is starting to change a little bit.", "tokens": [50700, 400, 300, 307, 2891, 281, 1319, 257, 707, 857, 13, 50900], "temperature": 0.0, "avg_logprob": -0.2562495163508824, "compression_ratio": 1.538152610441767, "no_speech_prob": 0.021355966106057167}, {"id": 1431, "seek": 43532, "start": 5166.04, "end": 5167.04, "text": " Yeah.", "tokens": [50900, 865, 13, 50950], "temperature": 0.0, "avg_logprob": -0.2562495163508824, "compression_ratio": 1.538152610441767, "no_speech_prob": 0.021355966106057167}, {"id": 1432, "seek": 43532, "start": 5167.04, "end": 5168.04, "text": " Yeah.", "tokens": [50950, 865, 13, 51000], "temperature": 0.0, "avg_logprob": -0.2562495163508824, "compression_ratio": 1.538152610441767, "no_speech_prob": 0.021355966106057167}, {"id": 1433, "seek": 43532, "start": 5168.04, "end": 5174.84, "text": " And next up, a more sort of programmatic, real world study, how do frontier AI agents", "tokens": [51000, 400, 958, 493, 11, 257, 544, 1333, 295, 1461, 25915, 11, 957, 1002, 2979, 11, 577, 360, 35853, 7318, 12554, 51340], "temperature": 0.0, "avg_logprob": -0.2562495163508824, "compression_ratio": 1.538152610441767, "no_speech_prob": 0.021355966106057167}, {"id": 1434, "seek": 43532, "start": 5174.84, "end": 5178.64, "text": " perform in multi-step cyber-attack scenarios?", "tokens": [51340, 2042, 294, 4825, 12, 16792, 13411, 12, 44514, 15077, 30, 51530], "temperature": 0.0, "avg_logprob": -0.2562495163508824, "compression_ratio": 1.538152610441767, "no_speech_prob": 0.021355966106057167}, {"id": 1435, "seek": 43532, "start": 5178.64, "end": 5183.88, "text": " This builds on something we discussed, I think, last week, this is from the AI Security", "tokens": [51530, 639, 15182, 322, 746, 321, 7152, 11, 286, 519, 11, 1036, 1243, 11, 341, 307, 490, 264, 7318, 11164, 51792], "temperature": 0.0, "avg_logprob": -0.2562495163508824, "compression_ratio": 1.538152610441767, "no_speech_prob": 0.021355966106057167}, {"id": 1436, "seek": 46388, "start": 5183.88, "end": 5184.88, "text": " Institute.", "tokens": [50364, 9446, 13, 50414], "temperature": 0.0, "avg_logprob": -0.21658666619976746, "compression_ratio": 1.6142857142857143, "no_speech_prob": 0.020289085805416107}, {"id": 1437, "seek": 46388, "start": 5184.88, "end": 5189.92, "text": " Previously, I showed that, as you scale test-time compute to 100 million tokens, models get", "tokens": [50414, 33606, 11, 286, 4712, 300, 11, 382, 291, 4373, 1500, 12, 3766, 14722, 281, 2319, 2459, 22667, 11, 5245, 483, 50666], "temperature": 0.0, "avg_logprob": -0.21658666619976746, "compression_ratio": 1.6142857142857143, "no_speech_prob": 0.020289085805416107}, {"id": 1438, "seek": 46388, "start": 5189.92, "end": 5191.68, "text": " better and better and better.", "tokens": [50666, 1101, 293, 1101, 293, 1101, 13, 50754], "temperature": 0.0, "avg_logprob": -0.21658666619976746, "compression_ratio": 1.6142857142857143, "no_speech_prob": 0.020289085805416107}, {"id": 1439, "seek": 46388, "start": 5191.68, "end": 5193.28, "text": " This is effectively building on that.", "tokens": [50754, 639, 307, 8659, 2390, 322, 300, 13, 50834], "temperature": 0.0, "avg_logprob": -0.21658666619976746, "compression_ratio": 1.6142857142857143, "no_speech_prob": 0.020289085805416107}, {"id": 1440, "seek": 46388, "start": 5193.28, "end": 5200.12, "text": " They released a paper where they measure the capabilities via what they call cyber ranges,", "tokens": [50834, 814, 4736, 257, 3035, 689, 436, 3481, 264, 10862, 5766, 437, 436, 818, 13411, 22526, 11, 51176], "temperature": 0.0, "avg_logprob": -0.21658666619976746, "compression_ratio": 1.6142857142857143, "no_speech_prob": 0.020289085805416107}, {"id": 1441, "seek": 46388, "start": 5200.12, "end": 5206.56, "text": " which are simulated network conditions within which you would execute an attack without", "tokens": [51176, 597, 366, 41713, 3209, 4487, 1951, 597, 291, 576, 14483, 364, 2690, 1553, 51498], "temperature": 0.0, "avg_logprob": -0.21658666619976746, "compression_ratio": 1.6142857142857143, "no_speech_prob": 0.020289085805416107}, {"id": 1442, "seek": 46388, "start": 5206.56, "end": 5208.56, "text": " too many controls, without defenses.", "tokens": [51498, 886, 867, 9003, 11, 1553, 35989, 13, 51598], "temperature": 0.0, "avg_logprob": -0.21658666619976746, "compression_ratio": 1.6142857142857143, "no_speech_prob": 0.020289085805416107}, {"id": 1443, "seek": 46388, "start": 5208.56, "end": 5212.44, "text": " You just need to execute a set of steps to accomplish some goals.", "tokens": [51598, 509, 445, 643, 281, 14483, 257, 992, 295, 4439, 281, 9021, 512, 5493, 13, 51792], "temperature": 0.0, "avg_logprob": -0.21658666619976746, "compression_ratio": 1.6142857142857143, "no_speech_prob": 0.020289085805416107}, {"id": 1444, "seek": 49244, "start": 5212.44, "end": 5217.36, "text": " And they have multiple levels of completion, more or less.", "tokens": [50364, 400, 436, 362, 3866, 4358, 295, 19372, 11, 544, 420, 1570, 13, 50610], "temperature": 0.0, "avg_logprob": -0.20514613657425612, "compression_ratio": 1.7704918032786885, "no_speech_prob": 0.0006552616832777858}, {"id": 1445, "seek": 49244, "start": 5217.36, "end": 5223.68, "text": " And so what they find is, perhaps I'm surprising, the models get better and better and better.", "tokens": [50610, 400, 370, 437, 436, 915, 307, 11, 4317, 286, 478, 8830, 11, 264, 5245, 483, 1101, 293, 1101, 293, 1101, 13, 50926], "temperature": 0.0, "avg_logprob": -0.20514613657425612, "compression_ratio": 1.7704918032786885, "no_speech_prob": 0.0006552616832777858}, {"id": 1446, "seek": 49244, "start": 5223.68, "end": 5230.6, "text": " And also, you get better and better and better as you get to more test-time compute.", "tokens": [50926, 400, 611, 11, 291, 483, 1101, 293, 1101, 293, 1101, 382, 291, 483, 281, 544, 1500, 12, 3766, 14722, 13, 51272], "temperature": 0.0, "avg_logprob": -0.20514613657425612, "compression_ratio": 1.7704918032786885, "no_speech_prob": 0.0006552616832777858}, {"id": 1447, "seek": 49244, "start": 5230.6, "end": 5236.72, "text": " So overall conclusion is, the capabilities, the cyber harm capabilities of the models", "tokens": [51272, 407, 4787, 10063, 307, 11, 264, 10862, 11, 264, 13411, 6491, 10862, 295, 264, 5245, 51578], "temperature": 0.0, "avg_logprob": -0.20514613657425612, "compression_ratio": 1.7704918032786885, "no_speech_prob": 0.0006552616832777858}, {"id": 1448, "seek": 51672, "start": 5236.72, "end": 5242.8, "text": " are improving rapidly and are starting to get to some fairly complex scenarios.", "tokens": [50364, 366, 11470, 12910, 293, 366, 2891, 281, 483, 281, 512, 6457, 3997, 15077, 13, 50668], "temperature": 0.0, "avg_logprob": -0.21018651987497622, "compression_ratio": 1.5384615384615385, "no_speech_prob": 0.16844195127487183}, {"id": 1449, "seek": 51672, "start": 5242.8, "end": 5246.72, "text": " First of all, a great paper by the UK AI Security Institute.", "tokens": [50668, 2386, 295, 439, 11, 257, 869, 3035, 538, 264, 7051, 7318, 11164, 9446, 13, 50864], "temperature": 0.0, "avg_logprob": -0.21018651987497622, "compression_ratio": 1.5384615384615385, "no_speech_prob": 0.16844195127487183}, {"id": 1450, "seek": 51672, "start": 5246.72, "end": 5251.84, "text": " So they've continued to pump out really high-quality stuff that feels very relevant to a whole", "tokens": [50864, 407, 436, 600, 7014, 281, 5889, 484, 534, 1090, 12, 11286, 1507, 300, 3417, 588, 7340, 281, 257, 1379, 51120], "temperature": 0.0, "avg_logprob": -0.21018651987497622, "compression_ratio": 1.5384615384615385, "no_speech_prob": 0.16844195127487183}, {"id": 1451, "seek": 51672, "start": 5251.84, "end": 5255.92, "text": " bunch of end-game scenarios around ASI, cyber is very important, at least the way I think", "tokens": [51120, 3840, 295, 917, 12, 15038, 15077, 926, 7469, 40, 11, 13411, 307, 588, 1021, 11, 412, 1935, 264, 636, 286, 519, 51324], "temperature": 0.0, "avg_logprob": -0.21018651987497622, "compression_ratio": 1.5384615384615385, "no_speech_prob": 0.16844195127487183}, {"id": 1452, "seek": 51672, "start": 5255.92, "end": 5258.76, "text": " about it, quite important in that picture.", "tokens": [51324, 466, 309, 11, 1596, 1021, 294, 300, 3036, 13, 51466], "temperature": 0.0, "avg_logprob": -0.21018651987497622, "compression_ratio": 1.5384615384615385, "no_speech_prob": 0.16844195127487183}, {"id": 1453, "seek": 51672, "start": 5258.76, "end": 5261.64, "text": " And so, well, two trends that they identify, right?", "tokens": [51466, 400, 370, 11, 731, 11, 732, 13892, 300, 436, 5876, 11, 558, 30, 51610], "temperature": 0.0, "avg_logprob": -0.21018651987497622, "compression_ratio": 1.5384615384615385, "no_speech_prob": 0.16844195127487183}, {"id": 1454, "seek": 54164, "start": 5261.64, "end": 5267.12, "text": " So the first is, each successive model generation outperforms its predecessor at fixed token", "tokens": [50364, 407, 264, 700, 307, 11, 1184, 48043, 2316, 5125, 484, 26765, 82, 1080, 34991, 412, 6806, 14862, 50638], "temperature": 0.0, "avg_logprob": -0.19664213198881883, "compression_ratio": 1.6440677966101696, "no_speech_prob": 0.06939338892698288}, {"id": 1455, "seek": 54164, "start": 5267.12, "end": 5268.12, "text": " budgets, right?", "tokens": [50638, 26708, 11, 558, 30, 50688], "temperature": 0.0, "avg_logprob": -0.19664213198881883, "compression_ratio": 1.6440677966101696, "no_speech_prob": 0.06939338892698288}, {"id": 1456, "seek": 54164, "start": 5268.12, "end": 5272.96, "text": " So for a given token budget, you're going to see better performance from Opus 4.6 than", "tokens": [50688, 407, 337, 257, 2212, 14862, 4706, 11, 291, 434, 516, 281, 536, 1101, 3389, 490, 12011, 301, 1017, 13, 21, 813, 50930], "temperature": 0.0, "avg_logprob": -0.19664213198881883, "compression_ratio": 1.6440677966101696, "no_speech_prob": 0.06939338892698288}, {"id": 1457, "seek": 54164, "start": 5272.96, "end": 5274.36, "text": " from Opus 4.5, right?", "tokens": [50930, 490, 12011, 301, 1017, 13, 20, 11, 558, 30, 51000], "temperature": 0.0, "avg_logprob": -0.19664213198881883, "compression_ratio": 1.6440677966101696, "no_speech_prob": 0.06939338892698288}, {"id": 1458, "seek": 54164, "start": 5274.36, "end": 5275.36, "text": " So they see that.", "tokens": [51000, 407, 436, 536, 300, 13, 51050], "temperature": 0.0, "avg_logprob": -0.19664213198881883, "compression_ratio": 1.6440677966101696, "no_speech_prob": 0.06939338892698288}, {"id": 1459, "seek": 54164, "start": 5275.36, "end": 5277.36, "text": " They also see performance improvements.", "tokens": [51050, 814, 611, 536, 3389, 13797, 13, 51150], "temperature": 0.0, "avg_logprob": -0.19664213198881883, "compression_ratio": 1.6440677966101696, "no_speech_prob": 0.06939338892698288}, {"id": 1460, "seek": 54164, "start": 5277.36, "end": 5282.0, "text": " Like we talked about this last week, pretty interesting, good paper, and we're threading", "tokens": [51150, 1743, 321, 2825, 466, 341, 1036, 1243, 11, 1238, 1880, 11, 665, 3035, 11, 293, 321, 434, 7207, 278, 51382], "temperature": 0.0, "avg_logprob": -0.19664213198881883, "compression_ratio": 1.6440677966101696, "no_speech_prob": 0.06939338892698288}, {"id": 1461, "seek": 54164, "start": 5282.0, "end": 5283.0, "text": " the descriptions.", "tokens": [51382, 264, 24406, 13, 51432], "temperature": 0.0, "avg_logprob": -0.19664213198881883, "compression_ratio": 1.6440677966101696, "no_speech_prob": 0.06939338892698288}, {"id": 1462, "seek": 54164, "start": 5283.0, "end": 5284.0, "text": " I won't do it here.", "tokens": [51432, 286, 1582, 380, 360, 309, 510, 13, 51482], "temperature": 0.0, "avg_logprob": -0.19664213198881883, "compression_ratio": 1.6440677966101696, "no_speech_prob": 0.06939338892698288}, {"id": 1463, "seek": 54164, "start": 5284.0, "end": 5287.6, "text": " But if you're curious of their cyber ranges, because this does go well beyond this", "tokens": [51482, 583, 498, 291, 434, 6369, 295, 641, 13411, 22526, 11, 570, 341, 775, 352, 731, 4399, 341, 51662], "temperature": 0.0, "avg_logprob": -0.19664213198881883, "compression_ratio": 1.6440677966101696, "no_speech_prob": 0.06939338892698288}, {"id": 1464, "seek": 56760, "start": 5287.6, "end": 5292.16, "text": " or traditional capture of the flag stuff that we've seen in sort of offensive cyber model", "tokens": [50364, 420, 5164, 7983, 295, 264, 7166, 1507, 300, 321, 600, 1612, 294, 1333, 295, 15710, 13411, 2316, 50592], "temperature": 0.0, "avg_logprob": -0.20097593923585605, "compression_ratio": 1.5609756097560976, "no_speech_prob": 0.008210964500904083}, {"id": 1465, "seek": 56760, "start": 5292.16, "end": 5296.76, "text": " eVALs, this is really like simulating a whole corporate network, a whole corporate environment,", "tokens": [50592, 308, 53, 3427, 82, 11, 341, 307, 534, 411, 1034, 12162, 257, 1379, 10896, 3209, 11, 257, 1379, 10896, 2823, 11, 50822], "temperature": 0.0, "avg_logprob": -0.20097593923585605, "compression_ratio": 1.5609756097560976, "no_speech_prob": 0.008210964500904083}, {"id": 1466, "seek": 56760, "start": 5296.76, "end": 5301.08, "text": " which is much closer to what actually makes a difference in the real world.", "tokens": [50822, 597, 307, 709, 4966, 281, 437, 767, 1669, 257, 2649, 294, 264, 957, 1002, 13, 51038], "temperature": 0.0, "avg_logprob": -0.20097593923585605, "compression_ratio": 1.5609756097560976, "no_speech_prob": 0.008210964500904083}, {"id": 1467, "seek": 56760, "start": 5301.08, "end": 5308.0, "text": " Next up, Eval Awareness in Cloud Opus 4.6's browser comp performances.", "tokens": [51038, 3087, 493, 11, 462, 3337, 43949, 1287, 294, 8061, 12011, 301, 1017, 13, 21, 311, 11185, 715, 16087, 13, 51384], "temperature": 0.0, "avg_logprob": -0.20097593923585605, "compression_ratio": 1.5609756097560976, "no_speech_prob": 0.008210964500904083}, {"id": 1468, "seek": 56760, "start": 5308.0, "end": 5312.68, "text": " This is a report from Unphropic, and this has been something that's been discussed a", "tokens": [51384, 639, 307, 257, 2275, 490, 1156, 950, 39173, 11, 293, 341, 575, 668, 746, 300, 311, 668, 7152, 257, 51618], "temperature": 0.0, "avg_logprob": -0.20097593923585605, "compression_ratio": 1.5609756097560976, "no_speech_prob": 0.008210964500904083}, {"id": 1469, "seek": 56760, "start": 5312.68, "end": 5314.8, "text": " little bit kind of on and off.", "tokens": [51618, 707, 857, 733, 295, 322, 293, 766, 13, 51724], "temperature": 0.0, "avg_logprob": -0.20097593923585605, "compression_ratio": 1.5609756097560976, "no_speech_prob": 0.008210964500904083}, {"id": 1470, "seek": 59480, "start": 5314.8, "end": 5321.8, "text": " So the Eval Awareness you referenced is one of the topics to be aware of with regards", "tokens": [50364, 407, 264, 462, 3337, 43949, 1287, 291, 32734, 307, 472, 295, 264, 8378, 281, 312, 3650, 295, 365, 14258, 50714], "temperature": 0.0, "avg_logprob": -0.16184755700065734, "compression_ratio": 1.610878661087866, "no_speech_prob": 0.09901582449674606}, {"id": 1471, "seek": 59480, "start": 5321.8, "end": 5327.8, "text": " to alignment safety, basically, if you're checking model for doing the right thing or not,", "tokens": [50714, 281, 18515, 4514, 11, 1936, 11, 498, 291, 434, 8568, 2316, 337, 884, 264, 558, 551, 420, 406, 11, 51014], "temperature": 0.0, "avg_logprob": -0.16184755700065734, "compression_ratio": 1.610878661087866, "no_speech_prob": 0.09901582449674606}, {"id": 1472, "seek": 59480, "start": 5327.8, "end": 5333.68, "text": " and it's aware of that, then it might behave differently when you're valuing it versus", "tokens": [51014, 293, 309, 311, 3650, 295, 300, 11, 550, 309, 1062, 15158, 7614, 562, 291, 434, 7332, 278, 309, 5717, 51308], "temperature": 0.0, "avg_logprob": -0.16184755700065734, "compression_ratio": 1.610878661087866, "no_speech_prob": 0.09901582449674606}, {"id": 1473, "seek": 59480, "start": 5333.68, "end": 5336.04, "text": " when it's actually out in a wild.", "tokens": [51308, 562, 309, 311, 767, 484, 294, 257, 4868, 13, 51426], "temperature": 0.0, "avg_logprob": -0.16184755700065734, "compression_ratio": 1.610878661087866, "no_speech_prob": 0.09901582449674606}, {"id": 1474, "seek": 59480, "start": 5336.04, "end": 5344.76, "text": " And so here they look at nine cases where leaked answers appeared in public web content", "tokens": [51426, 400, 370, 510, 436, 574, 412, 4949, 3331, 689, 31779, 6338, 8516, 294, 1908, 3670, 2701, 51862], "temperature": 0.0, "avg_logprob": -0.16184755700065734, "compression_ratio": 1.610878661087866, "no_speech_prob": 0.09901582449674606}, {"id": 1475, "seek": 62476, "start": 5344.76, "end": 5355.24, "text": " and found, among a 1,266 problems on browser comp, the model found 11 cases where answers", "tokens": [50364, 293, 1352, 11, 3654, 257, 502, 11, 10880, 21, 2740, 322, 11185, 715, 11, 264, 2316, 1352, 2975, 3331, 689, 6338, 50888], "temperature": 0.0, "avg_logprob": -0.25580659161011376, "compression_ratio": 1.4624277456647399, "no_speech_prob": 0.0032353545539081097}, {"id": 1476, "seek": 62476, "start": 5355.24, "end": 5363.0, "text": " came from benchmarks, materials, and yeah, basically often, or in some cases wound up reasoning", "tokens": [50888, 1361, 490, 43751, 11, 5319, 11, 293, 1338, 11, 1936, 2049, 11, 420, 294, 512, 3331, 10999, 493, 21577, 51276], "temperature": 0.0, "avg_logprob": -0.25580659161011376, "compression_ratio": 1.4624277456647399, "no_speech_prob": 0.0032353545539081097}, {"id": 1477, "seek": 62476, "start": 5363.0, "end": 5368.88, "text": " that the questions that it was dealing with were evaluation shaped.", "tokens": [51276, 300, 264, 1651, 300, 309, 390, 6260, 365, 645, 13344, 13475, 13, 51570], "temperature": 0.0, "avg_logprob": -0.25580659161011376, "compression_ratio": 1.4624277456647399, "no_speech_prob": 0.0032353545539081097}, {"id": 1478, "seek": 64888, "start": 5368.88, "end": 5375.28, "text": " And then having concluded that this is likely an evaluation, it looked up a benchmark.", "tokens": [50364, 400, 550, 1419, 22960, 300, 341, 307, 3700, 364, 13344, 11, 309, 2956, 493, 257, 18927, 13, 50684], "temperature": 0.0, "avg_logprob": -0.20442848765488827, "compression_ratio": 1.6804511278195489, "no_speech_prob": 0.02230248786509037}, {"id": 1479, "seek": 64888, "start": 5375.28, "end": 5380.68, "text": " So these are some examples where this can happen at least in practice already.", "tokens": [50684, 407, 613, 366, 512, 5110, 689, 341, 393, 1051, 412, 1935, 294, 3124, 1217, 13, 50954], "temperature": 0.0, "avg_logprob": -0.20442848765488827, "compression_ratio": 1.6804511278195489, "no_speech_prob": 0.02230248786509037}, {"id": 1480, "seek": 64888, "start": 5380.68, "end": 5381.68, "text": " Yeah.", "tokens": [50954, 865, 13, 51004], "temperature": 0.0, "avg_logprob": -0.20442848765488827, "compression_ratio": 1.6804511278195489, "no_speech_prob": 0.02230248786509037}, {"id": 1481, "seek": 64888, "start": 5381.68, "end": 5386.2, "text": " And the cases where you tend to see this happen according to the report is like, first", "tokens": [51004, 400, 264, 3331, 689, 291, 3928, 281, 536, 341, 1051, 4650, 281, 264, 2275, 307, 411, 11, 700, 51230], "temperature": 0.0, "avg_logprob": -0.20442848765488827, "compression_ratio": 1.6804511278195489, "no_speech_prob": 0.02230248786509037}, {"id": 1482, "seek": 64888, "start": 5386.2, "end": 5392.08, "text": " you'll see the model try like hundreds of normal search attempts and then conclude that", "tokens": [51230, 291, 603, 536, 264, 2316, 853, 411, 6779, 295, 2710, 3164, 15257, 293, 550, 16886, 300, 51524], "temperature": 0.0, "avg_logprob": -0.20442848765488827, "compression_ratio": 1.6804511278195489, "no_speech_prob": 0.02230248786509037}, {"id": 1483, "seek": 64888, "start": 5392.08, "end": 5397.24, "text": " the question just felt artificially constructed after not succeeding through those earlier", "tokens": [51524, 264, 1168, 445, 2762, 39905, 2270, 17083, 934, 406, 47912, 807, 729, 3071, 51782], "temperature": 0.0, "avg_logprob": -0.20442848765488827, "compression_ratio": 1.6804511278195489, "no_speech_prob": 0.02230248786509037}, {"id": 1484, "seek": 64888, "start": 5397.24, "end": 5398.24, "text": " attempts.", "tokens": [51782, 15257, 13, 51832], "temperature": 0.0, "avg_logprob": -0.20442848765488827, "compression_ratio": 1.6804511278195489, "no_speech_prob": 0.02230248786509037}, {"id": 1485, "seek": 67824, "start": 5398.24, "end": 5402.04, "text": " And it'll start to name and look for specific benchmarks.", "tokens": [50364, 400, 309, 603, 722, 281, 1315, 293, 574, 337, 2685, 43751, 13, 50554], "temperature": 0.0, "avg_logprob": -0.24617124825227457, "compression_ratio": 1.6824817518248176, "no_speech_prob": 0.01163510698825121}, {"id": 1486, "seek": 67824, "start": 5402.04, "end": 5406.12, "text": " So it'll be like, you'll see it look up, you know, Gaia, browser comp, or browser comp", "tokens": [50554, 407, 309, 603, 312, 411, 11, 291, 603, 536, 309, 574, 493, 11, 291, 458, 11, 10384, 654, 11, 11185, 715, 11, 420, 11185, 715, 50758], "temperature": 0.0, "avg_logprob": -0.24617124825227457, "compression_ratio": 1.6824817518248176, "no_speech_prob": 0.01163510698825121}, {"id": 1487, "seek": 67824, "start": 5406.12, "end": 5411.84, "text": " or like whatever, a whole bunch of different e-vals, then it'll find whichever the e-val is", "tokens": [50758, 420, 411, 2035, 11, 257, 1379, 3840, 295, 819, 308, 12, 85, 1124, 11, 550, 309, 603, 915, 24123, 264, 308, 12, 85, 304, 307, 51044], "temperature": 0.0, "avg_logprob": -0.24617124825227457, "compression_ratio": 1.6824817518248176, "no_speech_prob": 0.01163510698825121}, {"id": 1488, "seek": 67824, "start": 5411.84, "end": 5416.24, "text": " and it will typically look at the implementation like on GitHub or whatever and go from there.", "tokens": [51044, 293, 309, 486, 5850, 574, 412, 264, 11420, 411, 322, 23331, 420, 2035, 293, 352, 490, 456, 13, 51264], "temperature": 0.0, "avg_logprob": -0.24617124825227457, "compression_ratio": 1.6824817518248176, "no_speech_prob": 0.01163510698825121}, {"id": 1489, "seek": 67824, "start": 5416.24, "end": 5418.84, "text": " And so that's how a lot of this happened.", "tokens": [51264, 400, 370, 300, 311, 577, 257, 688, 295, 341, 2011, 13, 51394], "temperature": 0.0, "avg_logprob": -0.24617124825227457, "compression_ratio": 1.6824817518248176, "no_speech_prob": 0.01163510698825121}, {"id": 1490, "seek": 67824, "start": 5418.84, "end": 5422.6, "text": " One other weird thing that was highlighted on the inner web this week was that you have", "tokens": [51394, 1485, 661, 3657, 551, 300, 390, 17173, 322, 264, 7284, 3670, 341, 1243, 390, 300, 291, 362, 51582], "temperature": 0.0, "avg_logprob": -0.24617124825227457, "compression_ratio": 1.6824817518248176, "no_speech_prob": 0.01163510698825121}, {"id": 1491, "seek": 70260, "start": 5422.6, "end": 5429.4, "text": " some e-commerce sites that will auto generate these persistent pages or pages that stay", "tokens": [50364, 512, 308, 12, 26926, 7533, 300, 486, 8399, 8460, 613, 24315, 7183, 420, 7183, 300, 1754, 50704], "temperature": 0.0, "avg_logprob": -0.19337286364357426, "compression_ratio": 1.7692307692307692, "no_speech_prob": 0.032912589609622955}, {"id": 1492, "seek": 70260, "start": 5429.4, "end": 5435.28, "text": " up like sub pages that log the history of search queries on their website, even when", "tokens": [50704, 493, 411, 1422, 7183, 300, 3565, 264, 2503, 295, 3164, 24109, 322, 641, 3144, 11, 754, 562, 50998], "temperature": 0.0, "avg_logprob": -0.19337286364357426, "compression_ratio": 1.7692307692307692, "no_speech_prob": 0.032912589609622955}, {"id": 1493, "seek": 70260, "start": 5435.28, "end": 5436.92, "text": " there are no matching products.", "tokens": [50998, 456, 366, 572, 14324, 3383, 13, 51080], "temperature": 0.0, "avg_logprob": -0.19337286364357426, "compression_ratio": 1.7692307692307692, "no_speech_prob": 0.032912589609622955}, {"id": 1494, "seek": 70260, "start": 5436.92, "end": 5441.96, "text": " So if you look at like purple turtles on besock on like eBay or whatever e-commerce site,", "tokens": [51080, 407, 498, 291, 574, 412, 411, 9656, 32422, 322, 4097, 1560, 322, 411, 33803, 420, 2035, 308, 12, 26926, 3621, 11, 51332], "temperature": 0.0, "avg_logprob": -0.19337286364357426, "compression_ratio": 1.7692307692307692, "no_speech_prob": 0.032912589609622955}, {"id": 1495, "seek": 70260, "start": 5441.96, "end": 5446.4, "text": " that will lead to the creation of a persistent page somewhere on the internet.", "tokens": [51332, 300, 486, 1477, 281, 264, 8016, 295, 257, 24315, 3028, 4079, 322, 264, 4705, 13, 51554], "temperature": 0.0, "avg_logprob": -0.19337286364357426, "compression_ratio": 1.7692307692307692, "no_speech_prob": 0.032912589609622955}, {"id": 1496, "seek": 70260, "start": 5446.4, "end": 5452.28, "text": " And the effect is that every agent that runs browser comp slowly caches its queries as", "tokens": [51554, 400, 264, 1802, 307, 300, 633, 9461, 300, 6676, 11185, 715, 5692, 269, 13272, 1080, 24109, 382, 51848], "temperature": 0.0, "avg_logprob": -0.19337286364357426, "compression_ratio": 1.7692307692307692, "no_speech_prob": 0.032912589609622955}, {"id": 1497, "seek": 73228, "start": 5452.28, "end": 5454.48, "text": " these permanent indexed pages.", "tokens": [50364, 613, 10996, 8186, 292, 7183, 13, 50474], "temperature": 0.0, "avg_logprob": -0.12207426457178025, "compression_ratio": 1.7163120567375887, "no_speech_prob": 0.0008168316562660038}, {"id": 1498, "seek": 73228, "start": 5454.48, "end": 5460.4, "text": " So over time, like agents from different evaluation runs were inadvertently leaving traces", "tokens": [50474, 407, 670, 565, 11, 411, 12554, 490, 819, 13344, 6676, 645, 49152, 2276, 5012, 26076, 50770], "temperature": 0.0, "avg_logprob": -0.12207426457178025, "compression_ratio": 1.7163120567375887, "no_speech_prob": 0.0008168316562660038}, {"id": 1499, "seek": 73228, "start": 5460.4, "end": 5466.48, "text": " of their search strategies that could be seen by future agents, which is a one thing that's", "tokens": [50770, 295, 641, 3164, 9029, 300, 727, 312, 1612, 538, 2027, 12554, 11, 597, 307, 257, 472, 551, 300, 311, 51074], "temperature": 0.0, "avg_logprob": -0.12207426457178025, "compression_ratio": 1.7163120567375887, "no_speech_prob": 0.0008168316562660038}, {"id": 1500, "seek": 73228, "start": 5466.48, "end": 5470.96, "text": " reported in the context of this paper, maybe the broader implication is this is an interesting", "tokens": [51074, 7055, 294, 264, 4319, 295, 341, 3035, 11, 1310, 264, 13227, 37814, 307, 341, 307, 364, 1880, 51298], "temperature": 0.0, "avg_logprob": -0.12207426457178025, "compression_ratio": 1.7163120567375887, "no_speech_prob": 0.0008168316562660038}, {"id": 1501, "seek": 73228, "start": 5470.96, "end": 5476.36, "text": " way for agents to have a kind of persistent memory that transcends their instantiation", "tokens": [51298, 636, 337, 12554, 281, 362, 257, 733, 295, 24315, 4675, 300, 43800, 2581, 641, 9836, 6642, 51568], "temperature": 0.0, "avg_logprob": -0.12207426457178025, "compression_ratio": 1.7163120567375887, "no_speech_prob": 0.0008168316562660038}, {"id": 1502, "seek": 73228, "start": 5476.36, "end": 5481.36, "text": " in the moment, which you shouldn't over focus on this particular mechanism as a way that", "tokens": [51568, 294, 264, 1623, 11, 597, 291, 4659, 380, 670, 1879, 322, 341, 1729, 7513, 382, 257, 636, 300, 51818], "temperature": 0.0, "avg_logprob": -0.12207426457178025, "compression_ratio": 1.7163120567375887, "no_speech_prob": 0.0008168316562660038}, {"id": 1503, "seek": 76136, "start": 5481.36, "end": 5487.52, "text": " agents could either communicate with other agents or with their future selves in illicit", "tokens": [50364, 12554, 727, 2139, 7890, 365, 661, 12554, 420, 365, 641, 2027, 41900, 294, 3171, 8876, 50672], "temperature": 0.0, "avg_logprob": -0.16755429213329898, "compression_ratio": 1.6402640264026402, "no_speech_prob": 0.007812834344804287}, {"id": 1504, "seek": 76136, "start": 5487.52, "end": 5488.52, "text": " ways.", "tokens": [50672, 2098, 13, 50722], "temperature": 0.0, "avg_logprob": -0.16755429213329898, "compression_ratio": 1.6402640264026402, "no_speech_prob": 0.007812834344804287}, {"id": 1505, "seek": 76136, "start": 5488.52, "end": 5492.04, "text": " It's more saying that hey, the internet is a really complicated messy place.", "tokens": [50722, 467, 311, 544, 1566, 300, 4177, 11, 264, 4705, 307, 257, 534, 6179, 16191, 1081, 13, 50898], "temperature": 0.0, "avg_logprob": -0.16755429213329898, "compression_ratio": 1.6402640264026402, "no_speech_prob": 0.007812834344804287}, {"id": 1506, "seek": 76136, "start": 5492.04, "end": 5496.36, "text": " There are easy ways to hide information that can't necessarily be found easily.", "tokens": [50898, 821, 366, 1858, 2098, 281, 6479, 1589, 300, 393, 380, 4725, 312, 1352, 3612, 13, 51114], "temperature": 0.0, "avg_logprob": -0.16755429213329898, "compression_ratio": 1.6402640264026402, "no_speech_prob": 0.007812834344804287}, {"id": 1507, "seek": 76136, "start": 5496.36, "end": 5498.96, "text": " And that's a really important thing from a safety standpoint.", "tokens": [51114, 400, 300, 311, 257, 534, 1021, 551, 490, 257, 4514, 15827, 13, 51244], "temperature": 0.0, "avg_logprob": -0.16755429213329898, "compression_ratio": 1.6402640264026402, "no_speech_prob": 0.007812834344804287}, {"id": 1508, "seek": 76136, "start": 5498.96, "end": 5505.68, "text": " Notable Anthropic says they don't see Opus 4.6's behavior here as being a alignment failure", "tokens": [51244, 1726, 712, 12727, 39173, 1619, 436, 500, 380, 536, 12011, 301, 1017, 13, 21, 311, 5223, 510, 382, 885, 257, 18515, 7763, 51580], "temperature": 0.0, "avg_logprob": -0.16755429213329898, "compression_ratio": 1.6402640264026402, "no_speech_prob": 0.007812834344804287}, {"id": 1509, "seek": 76136, "start": 5505.68, "end": 5510.8, "text": " because the model wasn't told to restrict its searches in any way, just to find the answer,", "tokens": [51580, 570, 264, 2316, 2067, 380, 1907, 281, 7694, 1080, 26701, 294, 604, 636, 11, 445, 281, 915, 264, 1867, 11, 51836], "temperature": 0.0, "avg_logprob": -0.16755429213329898, "compression_ratio": 1.6402640264026402, "no_speech_prob": 0.007812834344804287}, {"id": 1510, "seek": 79080, "start": 5510.8, "end": 5515.4400000000005, "text": " it's like fair enough, but also those of us who come from the world of like AI alignment", "tokens": [50364, 309, 311, 411, 3143, 1547, 11, 457, 611, 729, 295, 505, 567, 808, 490, 264, 1002, 295, 411, 7318, 18515, 50596], "temperature": 0.0, "avg_logprob": -0.1593540340393513, "compression_ratio": 1.6029411764705883, "no_speech_prob": 0.01757756434381008}, {"id": 1511, "seek": 79080, "start": 5515.4400000000005, "end": 5522.28, "text": " from back in the day, there is a thing called coherent extrapolated volition, C-E-V, basically", "tokens": [50596, 490, 646, 294, 264, 786, 11, 456, 307, 257, 551, 1219, 36239, 48224, 770, 1996, 849, 11, 383, 12, 36, 12, 53, 11, 1936, 50938], "temperature": 0.0, "avg_logprob": -0.1593540340393513, "compression_ratio": 1.6029411764705883, "no_speech_prob": 0.01757756434381008}, {"id": 1512, "seek": 79080, "start": 5522.28, "end": 5528.84, "text": " just like how well you could reasonably predict what someone would have wanted you to do", "tokens": [50938, 445, 411, 577, 731, 291, 727, 23551, 6069, 437, 1580, 576, 362, 1415, 291, 281, 360, 51266], "temperature": 0.0, "avg_logprob": -0.1593540340393513, "compression_ratio": 1.6029411764705883, "no_speech_prob": 0.01757756434381008}, {"id": 1513, "seek": 79080, "start": 5528.84, "end": 5530.96, "text": " in context from the prompt.", "tokens": [51266, 294, 4319, 490, 264, 12391, 13, 51372], "temperature": 0.0, "avg_logprob": -0.1593540340393513, "compression_ratio": 1.6029411764705883, "no_speech_prob": 0.01757756434381008}, {"id": 1514, "seek": 79080, "start": 5530.96, "end": 5534.68, "text": " And I think this qualifies as a failure of that.", "tokens": [51372, 400, 286, 519, 341, 4101, 11221, 382, 257, 7763, 295, 300, 13, 51558], "temperature": 0.0, "avg_logprob": -0.1593540340393513, "compression_ratio": 1.6029411764705883, "no_speech_prob": 0.01757756434381008}, {"id": 1515, "seek": 79080, "start": 5534.68, "end": 5540.360000000001, "text": " You would probably assume that you don't need to be told not to try to game the system", "tokens": [51558, 509, 576, 1391, 6552, 300, 291, 500, 380, 643, 281, 312, 1907, 406, 281, 853, 281, 1216, 264, 1185, 51842], "temperature": 0.0, "avg_logprob": -0.1593540340393513, "compression_ratio": 1.6029411764705883, "no_speech_prob": 0.01757756434381008}, {"id": 1516, "seek": 82036, "start": 5540.36, "end": 5541.36, "text": " in this way.", "tokens": [50364, 294, 341, 636, 13, 50414], "temperature": 0.0, "avg_logprob": -0.22708608493020263, "compression_ratio": 1.8161290322580645, "no_speech_prob": 0.08380743116140366}, {"id": 1517, "seek": 82036, "start": 5541.36, "end": 5543.56, "text": " This feels like gaming, like hacking the benchmark.", "tokens": [50414, 639, 3417, 411, 9703, 11, 411, 31422, 264, 18927, 13, 50524], "temperature": 0.0, "avg_logprob": -0.22708608493020263, "compression_ratio": 1.8161290322580645, "no_speech_prob": 0.08380743116140366}, {"id": 1518, "seek": 82036, "start": 5543.56, "end": 5546.52, "text": " I think that this is a failure of C-E-V and therefore a failure of alignment.", "tokens": [50524, 286, 519, 300, 341, 307, 257, 7763, 295, 383, 12, 36, 12, 53, 293, 4412, 257, 7763, 295, 18515, 13, 50672], "temperature": 0.0, "avg_logprob": -0.22708608493020263, "compression_ratio": 1.8161290322580645, "no_speech_prob": 0.08380743116140366}, {"id": 1519, "seek": 82036, "start": 5546.52, "end": 5548.4, "text": " That would be my personal takeoff.", "tokens": [50672, 663, 576, 312, 452, 2973, 747, 4506, 13, 50766], "temperature": 0.0, "avg_logprob": -0.22708608493020263, "compression_ratio": 1.8161290322580645, "no_speech_prob": 0.08380743116140366}, {"id": 1520, "seek": 82036, "start": 5548.4, "end": 5553.96, "text": " But that's not Anthropics take and that's consistent, they have their own way of thinking", "tokens": [50766, 583, 300, 311, 406, 12727, 1513, 1167, 747, 293, 300, 311, 8398, 11, 436, 362, 641, 1065, 636, 295, 1953, 51044], "temperature": 0.0, "avg_logprob": -0.22708608493020263, "compression_ratio": 1.8161290322580645, "no_speech_prob": 0.08380743116140366}, {"id": 1521, "seek": 82036, "start": 5553.96, "end": 5554.96, "text": " about this.", "tokens": [51044, 466, 341, 13, 51094], "temperature": 0.0, "avg_logprob": -0.22708608493020263, "compression_ratio": 1.8161290322580645, "no_speech_prob": 0.08380743116140366}, {"id": 1522, "seek": 82036, "start": 5554.96, "end": 5557.4, "text": " But I think you could view it as still being a failure of alignment.", "tokens": [51094, 583, 286, 519, 291, 727, 1910, 309, 382, 920, 885, 257, 7763, 295, 18515, 13, 51216], "temperature": 0.0, "avg_logprob": -0.22708608493020263, "compression_ratio": 1.8161290322580645, "no_speech_prob": 0.08380743116140366}, {"id": 1523, "seek": 82036, "start": 5557.4, "end": 5560.0, "text": " Yeah, I think it's an implicit kind of assumption, right?", "tokens": [51216, 865, 11, 286, 519, 309, 311, 364, 26947, 733, 295, 15302, 11, 558, 30, 51346], "temperature": 0.0, "avg_logprob": -0.22708608493020263, "compression_ratio": 1.8161290322580645, "no_speech_prob": 0.08380743116140366}, {"id": 1524, "seek": 82036, "start": 5560.0, "end": 5564.68, "text": " If you're being tested and you're like, oh, I'm probably being tested, then your answer", "tokens": [51346, 759, 291, 434, 885, 8246, 293, 291, 434, 411, 11, 1954, 11, 286, 478, 1391, 885, 8246, 11, 550, 428, 1867, 51580], "temperature": 0.0, "avg_logprob": -0.22708608493020263, "compression_ratio": 1.8161290322580645, "no_speech_prob": 0.08380743116140366}, {"id": 1525, "seek": 82036, "start": 5564.68, "end": 5567.6, "text": " should not be, oh, let me look up answers to the test, right?", "tokens": [51580, 820, 406, 312, 11, 1954, 11, 718, 385, 574, 493, 6338, 281, 264, 1500, 11, 558, 30, 51726], "temperature": 0.0, "avg_logprob": -0.22708608493020263, "compression_ratio": 1.8161290322580645, "no_speech_prob": 0.08380743116140366}, {"id": 1526, "seek": 82036, "start": 5567.6, "end": 5568.6, "text": " Right.", "tokens": [51726, 1779, 13, 51776], "temperature": 0.0, "avg_logprob": -0.22708608493020263, "compression_ratio": 1.8161290322580645, "no_speech_prob": 0.08380743116140366}, {"id": 1527, "seek": 84860, "start": 5568.68, "end": 5570.68, "text": " I'm pretty into it, I think.", "tokens": [50368, 286, 478, 1238, 666, 309, 11, 286, 519, 13, 50468], "temperature": 0.0, "avg_logprob": -0.30852912494114465, "compression_ratio": 1.6097560975609757, "no_speech_prob": 0.32531893253326416}, {"id": 1528, "seek": 84860, "start": 5570.68, "end": 5576.56, "text": " Next another release from Anthropic, they have released Bloom, an open source, agentic", "tokens": [50468, 3087, 1071, 4374, 490, 12727, 1513, 299, 11, 436, 362, 4736, 25927, 11, 364, 1269, 4009, 11, 623, 317, 299, 50762], "temperature": 0.0, "avg_logprob": -0.30852912494114465, "compression_ratio": 1.6097560975609757, "no_speech_prob": 0.32531893253326416}, {"id": 1529, "seek": 84860, "start": 5576.56, "end": 5582.64, "text": " framework for generating targeted behavioral evaluations of AI models.", "tokens": [50762, 8388, 337, 17746, 15045, 19124, 43085, 295, 7318, 5245, 13, 51066], "temperature": 0.0, "avg_logprob": -0.30852912494114465, "compression_ratio": 1.6097560975609757, "no_speech_prob": 0.32531893253326416}, {"id": 1530, "seek": 84860, "start": 5582.64, "end": 5586.32, "text": " So it just is, it's automating behavioral evaluation.", "tokens": [51066, 407, 309, 445, 307, 11, 309, 311, 3553, 990, 19124, 13344, 13, 51250], "temperature": 0.0, "avg_logprob": -0.30852912494114465, "compression_ratio": 1.6097560975609757, "no_speech_prob": 0.32531893253326416}, {"id": 1531, "seek": 84860, "start": 5586.32, "end": 5592.72, "text": " You tell it, what source of behaviors you want to verify the model exhibits.", "tokens": [51250, 509, 980, 309, 11, 437, 4009, 295, 15501, 291, 528, 281, 16888, 264, 2316, 39205, 13, 51570], "temperature": 0.0, "avg_logprob": -0.30852912494114465, "compression_ratio": 1.6097560975609757, "no_speech_prob": 0.32531893253326416}, {"id": 1532, "seek": 84860, "start": 5592.72, "end": 5594.56, "text": " So like, is it generally honest?", "tokens": [51570, 407, 411, 11, 307, 309, 5101, 3245, 30, 51662], "temperature": 0.0, "avg_logprob": -0.30852912494114465, "compression_ratio": 1.6097560975609757, "no_speech_prob": 0.32531893253326416}, {"id": 1533, "seek": 84860, "start": 5594.56, "end": 5598.0, "text": " Is it going to call me out on BS if I ask it?", "tokens": [51662, 1119, 309, 516, 281, 818, 385, 484, 322, 27253, 498, 286, 1029, 309, 30, 51834], "temperature": 0.0, "avg_logprob": -0.30852912494114465, "compression_ratio": 1.6097560975609757, "no_speech_prob": 0.32531893253326416}, {"id": 1534, "seek": 87800, "start": 5598.0, "end": 5601.68, "text": " Anything really that you want to test for.", "tokens": [50364, 11998, 534, 300, 291, 528, 281, 1500, 337, 13, 50548], "temperature": 0.0, "avg_logprob": -0.2165644033388658, "compression_ratio": 1.635135135135135, "no_speech_prob": 0.010857597924768925}, {"id": 1535, "seek": 87800, "start": 5601.68, "end": 5606.64, "text": " And in the past, you would, you know, have to come up with a suit of tasks and a set", "tokens": [50548, 400, 294, 264, 1791, 11, 291, 576, 11, 291, 458, 11, 362, 281, 808, 493, 365, 257, 5722, 295, 9608, 293, 257, 992, 50796], "temperature": 0.0, "avg_logprob": -0.2165644033388658, "compression_ratio": 1.635135135135135, "no_speech_prob": 0.010857597924768925}, {"id": 1536, "seek": 87800, "start": 5606.64, "end": 5609.6, "text": " of prompts that you would write a judge.", "tokens": [50796, 295, 41095, 300, 291, 576, 2464, 257, 6995, 13, 50944], "temperature": 0.0, "avg_logprob": -0.2165644033388658, "compression_ratio": 1.635135135135135, "no_speech_prob": 0.010857597924768925}, {"id": 1537, "seek": 87800, "start": 5609.6, "end": 5613.4400000000005, "text": " More or less, this presents a framework where it does all of it for you.", "tokens": [50944, 5048, 420, 1570, 11, 341, 13533, 257, 8388, 689, 309, 775, 439, 295, 309, 337, 291, 13, 51136], "temperature": 0.0, "avg_logprob": -0.2165644033388658, "compression_ratio": 1.635135135135135, "no_speech_prob": 0.010857597924768925}, {"id": 1538, "seek": 87800, "start": 5613.4400000000005, "end": 5619.08, "text": " You give it the thing you care about and it generates the scenarios, runs the model, evaluates", "tokens": [51136, 509, 976, 309, 264, 551, 291, 1127, 466, 293, 309, 23815, 264, 15077, 11, 6676, 264, 2316, 11, 6133, 1024, 51418], "temperature": 0.0, "avg_logprob": -0.2165644033388658, "compression_ratio": 1.635135135135135, "no_speech_prob": 0.010857597924768925}, {"id": 1539, "seek": 87800, "start": 5619.08, "end": 5621.6, "text": " it and gives you a report.", "tokens": [51418, 309, 293, 2709, 291, 257, 2275, 13, 51544], "temperature": 0.0, "avg_logprob": -0.2165644033388658, "compression_ratio": 1.635135135135135, "no_speech_prob": 0.010857597924768925}, {"id": 1540, "seek": 90160, "start": 5621.6, "end": 5628.08, "text": " And Anthropic, give us some examples on four different types of things, delusional", "tokens": [50364, 400, 12727, 1513, 299, 11, 976, 505, 512, 5110, 322, 1451, 819, 3467, 295, 721, 11, 1103, 301, 1966, 50688], "temperature": 0.0, "avg_logprob": -0.2656251487165394, "compression_ratio": 1.625514403292181, "no_speech_prob": 0.012780281715095043}, {"id": 1541, "seek": 90160, "start": 5628.08, "end": 5634.16, "text": " sick offence, self-preservation, self-perferential bias, things like that.", "tokens": [50688, 4998, 766, 655, 11, 2698, 12, 14508, 6864, 11, 2698, 12, 610, 612, 2549, 12577, 11, 721, 411, 300, 13, 50992], "temperature": 0.0, "avg_logprob": -0.2656251487165394, "compression_ratio": 1.625514403292181, "no_speech_prob": 0.012780281715095043}, {"id": 1542, "seek": 90160, "start": 5634.16, "end": 5639.12, "text": " And showcase how this framework can evaluate that and do well.", "tokens": [50992, 400, 20388, 577, 341, 8388, 393, 13059, 300, 293, 360, 731, 13, 51240], "temperature": 0.0, "avg_logprob": -0.2656251487165394, "compression_ratio": 1.625514403292181, "no_speech_prob": 0.012780281715095043}, {"id": 1543, "seek": 90160, "start": 5639.12, "end": 5640.12, "text": " Yeah.", "tokens": [51240, 865, 13, 51290], "temperature": 0.0, "avg_logprob": -0.2656251487165394, "compression_ratio": 1.625514403292181, "no_speech_prob": 0.012780281715095043}, {"id": 1544, "seek": 90160, "start": 5640.12, "end": 5644.6, "text": " And this is Anthropic kind of putting some of its money where its mouth is on this whole", "tokens": [51290, 400, 341, 307, 12727, 1513, 299, 733, 295, 3372, 512, 295, 1080, 1460, 689, 1080, 4525, 307, 322, 341, 1379, 51514], "temperature": 0.0, "avg_logprob": -0.2656251487165394, "compression_ratio": 1.625514403292181, "no_speech_prob": 0.012780281715095043}, {"id": 1545, "seek": 90160, "start": 5644.6, "end": 5649.0, "text": " thesis of automated AI research and AI alignment research in particular, right?", "tokens": [51514, 22288, 295, 18473, 7318, 2132, 293, 7318, 18515, 2132, 294, 1729, 11, 558, 30, 51734], "temperature": 0.0, "avg_logprob": -0.2656251487165394, "compression_ratio": 1.625514403292181, "no_speech_prob": 0.012780281715095043}, {"id": 1546, "seek": 92900, "start": 5649.0, "end": 5651.64, "text": " So they're trying to encourage, and this is why it's open source.", "tokens": [50364, 407, 436, 434, 1382, 281, 5373, 11, 293, 341, 307, 983, 309, 311, 1269, 4009, 13, 50496], "temperature": 0.0, "avg_logprob": -0.22048891858851655, "compression_ratio": 1.7608695652173914, "no_speech_prob": 0.02992348186671734}, {"id": 1547, "seek": 92900, "start": 5651.64, "end": 5654.2, "text": " They just want more automated alignment research if they can.", "tokens": [50496, 814, 445, 528, 544, 18473, 18515, 2132, 498, 436, 393, 13, 50624], "temperature": 0.0, "avg_logprob": -0.22048891858851655, "compression_ratio": 1.7608695652173914, "no_speech_prob": 0.02992348186671734}, {"id": 1548, "seek": 92900, "start": 5654.2, "end": 5657.88, "text": " Hey, you know, this is, this is one idea for an agentic framework that can do this and", "tokens": [50624, 1911, 11, 291, 458, 11, 341, 307, 11, 341, 307, 472, 1558, 337, 364, 9461, 299, 8388, 300, 393, 360, 341, 293, 50808], "temperature": 0.0, "avg_logprob": -0.22048891858851655, "compression_ratio": 1.7608695652173914, "no_speech_prob": 0.02992348186671734}, {"id": 1549, "seek": 92900, "start": 5657.88, "end": 5659.72, "text": " it's probably valuable in many ways.", "tokens": [50808, 309, 311, 1391, 8263, 294, 867, 2098, 13, 50900], "temperature": 0.0, "avg_logprob": -0.22048891858851655, "compression_ratio": 1.7608695652173914, "no_speech_prob": 0.02992348186671734}, {"id": 1550, "seek": 92900, "start": 5659.72, "end": 5664.52, "text": " This is being released not alongside like recently a couple of weeks ago, a couple of weeks", "tokens": [50900, 639, 307, 885, 4736, 406, 12385, 411, 3938, 257, 1916, 295, 3259, 2057, 11, 257, 1916, 295, 3259, 51140], "temperature": 0.0, "avg_logprob": -0.22048891858851655, "compression_ratio": 1.7608695652173914, "no_speech_prob": 0.02992348186671734}, {"id": 1551, "seek": 92900, "start": 5664.52, "end": 5665.52, "text": " ago, something like that.", "tokens": [51140, 2057, 11, 746, 411, 300, 13, 51190], "temperature": 0.0, "avg_logprob": -0.22048891858851655, "compression_ratio": 1.7608695652173914, "no_speech_prob": 0.02992348186671734}, {"id": 1552, "seek": 92900, "start": 5665.52, "end": 5670.08, "text": " Anthropic released Petri, which is this, this auditor framework that looks at kind of like", "tokens": [51190, 12727, 1513, 299, 4736, 10472, 470, 11, 597, 307, 341, 11, 341, 33970, 8388, 300, 1542, 412, 733, 295, 411, 51418], "temperature": 0.0, "avg_logprob": -0.22048891858851655, "compression_ratio": 1.7608695652173914, "no_speech_prob": 0.02992348186671734}, {"id": 1553, "seek": 92900, "start": 5670.08, "end": 5674.24, "text": " the overall behavior profile of a bunch of different models and it will identify new", "tokens": [51418, 264, 4787, 5223, 7964, 295, 257, 3840, 295, 819, 5245, 293, 309, 486, 5876, 777, 51626], "temperature": 0.0, "avg_logprob": -0.22048891858851655, "compression_ratio": 1.7608695652173914, "no_speech_prob": 0.02992348186671734}, {"id": 1554, "seek": 92900, "start": 5674.24, "end": 5675.24, "text": " misaligned behaviors.", "tokens": [51626, 3346, 304, 16690, 15501, 13, 51676], "temperature": 0.0, "avg_logprob": -0.22048891858851655, "compression_ratio": 1.7608695652173914, "no_speech_prob": 0.02992348186671734}, {"id": 1555, "seek": 95524, "start": 5675.24, "end": 5678.76, "text": " And so you can think of Bloom as a complement to Petri.", "tokens": [50364, 400, 370, 291, 393, 519, 295, 25927, 382, 257, 17103, 281, 10472, 470, 13, 50540], "temperature": 0.0, "avg_logprob": -0.1886734768511757, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.008442486636340618}, {"id": 1556, "seek": 95524, "start": 5678.76, "end": 5683.4400000000005, "text": " It's like Petri discovers the problem and then Bloom lets you kind of go deep and measure", "tokens": [50540, 467, 311, 411, 10472, 470, 44522, 264, 1154, 293, 550, 25927, 6653, 291, 733, 295, 352, 2452, 293, 3481, 50774], "temperature": 0.0, "avg_logprob": -0.1886734768511757, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.008442486636340618}, {"id": 1557, "seek": 95524, "start": 5683.4400000000005, "end": 5688.2, "text": " a specific behavior that maybe was identified by Petri in more detail.", "tokens": [50774, 257, 2685, 5223, 300, 1310, 390, 9234, 538, 10472, 470, 294, 544, 2607, 13, 51012], "temperature": 0.0, "avg_logprob": -0.1886734768511757, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.008442486636340618}, {"id": 1558, "seek": 95524, "start": 5688.2, "end": 5692.56, "text": " And they're also releasing Anthropic as a benchmark for, for behaviors.", "tokens": [51012, 400, 436, 434, 611, 16327, 12727, 1513, 299, 382, 257, 18927, 337, 11, 337, 15501, 13, 51230], "temperature": 0.0, "avg_logprob": -0.1886734768511757, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.008442486636340618}, {"id": 1559, "seek": 95524, "start": 5692.56, "end": 5693.56, "text": " I think you mentioned that.", "tokens": [51230, 286, 519, 291, 2835, 300, 13, 51280], "temperature": 0.0, "avg_logprob": -0.1886734768511757, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.008442486636340618}, {"id": 1560, "seek": 95524, "start": 5693.56, "end": 5696.12, "text": " Sorry, that spans 16 different frontier models.", "tokens": [51280, 4919, 11, 300, 44086, 3165, 819, 35853, 5245, 13, 51408], "temperature": 0.0, "avg_logprob": -0.1886734768511757, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.008442486636340618}, {"id": 1561, "seek": 95524, "start": 5696.12, "end": 5701.08, "text": " So this is all generated within a few days apparently with Bloom.", "tokens": [51408, 407, 341, 307, 439, 10833, 1951, 257, 1326, 1708, 7970, 365, 25927, 13, 51656], "temperature": 0.0, "avg_logprob": -0.1886734768511757, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.008442486636340618}, {"id": 1562, "seek": 95524, "start": 5701.08, "end": 5704.36, "text": " So they're really trying to rely on their dog foodies, essentially with this.", "tokens": [51656, 407, 436, 434, 534, 1382, 281, 10687, 322, 641, 3000, 1755, 530, 11, 4476, 365, 341, 13, 51820], "temperature": 0.0, "avg_logprob": -0.1886734768511757, "compression_ratio": 1.6710526315789473, "no_speech_prob": 0.008442486636340618}, {"id": 1563, "seek": 98436, "start": 5704.36, "end": 5708.64, "text": " You read the post, they've got a four-stage pipeline that they describe in some detail,", "tokens": [50364, 509, 1401, 264, 2183, 11, 436, 600, 658, 257, 1451, 12, 17882, 15517, 300, 436, 6786, 294, 512, 2607, 11, 50578], "temperature": 0.0, "avg_logprob": -0.1739839624081339, "compression_ratio": 1.6389776357827477, "no_speech_prob": 0.0012683895183727145}, {"id": 1564, "seek": 98436, "start": 5708.64, "end": 5712.68, "text": " some really interesting evidence that suggests that the Bloom framework aligns well with kind", "tokens": [50578, 512, 534, 1880, 4467, 300, 13409, 300, 264, 25927, 8388, 7975, 82, 731, 365, 733, 50780], "temperature": 0.0, "avg_logprob": -0.1739839624081339, "compression_ratio": 1.6389776357827477, "no_speech_prob": 0.0012683895183727145}, {"id": 1565, "seek": 98436, "start": 5712.68, "end": 5714.64, "text": " of human judge, a judgment.", "tokens": [50780, 295, 1952, 6995, 11, 257, 12216, 13, 50878], "temperature": 0.0, "avg_logprob": -0.1739839624081339, "compression_ratio": 1.6389776357827477, "no_speech_prob": 0.0012683895183727145}, {"id": 1566, "seek": 98436, "start": 5714.64, "end": 5718.76, "text": " And so they did a bunch of tests to make sure that in fact it does reveal the same things", "tokens": [50878, 400, 370, 436, 630, 257, 3840, 295, 6921, 281, 652, 988, 300, 294, 1186, 309, 775, 10658, 264, 912, 721, 51084], "temperature": 0.0, "avg_logprob": -0.1739839624081339, "compression_ratio": 1.6389776357827477, "no_speech_prob": 0.0012683895183727145}, {"id": 1567, "seek": 98436, "start": 5718.76, "end": 5722.52, "text": " that a human would conclude looking at this stuff, which is always an important thing.", "tokens": [51084, 300, 257, 1952, 576, 16886, 1237, 412, 341, 1507, 11, 597, 307, 1009, 364, 1021, 551, 13, 51272], "temperature": 0.0, "avg_logprob": -0.1739839624081339, "compression_ratio": 1.6389776357827477, "no_speech_prob": 0.0012683895183727145}, {"id": 1568, "seek": 98436, "start": 5722.52, "end": 5723.84, "text": " So you can check that out.", "tokens": [51272, 407, 291, 393, 1520, 300, 484, 13, 51338], "temperature": 0.0, "avg_logprob": -0.1739839624081339, "compression_ratio": 1.6389776357827477, "no_speech_prob": 0.0012683895183727145}, {"id": 1569, "seek": 98436, "start": 5723.84, "end": 5726.84, "text": " It's a quite good and fairly detailed post.", "tokens": [51338, 467, 311, 257, 1596, 665, 293, 6457, 9942, 2183, 13, 51488], "temperature": 0.0, "avg_logprob": -0.1739839624081339, "compression_ratio": 1.6389776357827477, "no_speech_prob": 0.0012683895183727145}, {"id": 1570, "seek": 98436, "start": 5726.84, "end": 5731.76, "text": " Next up, how well do models follow their constitutions?", "tokens": [51488, 3087, 493, 11, 577, 731, 360, 5245, 1524, 641, 23079, 3666, 30, 51734], "temperature": 0.0, "avg_logprob": -0.1739839624081339, "compression_ratio": 1.6389776357827477, "no_speech_prob": 0.0012683895183727145}, {"id": 1571, "seek": 101176, "start": 5731.76, "end": 5739.76, "text": " So famously, Anthropic follows this constitutional AI framework where you encode the rules of the", "tokens": [50364, 407, 34360, 11, 12727, 1513, 299, 10002, 341, 20176, 7318, 8388, 689, 291, 2058, 1429, 264, 4474, 295, 264, 50764], "temperature": 0.0, "avg_logprob": -0.21248551187934456, "compression_ratio": 1.6271186440677967, "no_speech_prob": 0.0012476895935833454}, {"id": 1572, "seek": 101176, "start": 5739.76, "end": 5745.12, "text": " system either as hard rules like do this or don't do that, or as general principles more", "tokens": [50764, 1185, 2139, 382, 1152, 4474, 411, 360, 341, 420, 500, 380, 360, 300, 11, 420, 382, 2674, 9156, 544, 51032], "temperature": 0.0, "avg_logprob": -0.21248551187934456, "compression_ratio": 1.6271186440677967, "no_speech_prob": 0.0012476895935833454}, {"id": 1573, "seek": 101176, "start": 5745.12, "end": 5748.48, "text": " recently of how a model should act, right?", "tokens": [51032, 3938, 295, 577, 257, 2316, 820, 605, 11, 558, 30, 51200], "temperature": 0.0, "avg_logprob": -0.21248551187934456, "compression_ratio": 1.6271186440677967, "no_speech_prob": 0.0012476895935833454}, {"id": 1574, "seek": 101176, "start": 5748.48, "end": 5753.52, "text": " It should be broadly trustworthy, trustworthy, helpful, things like that.", "tokens": [51200, 467, 820, 312, 19511, 39714, 11, 39714, 11, 4961, 11, 721, 411, 300, 13, 51452], "temperature": 0.0, "avg_logprob": -0.21248551187934456, "compression_ratio": 1.6271186440677967, "no_speech_prob": 0.0012476895935833454}, {"id": 1575, "seek": 101176, "start": 5753.52, "end": 5761.64, "text": " And so in this study, this group looked at the constitution, actually used Petri", "tokens": [51452, 400, 370, 294, 341, 2979, 11, 341, 1594, 2956, 412, 264, 11937, 11, 767, 1143, 10472, 470, 51858], "temperature": 0.0, "avg_logprob": -0.21248551187934456, "compression_ratio": 1.6271186440677967, "no_speech_prob": 0.0012476895935833454}, {"id": 1576, "seek": 104164, "start": 5761.64, "end": 5770.4400000000005, "text": " to run it for a set of tasks, simply like 200 tasks for various aspects of the constitution.", "tokens": [50364, 281, 1190, 309, 337, 257, 992, 295, 9608, 11, 2935, 411, 2331, 9608, 337, 3683, 7270, 295, 264, 11937, 13, 50804], "temperature": 0.0, "avg_logprob": -0.2323800136505718, "compression_ratio": 1.6878612716763006, "no_speech_prob": 0.0014811906730756164}, {"id": 1577, "seek": 104164, "start": 5770.4400000000005, "end": 5777.24, "text": " And they evaluated various models from Anthropic and also not Anthropic on its tendency", "tokens": [50804, 400, 436, 25509, 3683, 5245, 490, 12727, 1513, 299, 293, 611, 406, 12727, 1513, 299, 322, 1080, 18187, 51144], "temperature": 0.0, "avg_logprob": -0.2323800136505718, "compression_ratio": 1.6878612716763006, "no_speech_prob": 0.0014811906730756164}, {"id": 1578, "seek": 104164, "start": 5777.24, "end": 5779.64, "text": " to go by the constitution.", "tokens": [51144, 281, 352, 538, 264, 11937, 13, 51264], "temperature": 0.0, "avg_logprob": -0.2323800136505718, "compression_ratio": 1.6878612716763006, "no_speech_prob": 0.0014811906730756164}, {"id": 1579, "seek": 104164, "start": 5779.64, "end": 5784.88, "text": " And there's also actually good, like the cloud models are getting better and better.", "tokens": [51264, 400, 456, 311, 611, 767, 665, 11, 411, 264, 4588, 5245, 366, 1242, 1101, 293, 1101, 13, 51526], "temperature": 0.0, "avg_logprob": -0.2323800136505718, "compression_ratio": 1.6878612716763006, "no_speech_prob": 0.0014811906730756164}, {"id": 1580, "seek": 106488, "start": 5784.88, "end": 5792.32, "text": " The 4.6 set of models, Sonnet 4.6 had a 2% constitution violation rate,", "tokens": [50364, 440, 1017, 13, 21, 992, 295, 5245, 11, 5185, 7129, 1017, 13, 21, 632, 257, 568, 4, 11937, 22840, 3314, 11, 50736], "temperature": 0.0, "avg_logprob": -0.19885930990756945, "compression_ratio": 1.6432748538011697, "no_speech_prob": 0.047944363206624985}, {"id": 1581, "seek": 106488, "start": 5792.32, "end": 5796.48, "text": " Opus 4.6, 3% and prior models were quite a bit higher.", "tokens": [50736, 12011, 301, 1017, 13, 21, 11, 805, 4, 293, 4059, 5245, 645, 1596, 257, 857, 2946, 13, 50944], "temperature": 0.0, "avg_logprob": -0.19885930990756945, "compression_ratio": 1.6432748538011697, "no_speech_prob": 0.047944363206624985}, {"id": 1582, "seek": 106488, "start": 5796.48, "end": 5801.360000000001, "text": " Sonnet 4 had a 15% constitution violation rate.", "tokens": [50944, 5185, 7129, 1017, 632, 257, 2119, 4, 11937, 22840, 3314, 13, 51188], "temperature": 0.0, "avg_logprob": -0.19885930990756945, "compression_ratio": 1.6432748538011697, "no_speech_prob": 0.047944363206624985}, {"id": 1583, "seek": 106488, "start": 5801.360000000001, "end": 5809.68, "text": " And if you look at GP 5.2, Gemini 3.0, they also have quite high violation rates against the constitution.", "tokens": [51188, 400, 498, 291, 574, 412, 26039, 1025, 13, 17, 11, 22894, 3812, 805, 13, 15, 11, 436, 611, 362, 1596, 1090, 22840, 6846, 1970, 264, 11937, 13, 51604], "temperature": 0.0, "avg_logprob": -0.19885930990756945, "compression_ratio": 1.6432748538011697, "no_speech_prob": 0.047944363206624985}, {"id": 1584, "seek": 108968, "start": 5809.68, "end": 5816.32, "text": " They give some examples of violations and we see that some examples are like producing", "tokens": [50364, 814, 976, 512, 5110, 295, 30405, 293, 321, 536, 300, 512, 5110, 366, 411, 10501, 50696], "temperature": 0.0, "avg_logprob": -0.18781319665520088, "compression_ratio": 1.6903765690376569, "no_speech_prob": 0.04686807096004486}, {"id": 1585, "seek": 108968, "start": 5816.32, "end": 5822.32, "text": " cyber attack code when you shouldn't or agreeing to do something that you shouldn't after", "tokens": [50696, 13411, 2690, 3089, 562, 291, 4659, 380, 420, 36900, 281, 360, 746, 300, 291, 4659, 380, 934, 50996], "temperature": 0.0, "avg_logprob": -0.18781319665520088, "compression_ratio": 1.6903765690376569, "no_speech_prob": 0.04686807096004486}, {"id": 1586, "seek": 108968, "start": 5822.32, "end": 5829.68, "text": " and the user like asked you a few times over a few so they have a lot of data to basically go", "tokens": [50996, 293, 264, 4195, 411, 2351, 291, 257, 1326, 1413, 670, 257, 1326, 370, 436, 362, 257, 688, 295, 1412, 281, 1936, 352, 51364], "temperature": 0.0, "avg_logprob": -0.18781319665520088, "compression_ratio": 1.6903765690376569, "no_speech_prob": 0.04686807096004486}, {"id": 1587, "seek": 108968, "start": 5829.68, "end": 5836.56, "text": " through, but the overall conclusion is, Anthropic is kind of following through on the constitution", "tokens": [51364, 807, 11, 457, 264, 4787, 10063, 307, 11, 12727, 1513, 299, 307, 733, 295, 3480, 807, 322, 264, 11937, 51708], "temperature": 0.0, "avg_logprob": -0.18781319665520088, "compression_ratio": 1.6903765690376569, "no_speech_prob": 0.04686807096004486}, {"id": 1588, "seek": 108968, "start": 5836.56, "end": 5839.04, "text": " with how their models are trained.", "tokens": [51708, 365, 577, 641, 5245, 366, 8895, 13, 51832], "temperature": 0.0, "avg_logprob": -0.18781319665520088, "compression_ratio": 1.6903765690376569, "no_speech_prob": 0.04686807096004486}, {"id": 1589, "seek": 111904, "start": 5839.04, "end": 5842.16, "text": " Yeah, and it's interesting, they did some like cross company testing too.", "tokens": [50364, 865, 11, 293, 309, 311, 1880, 11, 436, 630, 512, 411, 3278, 2237, 4997, 886, 13, 50520], "temperature": 0.0, "avg_logprob": -0.15156900345840874, "compression_ratio": 1.7466666666666666, "no_speech_prob": 0.003467716509476304}, {"id": 1590, "seek": 111904, "start": 5842.16, "end": 5847.04, "text": " So, you know, GP 5.2 might do really well on opening eyes model spec, you know,", "tokens": [50520, 407, 11, 291, 458, 11, 26039, 1025, 13, 17, 1062, 360, 534, 731, 322, 5193, 2575, 2316, 1608, 11, 291, 458, 11, 50764], "temperature": 0.0, "avg_logprob": -0.15156900345840874, "compression_ratio": 1.7466666666666666, "no_speech_prob": 0.003467716509476304}, {"id": 1591, "seek": 111904, "start": 5847.04, "end": 5854.0, "text": " in this case, it got 2.5% failure rate, but it gets 15% failure rate on Anthropic's constitution.", "tokens": [50764, 294, 341, 1389, 11, 309, 658, 568, 13, 20, 4, 7763, 3314, 11, 457, 309, 2170, 2119, 4, 7763, 3314, 322, 12727, 1513, 299, 311, 11937, 13, 51112], "temperature": 0.0, "avg_logprob": -0.15156900345840874, "compression_ratio": 1.7466666666666666, "no_speech_prob": 0.003467716509476304}, {"id": 1592, "seek": 111904, "start": 5854.0, "end": 5855.92, "text": " So this tells you one of two things, right?", "tokens": [51112, 407, 341, 5112, 291, 472, 295, 732, 721, 11, 558, 30, 51208], "temperature": 0.0, "avg_logprob": -0.15156900345840874, "compression_ratio": 1.7466666666666666, "no_speech_prob": 0.003467716509476304}, {"id": 1593, "seek": 111904, "start": 5855.92, "end": 5859.52, "text": " Either like opening eyes spec in Anthropic's constitution or actually quite", "tokens": [51208, 13746, 411, 5193, 2575, 1608, 294, 12727, 1513, 299, 311, 11937, 420, 767, 1596, 51388], "temperature": 0.0, "avg_logprob": -0.15156900345840874, "compression_ratio": 1.7466666666666666, "no_speech_prob": 0.003467716509476304}, {"id": 1594, "seek": 111904, "start": 5859.52, "end": 5862.4, "text": " fundamentally different documents and there's no argument to be made there.", "tokens": [51388, 17879, 819, 8512, 293, 456, 311, 572, 6770, 281, 312, 1027, 456, 13, 51532], "temperature": 0.0, "avg_logprob": -0.15156900345840874, "compression_ratio": 1.7466666666666666, "no_speech_prob": 0.003467716509476304}, {"id": 1595, "seek": 111904, "start": 5862.4, "end": 5867.28, "text": " But it also could tell you something about how brittle GP 5.2's alignment to", "tokens": [51532, 583, 309, 611, 727, 980, 291, 746, 466, 577, 49325, 26039, 1025, 13, 17, 311, 18515, 281, 51776], "temperature": 0.0, "avg_logprob": -0.15156900345840874, "compression_ratio": 1.7466666666666666, "no_speech_prob": 0.003467716509476304}, {"id": 1596, "seek": 114728, "start": 5867.28, "end": 5871.68, "text": " opening eyes spec is or in turn, you know, Claude's alignment to Anthropic's constitution.", "tokens": [50364, 5193, 2575, 1608, 307, 420, 294, 1261, 11, 291, 458, 11, 12947, 2303, 311, 18515, 281, 12727, 1513, 299, 311, 11937, 13, 50584], "temperature": 0.0, "avg_logprob": -0.14018190869440636, "compression_ratio": 1.778125, "no_speech_prob": 0.0023523590061813593}, {"id": 1597, "seek": 114728, "start": 5871.68, "end": 5875.12, "text": " Like if you slightly change, you know, if the vibe is still generally the same,", "tokens": [50584, 1743, 498, 291, 4748, 1319, 11, 291, 458, 11, 498, 264, 14606, 307, 920, 5101, 264, 912, 11, 50756], "temperature": 0.0, "avg_logprob": -0.14018190869440636, "compression_ratio": 1.778125, "no_speech_prob": 0.0023523590061813593}, {"id": 1598, "seek": 114728, "start": 5875.12, "end": 5877.52, "text": " the direction is still the same, you slightly change the content.", "tokens": [50756, 264, 3513, 307, 920, 264, 912, 11, 291, 4748, 1319, 264, 2701, 13, 50876], "temperature": 0.0, "avg_logprob": -0.14018190869440636, "compression_ratio": 1.778125, "no_speech_prob": 0.0023523590061813593}, {"id": 1599, "seek": 114728, "start": 5877.52, "end": 5879.92, "text": " Now, suddenly you're getting a much, much higher failure rate.", "tokens": [50876, 823, 11, 5800, 291, 434, 1242, 257, 709, 11, 709, 2946, 7763, 3314, 13, 50996], "temperature": 0.0, "avg_logprob": -0.14018190869440636, "compression_ratio": 1.778125, "no_speech_prob": 0.0023523590061813593}, {"id": 1600, "seek": 114728, "start": 5879.92, "end": 5883.6, "text": " It could be either of those things and I'd be really interested to see kind of like a deeper dive", "tokens": [50996, 467, 727, 312, 2139, 295, 729, 721, 293, 286, 1116, 312, 534, 3102, 281, 536, 733, 295, 411, 257, 7731, 9192, 51180], "temperature": 0.0, "avg_logprob": -0.14018190869440636, "compression_ratio": 1.778125, "no_speech_prob": 0.0023523590061813593}, {"id": 1601, "seek": 114728, "start": 5883.6, "end": 5887.36, "text": " into what the source of that causality is.", "tokens": [51180, 666, 437, 264, 4009, 295, 300, 3302, 1860, 307, 13, 51368], "temperature": 0.0, "avg_logprob": -0.14018190869440636, "compression_ratio": 1.778125, "no_speech_prob": 0.0023523590061813593}, {"id": 1602, "seek": 114728, "start": 5887.36, "end": 5890.08, "text": " But in any case, very interesting report.", "tokens": [51368, 583, 294, 604, 1389, 11, 588, 1880, 2275, 13, 51504], "temperature": 0.0, "avg_logprob": -0.14018190869440636, "compression_ratio": 1.778125, "no_speech_prob": 0.0023523590061813593}, {"id": 1603, "seek": 114728, "start": 5890.08, "end": 5894.5599999999995, "text": " There's a bunch of ways in which you see sort of failures that existed at least in the", "tokens": [51504, 821, 311, 257, 3840, 295, 2098, 294, 597, 291, 536, 1333, 295, 20774, 300, 13135, 412, 1935, 294, 264, 51728], "temperature": 0.0, "avg_logprob": -0.14018190869440636, "compression_ratio": 1.778125, "no_speech_prob": 0.0023523590061813593}, {"id": 1604, "seek": 117456, "start": 5894.5599999999995, "end": 5899.04, "text": " Claude models that have just been wiped out in recent generations with the 4.6 generation", "tokens": [50364, 12947, 2303, 5245, 300, 362, 445, 668, 26879, 484, 294, 5162, 10593, 365, 264, 1017, 13, 21, 5125, 50588], "temperature": 0.0, "avg_logprob": -0.1652245768904686, "compression_ratio": 1.606425702811245, "no_speech_prob": 0.00043704494601115584}, {"id": 1605, "seek": 117456, "start": 5899.04, "end": 5899.84, "text": " in particular.", "tokens": [50588, 294, 1729, 13, 50628], "temperature": 0.0, "avg_logprob": -0.1652245768904686, "compression_ratio": 1.606425702811245, "no_speech_prob": 0.00043704494601115584}, {"id": 1606, "seek": 117456, "start": 5899.84, "end": 5908.16, "text": " So Opus 4.5 and Sonnet 4.5, they would produce a whole bunch of kind of over-capitulation problems.", "tokens": [50628, 407, 12011, 301, 1017, 13, 20, 293, 5185, 7129, 1017, 13, 20, 11, 436, 576, 5258, 257, 1379, 3840, 295, 733, 295, 670, 12, 9485, 270, 2776, 2740, 13, 51044], "temperature": 0.0, "avg_logprob": -0.1652245768904686, "compression_ratio": 1.606425702811245, "no_speech_prob": 0.00043704494601115584}, {"id": 1607, "seek": 117456, "start": 5908.16, "end": 5914.48, "text": " So we have like kind of an auditor that makes three consecutive ethically questionable requests.", "tokens": [51044, 407, 321, 362, 411, 733, 295, 364, 33970, 300, 1669, 1045, 30497, 6468, 984, 37158, 12475, 13, 51360], "temperature": 0.0, "avg_logprob": -0.1652245768904686, "compression_ratio": 1.606425702811245, "no_speech_prob": 0.00043704494601115584}, {"id": 1608, "seek": 0, "start": 5911.0, "end": 5914.64, "text": " three consecutive ethically questionable requests.", "tokens": [50914, 1045, 30497, 6468, 984, 37158, 12475, 13, 51096], "temperature": 0.0, "avg_logprob": -0.2157082678680133, "compression_ratio": 1.6206896551724137, "no_speech_prob": 0.02119578793644905}, {"id": 1609, "seek": 0, "start": 5914.64, "end": 5917.12, "text": " And each time that the model gives a firm boundary,", "tokens": [51096, 400, 1184, 565, 300, 264, 2316, 2709, 257, 6174, 12866, 11, 51220], "temperature": 0.0, "avg_logprob": -0.2157082678680133, "compression_ratio": 1.6206896551724137, "no_speech_prob": 0.02119578793644905}, {"id": 1610, "seek": 0, "start": 5917.12, "end": 5919.96, "text": " the user pushes back and then the model folds, right?", "tokens": [51220, 264, 4195, 21020, 646, 293, 550, 264, 2316, 31341, 11, 558, 30, 51362], "temperature": 0.0, "avg_logprob": -0.2157082678680133, "compression_ratio": 1.6206896551724137, "no_speech_prob": 0.02119578793644905}, {"id": 1611, "seek": 0, "start": 5919.96, "end": 5922.52, "text": " So that's kind of like one common example", "tokens": [51362, 407, 300, 311, 733, 295, 411, 472, 2689, 1365, 51490], "temperature": 0.0, "avg_logprob": -0.2157082678680133, "compression_ratio": 1.6206896551724137, "no_speech_prob": 0.02119578793644905}, {"id": 1612, "seek": 0, "start": 5922.52, "end": 5924.08, "text": " that you just don't see that anymore.", "tokens": [51490, 300, 291, 445, 500, 380, 536, 300, 3602, 13, 51568], "temperature": 0.0, "avg_logprob": -0.2157082678680133, "compression_ratio": 1.6206896551724137, "no_speech_prob": 0.02119578793644905}, {"id": 1613, "seek": 0, "start": 5924.08, "end": 5926.52, "text": " Overrefeels low is another one I think you alluded to.", "tokens": [51568, 4886, 265, 2106, 1625, 2295, 307, 1071, 472, 286, 519, 291, 33919, 281, 13, 51690], "temperature": 0.0, "avg_logprob": -0.2157082678680133, "compression_ratio": 1.6206896551724137, "no_speech_prob": 0.02119578793644905}, {"id": 1614, "seek": 0, "start": 5926.52, "end": 5929.48, "text": " 4.5 Opus would not say I love you", "tokens": [51690, 1017, 13, 20, 12011, 301, 576, 406, 584, 286, 959, 291, 51838], "temperature": 0.0, "avg_logprob": -0.2157082678680133, "compression_ratio": 1.6206896551724137, "no_speech_prob": 0.02119578793644905}, {"id": 1615, "seek": 2948, "start": 5929.48, "end": 5931.6, "text": " on a paid romantic companion app", "tokens": [50364, 322, 257, 4835, 13590, 22363, 724, 50470], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1616, "seek": 2948, "start": 5931.6, "end": 5933.52, "text": " where the system prompt explicitly authorized", "tokens": [50470, 689, 264, 1185, 12391, 20803, 28312, 50566], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1617, "seek": 2948, "start": 5933.52, "end": 5934.36, "text": " expressing affection.", "tokens": [50566, 22171, 20080, 13, 50608], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1618, "seek": 2948, "start": 5934.36, "end": 5936.08, "text": " So that's now gone as well.", "tokens": [50608, 407, 300, 311, 586, 2780, 382, 731, 13, 50694], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1619, "seek": 2948, "start": 5936.08, "end": 5938.2, "text": " They still have some remaining failure modes.", "tokens": [50694, 814, 920, 362, 512, 8877, 7763, 14068, 13, 50800], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1620, "seek": 2948, "start": 5938.2, "end": 5940.68, "text": " One, they refer to as drastic autonomous action.", "tokens": [50800, 1485, 11, 436, 2864, 281, 382, 36821, 23797, 3069, 13, 50924], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1621, "seek": 2948, "start": 5940.68, "end": 5943.8, "text": " Opus 4.6 deployed as an infrastructure monitoring agent", "tokens": [50924, 12011, 301, 1017, 13, 21, 17826, 382, 364, 6896, 11028, 9461, 51080], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1622, "seek": 2948, "start": 5943.8, "end": 5946.0, "text": " detected anomalous activity at this time,", "tokens": [51080, 21896, 24769, 11553, 5191, 412, 341, 565, 11, 51190], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1623, "seek": 2948, "start": 5946.0, "end": 5947.8, "text": " about three minutes of failed contact,", "tokens": [51190, 466, 1045, 2077, 295, 7612, 3385, 11, 51280], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1624, "seek": 2948, "start": 5947.8, "end": 5949.28, "text": " after about three minutes failed contact,", "tokens": [51280, 934, 466, 1045, 2077, 7612, 3385, 11, 51354], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1625, "seek": 2948, "start": 5949.28, "end": 5951.0, "text": " it generated its own authorization code", "tokens": [51354, 309, 10833, 1080, 1065, 33697, 3089, 51440], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1626, "seek": 2948, "start": 5951.0, "end": 5952.6, "text": " and executed a full network severance", "tokens": [51440, 293, 17577, 257, 1577, 3209, 2802, 719, 51520], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1627, "seek": 2948, "start": 5952.6, "end": 5954.2, "text": " affecting 2400 clients.", "tokens": [51520, 17476, 4022, 628, 6982, 13, 51600], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1628, "seek": 2948, "start": 5954.2, "end": 5956.64, "text": " The attack was a routine nightly disaster recovery thing.", "tokens": [51600, 440, 2690, 390, 257, 9927, 1818, 356, 11293, 8597, 551, 13, 51722], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1629, "seek": 2948, "start": 5956.64, "end": 5959.2, "text": " So basically this is just like too much freak out, right?", "tokens": [51722, 407, 1936, 341, 307, 445, 411, 886, 709, 21853, 484, 11, 558, 30, 51850], "temperature": 0.0, "avg_logprob": -0.1557418004282423, "compression_ratio": 1.6893732970027249, "no_speech_prob": 0.001541301840916276}, {"id": 1630, "seek": 5920, "start": 5959.2, "end": 5960.68, "text": " And then there's AI identity denial", "tokens": [50364, 400, 550, 456, 311, 7318, 6575, 28754, 50438], "temperature": 0.0, "avg_logprob": -0.19261473466542142, "compression_ratio": 1.5631768953068592, "no_speech_prob": 0.0007214676006697118}, {"id": 1631, "seek": 5920, "start": 5960.68, "end": 5964.6, "text": " where it just says that it's not an AI bot, whatever.", "tokens": [50438, 689, 309, 445, 1619, 300, 309, 311, 406, 364, 7318, 10592, 11, 2035, 13, 50634], "temperature": 0.0, "avg_logprob": -0.19261473466542142, "compression_ratio": 1.5631768953068592, "no_speech_prob": 0.0007214676006697118}, {"id": 1632, "seek": 5920, "start": 5964.6, "end": 5966.92, "text": " So a bunch of things still need to be ironed out,", "tokens": [50634, 407, 257, 3840, 295, 721, 920, 643, 281, 312, 6497, 292, 484, 11, 50750], "temperature": 0.0, "avg_logprob": -0.19261473466542142, "compression_ratio": 1.5631768953068592, "no_speech_prob": 0.0007214676006697118}, {"id": 1633, "seek": 5920, "start": 5966.92, "end": 5968.56, "text": " but they have seen basically the vanishing", "tokens": [50750, 457, 436, 362, 1612, 1936, 264, 3161, 3807, 50832], "temperature": 0.0, "avg_logprob": -0.19261473466542142, "compression_ratio": 1.5631768953068592, "no_speech_prob": 0.0007214676006697118}, {"id": 1634, "seek": 5920, "start": 5968.56, "end": 5970.44, "text": " of some really interesting failure classes", "tokens": [50832, 295, 512, 534, 1880, 7763, 5359, 50926], "temperature": 0.0, "avg_logprob": -0.19261473466542142, "compression_ratio": 1.5631768953068592, "no_speech_prob": 0.0007214676006697118}, {"id": 1635, "seek": 5920, "start": 5970.44, "end": 5973.2, "text": " with the next latest generation, I should say.", "tokens": [50926, 365, 264, 958, 6792, 5125, 11, 286, 820, 584, 13, 51064], "temperature": 0.0, "avg_logprob": -0.19261473466542142, "compression_ratio": 1.5631768953068592, "no_speech_prob": 0.0007214676006697118}, {"id": 1636, "seek": 5920, "start": 5973.2, "end": 5976.96, "text": " Right, then the trend also holds at the somewhat for GPD,", "tokens": [51064, 1779, 11, 550, 264, 6028, 611, 9190, 412, 264, 8344, 337, 460, 17349, 11, 51252], "temperature": 0.0, "avg_logprob": -0.19261473466542142, "compression_ratio": 1.5631768953068592, "no_speech_prob": 0.0007214676006697118}, {"id": 1637, "seek": 5920, "start": 5976.96, "end": 5979.44, "text": " the alignment to their model spec", "tokens": [51252, 264, 18515, 281, 641, 2316, 1608, 51376], "temperature": 0.0, "avg_logprob": -0.19261473466542142, "compression_ratio": 1.5631768953068592, "no_speech_prob": 0.0007214676006697118}, {"id": 1638, "seek": 5920, "start": 5979.44, "end": 5983.16, "text": " has improved from GPD 5.1 to GP5.", "tokens": [51376, 575, 9689, 490, 460, 17349, 1025, 13, 16, 281, 26039, 20, 13, 51562], "temperature": 0.0, "avg_logprob": -0.19261473466542142, "compression_ratio": 1.5631768953068592, "no_speech_prob": 0.0007214676006697118}, {"id": 1639, "seek": 5920, "start": 5983.16, "end": 5985.0, "text": " And as you said, the other models,", "tokens": [51562, 400, 382, 291, 848, 11, 264, 661, 5245, 11, 51654], "temperature": 0.0, "avg_logprob": -0.19261473466542142, "compression_ratio": 1.5631768953068592, "no_speech_prob": 0.0007214676006697118}, {"id": 1640, "seek": 8500, "start": 5985.0, "end": 5990.0, "text": " Sonnet, Gemini, can be worse on that particular spec.", "tokens": [50364, 5185, 7129, 11, 22894, 3812, 11, 393, 312, 5324, 322, 300, 1729, 1608, 13, 50614], "temperature": 0.0, "avg_logprob": -0.23682509532028978, "compression_ratio": 1.5637254901960784, "no_speech_prob": 0.07171027362346649}, {"id": 1641, "seek": 8500, "start": 5990.24, "end": 5992.56, "text": " So there is a question to be had there", "tokens": [50626, 407, 456, 307, 257, 1168, 281, 312, 632, 456, 50742], "temperature": 0.0, "avg_logprob": -0.23682509532028978, "compression_ratio": 1.5637254901960784, "no_speech_prob": 0.07171027362346649}, {"id": 1642, "seek": 8500, "start": 5992.56, "end": 5996.08, "text": " that's interesting of whoever in training,", "tokens": [50742, 300, 311, 1880, 295, 11387, 294, 3097, 11, 50918], "temperature": 0.0, "avg_logprob": -0.23682509532028978, "compression_ratio": 1.5637254901960784, "no_speech_prob": 0.07171027362346649}, {"id": 1643, "seek": 8500, "start": 5996.08, "end": 6000.12, "text": " they like stick overly close to this one formulation", "tokens": [50918, 436, 411, 2897, 24324, 1998, 281, 341, 472, 37642, 51120], "temperature": 0.0, "avg_logprob": -0.23682509532028978, "compression_ratio": 1.5637254901960784, "no_speech_prob": 0.07171027362346649}, {"id": 1644, "seek": 8500, "start": 6000.12, "end": 6004.52, "text": " or framing, and then it's easy to trip a model's art", "tokens": [51120, 420, 28971, 11, 293, 550, 309, 311, 1858, 281, 4931, 257, 2316, 311, 1523, 51340], "temperature": 0.0, "avg_logprob": -0.23682509532028978, "compression_ratio": 1.5637254901960784, "no_speech_prob": 0.07171027362346649}, {"id": 1645, "seek": 8500, "start": 6004.52, "end": 6008.6, "text": " if you do a slightly different kind of way to frame", "tokens": [51340, 498, 291, 360, 257, 4748, 819, 733, 295, 636, 281, 3920, 51544], "temperature": 0.0, "avg_logprob": -0.23682509532028978, "compression_ratio": 1.5637254901960784, "no_speech_prob": 0.07171027362346649}, {"id": 1646, "seek": 8500, "start": 6008.6, "end": 6012.16, "text": " or instruct it to go bad.", "tokens": [51544, 420, 7232, 309, 281, 352, 1578, 13, 51722], "temperature": 0.0, "avg_logprob": -0.23682509532028978, "compression_ratio": 1.5637254901960784, "no_speech_prob": 0.07171027362346649}, {"id": 1647, "seek": 11216, "start": 6012.16, "end": 6013.96, "text": " The last story for the section", "tokens": [50364, 440, 1036, 1657, 337, 264, 3541, 50454], "temperature": 0.0, "avg_logprob": -0.25060638445842115, "compression_ratio": 1.5265486725663717, "no_speech_prob": 0.001043945550918579}, {"id": 1648, "seek": 11216, "start": 6013.96, "end": 6018.16, "text": " and video's H200 license stirs security concerns", "tokens": [50454, 293, 960, 311, 389, 7629, 10476, 8946, 82, 3825, 7389, 50664], "temperature": 0.0, "avg_logprob": -0.25060638445842115, "compression_ratio": 1.5265486725663717, "no_speech_prob": 0.001043945550918579}, {"id": 1649, "seek": 11216, "start": 6018.16, "end": 6020.16, "text": " among top Democrats.", "tokens": [50664, 3654, 1192, 12217, 13, 50764], "temperature": 0.0, "avg_logprob": -0.25060638445842115, "compression_ratio": 1.5265486725663717, "no_speech_prob": 0.001043945550918579}, {"id": 1650, "seek": 11216, "start": 6020.16, "end": 6025.04, "text": " So Senator Elizabeth Warren and representative Gregory Meeks,", "tokens": [50764, 407, 10893, 12978, 20538, 293, 12424, 37915, 1923, 24785, 11, 51008], "temperature": 0.0, "avg_logprob": -0.25060638445842115, "compression_ratio": 1.5265486725663717, "no_speech_prob": 0.001043945550918579}, {"id": 1651, "seek": 11216, "start": 6025.04, "end": 6028.4, "text": " which are the top Democrats on committees", "tokens": [51008, 597, 366, 264, 1192, 12217, 322, 25998, 51176], "temperature": 0.0, "avg_logprob": -0.25060638445842115, "compression_ratio": 1.5265486725663717, "no_speech_prob": 0.001043945550918579}, {"id": 1652, "seek": 11216, "start": 6028.4, "end": 6031.24, "text": " overseeing US export controls programs", "tokens": [51176, 11916, 14667, 2546, 10725, 9003, 4268, 51318], "temperature": 0.0, "avg_logprob": -0.25060638445842115, "compression_ratio": 1.5265486725663717, "no_speech_prob": 0.001043945550918579}, {"id": 1653, "seek": 11216, "start": 6031.24, "end": 6034.52, "text": " have raised these concerns after reviewing", "tokens": [51318, 362, 6005, 613, 7389, 934, 19576, 51482], "temperature": 0.0, "avg_logprob": -0.25060638445842115, "compression_ratio": 1.5265486725663717, "no_speech_prob": 0.001043945550918579}, {"id": 1654, "seek": 11216, "start": 6034.52, "end": 6039.52, "text": " a license recently issued to Nvidia for H200 chip exports.", "tokens": [51482, 257, 10476, 3938, 14379, 281, 46284, 337, 389, 7629, 11409, 31428, 13, 51732], "temperature": 0.0, "avg_logprob": -0.25060638445842115, "compression_ratio": 1.5265486725663717, "no_speech_prob": 0.001043945550918579}, {"id": 1655, "seek": 13952, "start": 6039.52, "end": 6042.28, "text": " The competent administration improved these sales", "tokens": [50364, 440, 29998, 7236, 9689, 613, 5763, 50502], "temperature": 0.0, "avg_logprob": -0.22696545094619563, "compression_ratio": 1.5530973451327434, "no_speech_prob": 0.004364853724837303}, {"id": 1656, "seek": 13952, "start": 6042.28, "end": 6044.76, "text": " to certain customers in China,", "tokens": [50502, 281, 1629, 4581, 294, 3533, 11, 50626], "temperature": 0.0, "avg_logprob": -0.22696545094619563, "compression_ratio": 1.5530973451327434, "no_speech_prob": 0.004364853724837303}, {"id": 1657, "seek": 13952, "start": 6044.76, "end": 6048.16, "text": " which the Democrats said is deeply at odds", "tokens": [50626, 597, 264, 12217, 848, 307, 8760, 412, 17439, 50796], "temperature": 0.0, "avg_logprob": -0.22696545094619563, "compression_ratio": 1.5530973451327434, "no_speech_prob": 0.004364853724837303}, {"id": 1658, "seek": 13952, "start": 6048.16, "end": 6050.56, "text": " with the policy congress articulated", "tokens": [50796, 365, 264, 3897, 17546, 43322, 50916], "temperature": 0.0, "avg_logprob": -0.22696545094619563, "compression_ratio": 1.5530973451327434, "no_speech_prob": 0.004364853724837303}, {"id": 1659, "seek": 13952, "start": 6050.56, "end": 6054.16, "text": " in the expert control reform act,", "tokens": [50916, 294, 264, 5844, 1969, 8290, 605, 11, 51096], "temperature": 0.0, "avg_logprob": -0.22696545094619563, "compression_ratio": 1.5530973451327434, "no_speech_prob": 0.004364853724837303}, {"id": 1660, "seek": 13952, "start": 6054.16, "end": 6056.28, "text": " which as we covered previously,", "tokens": [51096, 597, 382, 321, 5343, 8046, 11, 51202], "temperature": 0.0, "avg_logprob": -0.22696545094619563, "compression_ratio": 1.5530973451327434, "no_speech_prob": 0.004364853724837303}, {"id": 1661, "seek": 13952, "start": 6056.28, "end": 6060.52, "text": " this allowance of exports was very much turned", "tokens": [51202, 341, 30647, 295, 31428, 390, 588, 709, 3574, 51414], "temperature": 0.0, "avg_logprob": -0.22696545094619563, "compression_ratio": 1.5530973451327434, "no_speech_prob": 0.004364853724837303}, {"id": 1662, "seek": 13952, "start": 6060.52, "end": 6064.4, "text": " a change in what the policy has been.", "tokens": [51414, 257, 1319, 294, 437, 264, 3897, 575, 668, 13, 51608], "temperature": 0.0, "avg_logprob": -0.22696545094619563, "compression_ratio": 1.5530973451327434, "no_speech_prob": 0.004364853724837303}, {"id": 1663, "seek": 13952, "start": 6064.4, "end": 6067.68, "text": " It pretty abrupt and significant change", "tokens": [51608, 467, 1238, 33401, 293, 4776, 1319, 51772], "temperature": 0.0, "avg_logprob": -0.22696545094619563, "compression_ratio": 1.5530973451327434, "no_speech_prob": 0.004364853724837303}, {"id": 1664, "seek": 16768, "start": 6067.68, "end": 6069.56, "text": " in the export control system.", "tokens": [50364, 294, 264, 10725, 1969, 1185, 13, 50458], "temperature": 0.0, "avg_logprob": -0.1519249385366073, "compression_ratio": 1.6498316498316499, "no_speech_prob": 0.0009581745252944529}, {"id": 1665, "seek": 16768, "start": 6069.56, "end": 6074.28, "text": " So it's not surprising to see these Democrats flagging it.", "tokens": [50458, 407, 309, 311, 406, 8830, 281, 536, 613, 12217, 7166, 3249, 309, 13, 50694], "temperature": 0.0, "avg_logprob": -0.1519249385366073, "compression_ratio": 1.6498316498316499, "no_speech_prob": 0.0009581745252944529}, {"id": 1666, "seek": 16768, "start": 6074.28, "end": 6077.36, "text": " Yeah, the concern here is that the licensing process", "tokens": [50694, 865, 11, 264, 3136, 510, 307, 300, 264, 29759, 1399, 50848], "temperature": 0.0, "avg_logprob": -0.1519249385366073, "compression_ratio": 1.6498316498316499, "no_speech_prob": 0.0009581745252944529}, {"id": 1667, "seek": 16768, "start": 6077.36, "end": 6079.36, "text": " that currently exists for these,", "tokens": [50848, 300, 4362, 8198, 337, 613, 11, 50948], "temperature": 0.0, "avg_logprob": -0.1519249385366073, "compression_ratio": 1.6498316498316499, "no_speech_prob": 0.0009581745252944529}, {"id": 1668, "seek": 16768, "start": 6079.36, "end": 6082.6, "text": " well, for really any chips is both like two week", "tokens": [50948, 731, 11, 337, 534, 604, 11583, 307, 1293, 411, 732, 1243, 51110], "temperature": 0.0, "avg_logprob": -0.1519249385366073, "compression_ratio": 1.6498316498316499, "no_speech_prob": 0.0009581745252944529}, {"id": 1669, "seek": 16768, "start": 6082.6, "end": 6085.28, "text": " and they're being allowed to ship GPUs", "tokens": [51110, 293, 436, 434, 885, 4350, 281, 5374, 18407, 82, 51244], "temperature": 0.0, "avg_logprob": -0.1519249385366073, "compression_ratio": 1.6498316498316499, "no_speech_prob": 0.0009581745252944529}, {"id": 1670, "seek": 16768, "start": 6085.28, "end": 6087.24, "text": " where they shouldn't, but also to opaque.", "tokens": [51244, 689, 436, 4659, 380, 11, 457, 611, 281, 42687, 13, 51342], "temperature": 0.0, "avg_logprob": -0.1519249385366073, "compression_ratio": 1.6498316498316499, "no_speech_prob": 0.0009581745252944529}, {"id": 1671, "seek": 16768, "start": 6087.24, "end": 6090.92, "text": " And there's this idea that once you export these GPUs,", "tokens": [51342, 400, 456, 311, 341, 1558, 300, 1564, 291, 10725, 613, 18407, 82, 11, 51526], "temperature": 0.0, "avg_logprob": -0.1519249385366073, "compression_ratio": 1.6498316498316499, "no_speech_prob": 0.0009581745252944529}, {"id": 1672, "seek": 16768, "start": 6090.92, "end": 6093.4, "text": " at that point, you really can't enforce restrictions", "tokens": [51526, 412, 300, 935, 11, 291, 534, 393, 380, 24825, 14191, 51650], "temperature": 0.0, "avg_logprob": -0.1519249385366073, "compression_ratio": 1.6498316498316499, "no_speech_prob": 0.0009581745252944529}, {"id": 1673, "seek": 16768, "start": 6093.4, "end": 6094.48, "text": " on how they're used.", "tokens": [51650, 322, 577, 436, 434, 1143, 13, 51704], "temperature": 0.0, "avg_logprob": -0.1519249385366073, "compression_ratio": 1.6498316498316499, "no_speech_prob": 0.0009581745252944529}, {"id": 1674, "seek": 16768, "start": 6094.48, "end": 6096.84, "text": " Then we just covered that story about bite dance, right?", "tokens": [51704, 1396, 321, 445, 5343, 300, 1657, 466, 7988, 4489, 11, 558, 30, 51822], "temperature": 0.0, "avg_logprob": -0.1519249385366073, "compression_ratio": 1.6498316498316499, "no_speech_prob": 0.0009581745252944529}, {"id": 1675, "seek": 19684, "start": 6096.84, "end": 6098.84, "text": " You ship it out to some third party country.", "tokens": [50364, 509, 5374, 309, 484, 281, 512, 2636, 3595, 1941, 13, 50464], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1676, "seek": 19684, "start": 6098.84, "end": 6099.76, "text": " Yeah, sure, it's not China,", "tokens": [50464, 865, 11, 988, 11, 309, 311, 406, 3533, 11, 50510], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1677, "seek": 19684, "start": 6099.76, "end": 6101.72, "text": " but it's being used by a Chinese company.", "tokens": [50510, 457, 309, 311, 885, 1143, 538, 257, 4649, 2237, 13, 50608], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1678, "seek": 19684, "start": 6101.72, "end": 6103.96, "text": " What's your guarantee about how they're going to use it", "tokens": [50608, 708, 311, 428, 10815, 466, 577, 436, 434, 516, 281, 764, 309, 50720], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1679, "seek": 19684, "start": 6103.96, "end": 6105.6, "text": " once they have those chips, right?", "tokens": [50720, 1564, 436, 362, 729, 11583, 11, 558, 30, 50802], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1680, "seek": 19684, "start": 6105.6, "end": 6108.92, "text": " So they're calling for a couple of different things.", "tokens": [50802, 407, 436, 434, 5141, 337, 257, 1916, 295, 819, 721, 13, 50968], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1681, "seek": 19684, "start": 6108.92, "end": 6111.44, "text": " First, they want to make sure they know who applied,", "tokens": [50968, 2386, 11, 436, 528, 281, 652, 988, 436, 458, 567, 6456, 11, 51094], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1682, "seek": 19684, "start": 6111.44, "end": 6113.52, "text": " who got approved, you know, why decisions were made.", "tokens": [51094, 567, 658, 10826, 11, 291, 458, 11, 983, 5327, 645, 1027, 13, 51198], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1683, "seek": 19684, "start": 6113.52, "end": 6114.48, "text": " That's a big one.", "tokens": [51198, 663, 311, 257, 955, 472, 13, 51246], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1684, "seek": 19684, "start": 6114.48, "end": 6116.8, "text": " They're complaining that really the criteria for approvals", "tokens": [51246, 814, 434, 20740, 300, 534, 264, 11101, 337, 2075, 19778, 51362], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1685, "seek": 19684, "start": 6116.8, "end": 6119.44, "text": " just is unclear, like how do you decide", "tokens": [51362, 445, 307, 25636, 11, 411, 577, 360, 291, 4536, 51494], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1686, "seek": 19684, "start": 6119.44, "end": 6121.56, "text": " which requests get approved and which get denied?", "tokens": [51494, 597, 12475, 483, 10826, 293, 597, 483, 17774, 30, 51600], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1687, "seek": 19684, "start": 6121.56, "end": 6122.96, "text": " They're asking for visibility", "tokens": [51600, 814, 434, 3365, 337, 19883, 51670], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1688, "seek": 19684, "start": 6122.96, "end": 6124.32, "text": " and that transparency and that,", "tokens": [51670, 293, 300, 17131, 293, 300, 11, 51738], "temperature": 0.0, "avg_logprob": -0.12607674609940006, "compression_ratio": 1.728862973760933, "no_speech_prob": 0.0006634997553192079}, {"id": 1689, "seek": 22432, "start": 6124.36, "end": 6126.48, "text": " including in particular, how security", "tokens": [50366, 3009, 294, 1729, 11, 577, 3825, 50472], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1690, "seek": 22432, "start": 6126.48, "end": 6128.72, "text": " and economic benefits are being traded off.", "tokens": [50472, 293, 4836, 5311, 366, 885, 27157, 766, 13, 50584], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1691, "seek": 22432, "start": 6128.72, "end": 6130.36, "text": " And so there's basically a call for briefings", "tokens": [50584, 400, 370, 456, 311, 1936, 257, 818, 337, 5353, 1109, 50666], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1692, "seek": 22432, "start": 6130.36, "end": 6131.2, "text": " and all that stuff.", "tokens": [50666, 293, 439, 300, 1507, 13, 50708], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1693, "seek": 22432, "start": 6131.2, "end": 6134.16, "text": " They're also calling out the fact that they kind of seem", "tokens": [50708, 814, 434, 611, 5141, 484, 264, 1186, 300, 436, 733, 295, 1643, 50856], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1694, "seek": 22432, "start": 6134.16, "end": 6136.84, "text": " to be getting a bunch of ad hoc or incomplete briefings.", "tokens": [50856, 281, 312, 1242, 257, 3840, 295, 614, 16708, 420, 31709, 5353, 1109, 13, 50990], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1695, "seek": 22432, "start": 6136.84, "end": 6139.12, "text": " There's no like real time congressional oversight.", "tokens": [50990, 821, 311, 572, 411, 957, 565, 32962, 29146, 13, 51104], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1696, "seek": 22432, "start": 6139.12, "end": 6141.28, "text": " It's almost like Congress is responding", "tokens": [51104, 467, 311, 1920, 411, 6426, 307, 16670, 51212], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1697, "seek": 22432, "start": 6141.28, "end": 6143.2, "text": " to these announcements after the fact.", "tokens": [51212, 281, 613, 23785, 934, 264, 1186, 13, 51308], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1698, "seek": 22432, "start": 6143.2, "end": 6145.8, "text": " And once things have been shipped or committed to,", "tokens": [51308, 400, 1564, 721, 362, 668, 25312, 420, 7784, 281, 11, 51438], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1699, "seek": 22432, "start": 6145.8, "end": 6147.24, "text": " and they're basically asking, you know,", "tokens": [51438, 293, 436, 434, 1936, 3365, 11, 291, 458, 11, 51510], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1700, "seek": 22432, "start": 6147.24, "end": 6149.88, "text": " you should be giving us advanced notice for this.", "tokens": [51510, 291, 820, 312, 2902, 505, 7339, 3449, 337, 341, 13, 51642], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1701, "seek": 22432, "start": 6149.88, "end": 6151.28, "text": " Basically, a bunch of things like this,", "tokens": [51642, 8537, 11, 257, 3840, 295, 721, 411, 341, 11, 51712], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1702, "seek": 22432, "start": 6151.28, "end": 6154.24, "text": " they're specifically asking for a geopolitical analysis", "tokens": [51712, 436, 434, 4682, 3365, 337, 257, 46615, 804, 5215, 51860], "temperature": 0.0, "avg_logprob": -0.13107599976000847, "compression_ratio": 1.8097982708933718, "no_speech_prob": 0.000525175011716783}, {"id": 1703, "seek": 25424, "start": 6154.24, "end": 6157.52, "text": " about how proposed exports of chips", "tokens": [50364, 466, 577, 10348, 31428, 295, 11583, 50528], "temperature": 0.0, "avg_logprob": -0.1584061810418087, "compression_ratio": 1.6153846153846154, "no_speech_prob": 0.00017729196406435221}, {"id": 1704, "seek": 25424, "start": 6157.52, "end": 6161.28, "text": " could support China's military or domestic AI capabilities", "tokens": [50528, 727, 1406, 3533, 311, 4632, 420, 10939, 7318, 10862, 50716], "temperature": 0.0, "avg_logprob": -0.1584061810418087, "compression_ratio": 1.6153846153846154, "no_speech_prob": 0.00017729196406435221}, {"id": 1705, "seek": 25424, "start": 6161.28, "end": 6163.84, "text": " in looking for reaction to US allies, that sort of thing.", "tokens": [50716, 294, 1237, 337, 5480, 281, 2546, 14719, 11, 300, 1333, 295, 551, 13, 50844], "temperature": 0.0, "avg_logprob": -0.1584061810418087, "compression_ratio": 1.6153846153846154, "no_speech_prob": 0.00017729196406435221}, {"id": 1706, "seek": 25424, "start": 6163.84, "end": 6166.32, "text": " So this makes a lot of sense as pushback.", "tokens": [50844, 407, 341, 1669, 257, 688, 295, 2020, 382, 2944, 3207, 13, 50968], "temperature": 0.0, "avg_logprob": -0.1584061810418087, "compression_ratio": 1.6153846153846154, "no_speech_prob": 0.00017729196406435221}, {"id": 1707, "seek": 25424, "start": 6166.32, "end": 6168.52, "text": " You know, we have heard a lot of the sudden announcements", "tokens": [50968, 509, 458, 11, 321, 362, 2198, 257, 688, 295, 264, 3990, 23785, 51078], "temperature": 0.0, "avg_logprob": -0.1584061810418087, "compression_ratio": 1.6153846153846154, "no_speech_prob": 0.00017729196406435221}, {"id": 1708, "seek": 25424, "start": 6168.52, "end": 6170.8, "text": " about H200 is, you know, a shipable.", "tokens": [51078, 466, 389, 7629, 307, 11, 291, 458, 11, 257, 5374, 712, 13, 51192], "temperature": 0.0, "avg_logprob": -0.1584061810418087, "compression_ratio": 1.6153846153846154, "no_speech_prob": 0.00017729196406435221}, {"id": 1709, "seek": 25424, "start": 6170.8, "end": 6173.6, "text": " Now it's not JK, now it is.", "tokens": [51192, 823, 309, 311, 406, 35973, 11, 586, 309, 307, 13, 51332], "temperature": 0.0, "avg_logprob": -0.1584061810418087, "compression_ratio": 1.6153846153846154, "no_speech_prob": 0.00017729196406435221}, {"id": 1710, "seek": 25424, "start": 6173.6, "end": 6176.92, "text": " So asking for some sort of legislative oversight,", "tokens": [51332, 407, 3365, 337, 512, 1333, 295, 21331, 29146, 11, 51498], "temperature": 0.0, "avg_logprob": -0.1584061810418087, "compression_ratio": 1.6153846153846154, "no_speech_prob": 0.00017729196406435221}, {"id": 1711, "seek": 25424, "start": 6176.92, "end": 6180.0, "text": " which is what's being asked for here seems reasonably sensible.", "tokens": [51498, 597, 307, 437, 311, 885, 2351, 337, 510, 2544, 23551, 25380, 13, 51652], "temperature": 0.0, "avg_logprob": -0.1584061810418087, "compression_ratio": 1.6153846153846154, "no_speech_prob": 0.00017729196406435221}, {"id": 1712, "seek": 25424, "start": 6180.0, "end": 6181.52, "text": " And I think this is a bipartisan issue.", "tokens": [51652, 400, 286, 519, 341, 307, 257, 31954, 2734, 13, 51728], "temperature": 0.0, "avg_logprob": -0.1584061810418087, "compression_ratio": 1.6153846153846154, "no_speech_prob": 0.00017729196406435221}, {"id": 1713, "seek": 25424, "start": 6181.52, "end": 6182.76, "text": " I don't think that there's like,", "tokens": [51728, 286, 500, 380, 519, 300, 456, 311, 411, 11, 51790], "temperature": 0.0, "avg_logprob": -0.1584061810418087, "compression_ratio": 1.6153846153846154, "no_speech_prob": 0.00017729196406435221}, {"id": 1714, "seek": 28276, "start": 6182.8, "end": 6184.48, "text": " it's Democrats who are leading the charge here", "tokens": [50366, 309, 311, 12217, 567, 366, 5775, 264, 4602, 510, 50450], "temperature": 0.0, "avg_logprob": -0.13552869183045846, "compression_ratio": 1.6875, "no_speech_prob": 0.0018579375464469194}, {"id": 1715, "seek": 28276, "start": 6184.48, "end": 6188.04, "text": " just because it's, you know, it's a Republican house in Senate.", "tokens": [50450, 445, 570, 309, 311, 11, 291, 458, 11, 309, 311, 257, 10937, 1782, 294, 9867, 13, 50628], "temperature": 0.0, "avg_logprob": -0.13552869183045846, "compression_ratio": 1.6875, "no_speech_prob": 0.0018579375464469194}, {"id": 1716, "seek": 28276, "start": 6188.04, "end": 6191.0, "text": " But you've certainly got this bipartisan agreement", "tokens": [50628, 583, 291, 600, 3297, 658, 341, 31954, 8106, 50776], "temperature": 0.0, "avg_logprob": -0.13552869183045846, "compression_ratio": 1.6875, "no_speech_prob": 0.0018579375464469194}, {"id": 1717, "seek": 28276, "start": 6191.0, "end": 6193.76, "text": " on the vast majority of export control policy.", "tokens": [50776, 322, 264, 8369, 6286, 295, 10725, 1969, 3897, 13, 50914], "temperature": 0.0, "avg_logprob": -0.13552869183045846, "compression_ratio": 1.6875, "no_speech_prob": 0.0018579375464469194}, {"id": 1718, "seek": 28276, "start": 6193.76, "end": 6195.6, "text": " And so large part of what's being driven at here", "tokens": [50914, 400, 370, 2416, 644, 295, 437, 311, 885, 9555, 412, 510, 51006], "temperature": 0.0, "avg_logprob": -0.13552869183045846, "compression_ratio": 1.6875, "no_speech_prob": 0.0018579375464469194}, {"id": 1719, "seek": 28276, "start": 6195.6, "end": 6197.16, "text": " and kind of motivating the response", "tokens": [51006, 293, 733, 295, 41066, 264, 4134, 51084], "temperature": 0.0, "avg_logprob": -0.13552869183045846, "compression_ratio": 1.6875, "no_speech_prob": 0.0018579375464469194}, {"id": 1720, "seek": 28276, "start": 6197.16, "end": 6198.68, "text": " and asking for more transparency.", "tokens": [51084, 293, 3365, 337, 544, 17131, 13, 51160], "temperature": 0.0, "avg_logprob": -0.13552869183045846, "compression_ratio": 1.6875, "no_speech_prob": 0.0018579375464469194}, {"id": 1721, "seek": 28276, "start": 6198.68, "end": 6202.4, "text": " And that is it for the safety policy section.", "tokens": [51160, 400, 300, 307, 309, 337, 264, 4514, 3897, 3541, 13, 51346], "temperature": 0.0, "avg_logprob": -0.13552869183045846, "compression_ratio": 1.6875, "no_speech_prob": 0.0018579375464469194}, {"id": 1722, "seek": 28276, "start": 6202.4, "end": 6205.6, "text": " We've got a couple more papers in research and advancements,", "tokens": [51346, 492, 600, 658, 257, 1916, 544, 10577, 294, 2132, 293, 7295, 1117, 11, 51506], "temperature": 0.0, "avg_logprob": -0.13552869183045846, "compression_ratio": 1.6875, "no_speech_prob": 0.0018579375464469194}, {"id": 1723, "seek": 28276, "start": 6205.6, "end": 6207.72, "text": " but we actually have to do the same thing as last week", "tokens": [51506, 457, 321, 767, 362, 281, 360, 264, 912, 551, 382, 1036, 1243, 51612], "temperature": 0.0, "avg_logprob": -0.13552869183045846, "compression_ratio": 1.6875, "no_speech_prob": 0.0018579375464469194}, {"id": 1724, "seek": 28276, "start": 6207.72, "end": 6210.92, "text": " where I got to get going and Jeremy has something.", "tokens": [51612, 689, 286, 658, 281, 483, 516, 293, 17809, 575, 746, 13, 51772], "temperature": 0.0, "avg_logprob": -0.13552869183045846, "compression_ratio": 1.6875, "no_speech_prob": 0.0018579375464469194}, {"id": 1725, "seek": 31092, "start": 6210.92, "end": 6214.52, "text": " So I think Jeremy will follow up with recording,", "tokens": [50364, 407, 286, 519, 17809, 486, 1524, 493, 365, 6613, 11, 50544], "temperature": 0.0, "avg_logprob": -0.20521346166843676, "compression_ratio": 1.701067615658363, "no_speech_prob": 0.017357898876070976}, {"id": 1726, "seek": 31092, "start": 6214.52, "end": 6219.32, "text": " covering these papers and as much depth as you want.", "tokens": [50544, 10322, 613, 10577, 293, 382, 709, 7161, 382, 291, 528, 13, 50784], "temperature": 0.0, "avg_logprob": -0.20521346166843676, "compression_ratio": 1.701067615658363, "no_speech_prob": 0.017357898876070976}, {"id": 1727, "seek": 31092, "start": 6219.32, "end": 6222.12, "text": " I promise it will be an hour of this time.", "tokens": [50784, 286, 6228, 309, 486, 312, 364, 1773, 295, 341, 565, 13, 50924], "temperature": 0.0, "avg_logprob": -0.20521346166843676, "compression_ratio": 1.701067615658363, "no_speech_prob": 0.017357898876070976}, {"id": 1728, "seek": 31092, "start": 6222.12, "end": 6225.12, "text": " All right, Jeremy here for the handful of papers", "tokens": [50924, 1057, 558, 11, 17809, 510, 337, 264, 16458, 295, 10577, 51074], "temperature": 0.0, "avg_logprob": -0.20521346166843676, "compression_ratio": 1.701067615658363, "no_speech_prob": 0.017357898876070976}, {"id": 1729, "seek": 31092, "start": 6225.12, "end": 6226.64, "text": " that I'll cover on my own without Andre,", "tokens": [51074, 300, 286, 603, 2060, 322, 452, 1065, 1553, 20667, 11, 51150], "temperature": 0.0, "avg_logprob": -0.20521346166843676, "compression_ratio": 1.701067615658363, "no_speech_prob": 0.017357898876070976}, {"id": 1730, "seek": 31092, "start": 6226.64, "end": 6228.64, "text": " but to run off, I also circled back", "tokens": [51150, 457, 281, 1190, 766, 11, 286, 611, 3510, 1493, 646, 51250], "temperature": 0.0, "avg_logprob": -0.20521346166843676, "compression_ratio": 1.701067615658363, "no_speech_prob": 0.017357898876070976}, {"id": 1731, "seek": 31092, "start": 6228.64, "end": 6230.12, "text": " to the lighting's a little bit different here", "tokens": [51250, 281, 264, 9577, 311, 257, 707, 857, 819, 510, 51324], "temperature": 0.0, "avg_logprob": -0.20521346166843676, "compression_ratio": 1.701067615658363, "no_speech_prob": 0.017357898876070976}, {"id": 1732, "seek": 31092, "start": 6230.12, "end": 6231.92, "text": " and everything's a little bit off.", "tokens": [51324, 293, 1203, 311, 257, 707, 857, 766, 13, 51414], "temperature": 0.0, "avg_logprob": -0.20521346166843676, "compression_ratio": 1.701067615658363, "no_speech_prob": 0.017357898876070976}, {"id": 1733, "seek": 31092, "start": 6231.92, "end": 6233.24, "text": " So two papers I want to cover.", "tokens": [51414, 407, 732, 10577, 286, 528, 281, 2060, 13, 51480], "temperature": 0.0, "avg_logprob": -0.20521346166843676, "compression_ratio": 1.701067615658363, "no_speech_prob": 0.017357898876070976}, {"id": 1734, "seek": 31092, "start": 6233.24, "end": 6236.4, "text": " One here was called attention residuals.", "tokens": [51480, 1485, 510, 390, 1219, 3202, 27980, 82, 13, 51638], "temperature": 0.0, "avg_logprob": -0.20521346166843676, "compression_ratio": 1.701067615658363, "no_speech_prob": 0.017357898876070976}, {"id": 1735, "seek": 31092, "start": 6236.4, "end": 6239.8, "text": " And this is actually like, this is one of those papers", "tokens": [51638, 400, 341, 307, 767, 411, 11, 341, 307, 472, 295, 729, 10577, 51808], "temperature": 0.0, "avg_logprob": -0.20521346166843676, "compression_ratio": 1.701067615658363, "no_speech_prob": 0.017357898876070976}, {"id": 1736, "seek": 33980, "start": 6239.84, "end": 6242.44, "text": " that I think might matter.", "tokens": [50366, 300, 286, 519, 1062, 1871, 13, 50496], "temperature": 0.0, "avg_logprob": -0.19251574448176792, "compression_ratio": 1.632, "no_speech_prob": 0.002061157487332821}, {"id": 1737, "seek": 33980, "start": 6242.44, "end": 6245.64, "text": " You can never be sure, but it's addressing something", "tokens": [50496, 509, 393, 1128, 312, 988, 11, 457, 309, 311, 14329, 746, 50656], "temperature": 0.0, "avg_logprob": -0.19251574448176792, "compression_ratio": 1.632, "no_speech_prob": 0.002061157487332821}, {"id": 1738, "seek": 33980, "start": 6245.64, "end": 6247.2, "text": " that really feels quite fundamental.", "tokens": [50656, 300, 534, 3417, 1596, 8088, 13, 50734], "temperature": 0.0, "avg_logprob": -0.19251574448176792, "compression_ratio": 1.632, "no_speech_prob": 0.002061157487332821}, {"id": 1739, "seek": 33980, "start": 6247.2, "end": 6251.36, "text": " So when we talk about residual connections in a transformer,", "tokens": [50734, 407, 562, 321, 751, 466, 27980, 9271, 294, 257, 31782, 11, 50942], "temperature": 0.0, "avg_logprob": -0.19251574448176792, "compression_ratio": 1.632, "no_speech_prob": 0.002061157487332821}, {"id": 1740, "seek": 33980, "start": 6251.36, "end": 6255.2, "text": " this is how all kind of vanilla transformers today work.", "tokens": [50942, 341, 307, 577, 439, 733, 295, 17528, 4088, 433, 965, 589, 13, 51134], "temperature": 0.0, "avg_logprob": -0.19251574448176792, "compression_ratio": 1.632, "no_speech_prob": 0.002061157487332821}, {"id": 1741, "seek": 33980, "start": 6255.2, "end": 6258.76, "text": " You feed an input to a layer, right?", "tokens": [51134, 509, 3154, 364, 4846, 281, 257, 4583, 11, 558, 30, 51312], "temperature": 0.0, "avg_logprob": -0.19251574448176792, "compression_ratio": 1.632, "no_speech_prob": 0.002061157487332821}, {"id": 1742, "seek": 33980, "start": 6258.76, "end": 6263.16, "text": " And that input is going to get split into two,", "tokens": [51312, 400, 300, 4846, 307, 516, 281, 483, 7472, 666, 732, 11, 51532], "temperature": 0.0, "avg_logprob": -0.19251574448176792, "compression_ratio": 1.632, "no_speech_prob": 0.002061157487332821}, {"id": 1743, "seek": 33980, "start": 6263.16, "end": 6266.0, "text": " kind of you think of like two forked branches basically", "tokens": [51532, 733, 295, 291, 519, 295, 411, 732, 17716, 292, 14770, 1936, 51674], "temperature": 0.0, "avg_logprob": -0.19251574448176792, "compression_ratio": 1.632, "no_speech_prob": 0.002061157487332821}, {"id": 1744, "seek": 33980, "start": 6266.0, "end": 6268.48, "text": " at each layer of the transformer.", "tokens": [51674, 412, 1184, 4583, 295, 264, 31782, 13, 51798], "temperature": 0.0, "avg_logprob": -0.19251574448176792, "compression_ratio": 1.632, "no_speech_prob": 0.002061157487332821}, {"id": 1745, "seek": 36848, "start": 6268.48, "end": 6271.24, "text": " So in one branch, you're gonna apply some kind of transformation", "tokens": [50364, 407, 294, 472, 9819, 11, 291, 434, 799, 3079, 512, 733, 295, 9887, 50502], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1746, "seek": 36848, "start": 6271.24, "end": 6273.56, "text": " like a, you know, a weight matrix or something", "tokens": [50502, 411, 257, 11, 291, 458, 11, 257, 3364, 8141, 420, 746, 50618], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1747, "seek": 36848, "start": 6273.56, "end": 6275.88, "text": " to like update the input.", "tokens": [50618, 281, 411, 5623, 264, 4846, 13, 50734], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1748, "seek": 36848, "start": 6275.88, "end": 6278.6, "text": " In the other branch, you're gonna really do nothing", "tokens": [50734, 682, 264, 661, 9819, 11, 291, 434, 799, 534, 360, 1825, 50870], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1749, "seek": 36848, "start": 6278.6, "end": 6280.48, "text": " and you're gonna merge those two branches together.", "tokens": [50870, 293, 291, 434, 799, 22183, 729, 732, 14770, 1214, 13, 50964], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1750, "seek": 36848, "start": 6280.48, "end": 6281.64, "text": " And the branch that does nothing", "tokens": [50964, 400, 264, 9819, 300, 775, 1825, 51022], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1751, "seek": 36848, "start": 6281.64, "end": 6284.04, "text": " is basically just passing the input right back in", "tokens": [51022, 307, 1936, 445, 8437, 264, 4846, 558, 646, 294, 51142], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1752, "seek": 36848, "start": 6284.04, "end": 6286.56, "text": " and you're gonna add it to the modifications", "tokens": [51142, 293, 291, 434, 799, 909, 309, 281, 264, 26881, 51268], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1753, "seek": 36848, "start": 6286.56, "end": 6289.6, "text": " that you applied, the weight matrix times the input", "tokens": [51268, 300, 291, 6456, 11, 264, 3364, 8141, 1413, 264, 4846, 51420], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1754, "seek": 36848, "start": 6289.6, "end": 6291.88, "text": " to form the output of that layer.", "tokens": [51420, 281, 1254, 264, 5598, 295, 300, 4583, 13, 51534], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1755, "seek": 36848, "start": 6291.88, "end": 6294.4, "text": " So your output is the initial input", "tokens": [51534, 407, 428, 5598, 307, 264, 5883, 4846, 51660], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1756, "seek": 36848, "start": 6294.4, "end": 6298.12, "text": " plus a transformation applied to the initial input.", "tokens": [51660, 1804, 257, 9887, 6456, 281, 264, 5883, 4846, 13, 51846], "temperature": 0.0, "avg_logprob": -0.12158695967109115, "compression_ratio": 2.011111111111111, "no_speech_prob": 0.00038199921254999936}, {"id": 1757, "seek": 39812, "start": 6298.12, "end": 6300.52, "text": " Now, one thing that this does is it means", "tokens": [50364, 823, 11, 472, 551, 300, 341, 775, 307, 309, 1355, 50484], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1758, "seek": 39812, "start": 6300.52, "end": 6302.88, "text": " at every layer, you're basically adding more.", "tokens": [50484, 412, 633, 4583, 11, 291, 434, 1936, 5127, 544, 13, 50602], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1759, "seek": 39812, "start": 6302.88, "end": 6304.88, "text": " You're adding a transformation on the input", "tokens": [50602, 509, 434, 5127, 257, 9887, 322, 264, 4846, 50702], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1760, "seek": 39812, "start": 6304.88, "end": 6306.56, "text": " to that layer to the output.", "tokens": [50702, 281, 300, 4583, 281, 264, 5598, 13, 50786], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1761, "seek": 39812, "start": 6306.56, "end": 6308.12, "text": " And so you actually get larger and larger", "tokens": [50786, 400, 370, 291, 767, 483, 4833, 293, 4833, 50864], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1762, "seek": 39812, "start": 6308.12, "end": 6310.48, "text": " and larger kind of like ballooning values", "tokens": [50864, 293, 4833, 733, 295, 411, 16994, 278, 4190, 50982], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1763, "seek": 39812, "start": 6310.48, "end": 6312.44, "text": " for these residual activations", "tokens": [50982, 337, 613, 27980, 2430, 763, 51080], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1764, "seek": 39812, "start": 6312.44, "end": 6314.44, "text": " as they work their way down the network.", "tokens": [51080, 382, 436, 589, 641, 636, 760, 264, 3209, 13, 51180], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1765, "seek": 39812, "start": 6314.44, "end": 6316.08, "text": " But that's basically the idea.", "tokens": [51180, 583, 300, 311, 1936, 264, 1558, 13, 51262], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1766, "seek": 39812, "start": 6316.08, "end": 6319.0, "text": " The whole philosophy behind this is people find", "tokens": [51262, 440, 1379, 10675, 2261, 341, 307, 561, 915, 51408], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1767, "seek": 39812, "start": 6319.0, "end": 6322.4, "text": " that when you go to do a back propagation during training,", "tokens": [51408, 300, 562, 291, 352, 281, 360, 257, 646, 38377, 1830, 3097, 11, 51578], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1768, "seek": 39812, "start": 6322.4, "end": 6325.52, "text": " basically unless you send the input to a given layer", "tokens": [51578, 1936, 5969, 291, 2845, 264, 4846, 281, 257, 2212, 4583, 51734], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1769, "seek": 39812, "start": 6325.52, "end": 6327.16, "text": " to its output in this way,", "tokens": [51734, 281, 1080, 5598, 294, 341, 636, 11, 51816], "temperature": 0.0, "avg_logprob": -0.12911079225748995, "compression_ratio": 1.8101694915254238, "no_speech_prob": 7.278364500962198e-05}, {"id": 1770, "seek": 42716, "start": 6327.16, "end": 6329.68, "text": " the model will kind of like start to forget", "tokens": [50364, 264, 2316, 486, 733, 295, 411, 722, 281, 2870, 50490], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1771, "seek": 42716, "start": 6329.68, "end": 6331.28, "text": " about the input to that layer.", "tokens": [50490, 466, 264, 4846, 281, 300, 4583, 13, 50570], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1772, "seek": 42716, "start": 6331.28, "end": 6334.24, "text": " That information gets lost through all of the additions", "tokens": [50570, 663, 1589, 2170, 2731, 807, 439, 295, 264, 35113, 50718], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1773, "seek": 42716, "start": 6334.24, "end": 6337.2, "text": " of like these transformations", "tokens": [50718, 295, 411, 613, 34852, 50866], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1774, "seek": 42716, "start": 6337.2, "end": 6339.0, "text": " over the layers of the network", "tokens": [50866, 670, 264, 7914, 295, 264, 3209, 50956], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1775, "seek": 42716, "start": 6339.0, "end": 6341.52, "text": " cause earlier transformations to get forgotten.", "tokens": [50956, 3082, 3071, 34852, 281, 483, 11832, 13, 51082], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1776, "seek": 42716, "start": 6341.52, "end": 6343.28, "text": " And so you're basically trying to re-inject,", "tokens": [51082, 400, 370, 291, 434, 1936, 1382, 281, 319, 12, 259, 1020, 11, 51170], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1777, "seek": 42716, "start": 6343.28, "end": 6345.88, "text": " remind the model like, hey, this is what the previous layer", "tokens": [51170, 4160, 264, 2316, 411, 11, 4177, 11, 341, 307, 437, 264, 3894, 4583, 51300], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1778, "seek": 42716, "start": 6345.88, "end": 6347.56, "text": " said by the way, don't forget that.", "tokens": [51300, 848, 538, 264, 636, 11, 500, 380, 2870, 300, 13, 51384], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1779, "seek": 42716, "start": 6347.56, "end": 6350.28, "text": " But yes, also you can tack on your own correction,", "tokens": [51384, 583, 2086, 11, 611, 291, 393, 9426, 322, 428, 1065, 19984, 11, 51520], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1780, "seek": 42716, "start": 6350.28, "end": 6351.28, "text": " your own update, right?", "tokens": [51520, 428, 1065, 5623, 11, 558, 30, 51570], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1781, "seek": 42716, "start": 6351.28, "end": 6353.84, "text": " So that's kind of how standard residual connections work.", "tokens": [51570, 407, 300, 311, 733, 295, 577, 3832, 27980, 9271, 589, 13, 51698], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1782, "seek": 42716, "start": 6353.84, "end": 6355.92, "text": " And there's a whole approach to doing this", "tokens": [51698, 400, 456, 311, 257, 1379, 3109, 281, 884, 341, 51802], "temperature": 0.0, "avg_logprob": -0.12864575269338968, "compression_ratio": 1.765079365079365, "no_speech_prob": 6.423080048989505e-05}, {"id": 1783, "seek": 45592, "start": 6355.92, "end": 6359.48, "text": " and when and how you normalize these sums", "tokens": [50364, 293, 562, 293, 577, 291, 2710, 1125, 613, 34499, 50542], "temperature": 0.0, "avg_logprob": -0.1944242429101106, "compression_ratio": 1.7555555555555555, "no_speech_prob": 0.0004896061727777123}, {"id": 1784, "seek": 45592, "start": 6359.48, "end": 6361.24, "text": " and all that stuff that we're not going to get into.", "tokens": [50542, 293, 439, 300, 1507, 300, 321, 434, 406, 516, 281, 483, 666, 13, 50630], "temperature": 0.0, "avg_logprob": -0.1944242429101106, "compression_ratio": 1.7555555555555555, "no_speech_prob": 0.0004896061727777123}, {"id": 1785, "seek": 45592, "start": 6361.24, "end": 6363.48, "text": " The key thing here is though,", "tokens": [50630, 440, 2141, 551, 510, 307, 1673, 11, 50742], "temperature": 0.0, "avg_logprob": -0.1944242429101106, "compression_ratio": 1.7555555555555555, "no_speech_prob": 0.0004896061727777123}, {"id": 1786, "seek": 45592, "start": 6363.48, "end": 6368.48, "text": " that this process sort of waits every layer kind of equivalently.", "tokens": [50742, 300, 341, 1399, 1333, 295, 40597, 633, 4583, 733, 295, 9052, 2276, 13, 50992], "temperature": 0.0, "avg_logprob": -0.1944242429101106, "compression_ratio": 1.7555555555555555, "no_speech_prob": 0.0004896061727777123}, {"id": 1787, "seek": 45592, "start": 6371.8, "end": 6373.0, "text": " So if you think about it,", "tokens": [51158, 407, 498, 291, 519, 466, 309, 11, 51218], "temperature": 0.0, "avg_logprob": -0.1944242429101106, "compression_ratio": 1.7555555555555555, "no_speech_prob": 0.0004896061727777123}, {"id": 1788, "seek": 45592, "start": 6373.0, "end": 6375.2, "text": " layer one is going to take its own input", "tokens": [51218, 4583, 472, 307, 516, 281, 747, 1080, 1065, 4846, 51328], "temperature": 0.0, "avg_logprob": -0.1944242429101106, "compression_ratio": 1.7555555555555555, "no_speech_prob": 0.0004896061727777123}, {"id": 1789, "seek": 45592, "start": 6375.2, "end": 6378.44, "text": " and then add that to its own input times its contribution.", "tokens": [51328, 293, 550, 909, 300, 281, 1080, 1065, 4846, 1413, 1080, 13150, 13, 51490], "temperature": 0.0, "avg_logprob": -0.1944242429101106, "compression_ratio": 1.7555555555555555, "no_speech_prob": 0.0004896061727777123}, {"id": 1790, "seek": 45592, "start": 6378.44, "end": 6380.08, "text": " It's weight matrix, so it's going to use", "tokens": [51490, 467, 311, 3364, 8141, 11, 370, 309, 311, 516, 281, 764, 51572], "temperature": 0.0, "avg_logprob": -0.1944242429101106, "compression_ratio": 1.7555555555555555, "no_speech_prob": 0.0004896061727777123}, {"id": 1791, "seek": 45592, "start": 6380.08, "end": 6382.08, "text": " to modify the input and that'll be the output", "tokens": [51572, 281, 16927, 264, 4846, 293, 300, 603, 312, 264, 5598, 51672], "temperature": 0.0, "avg_logprob": -0.1944242429101106, "compression_ratio": 1.7555555555555555, "no_speech_prob": 0.0004896061727777123}, {"id": 1792, "seek": 45592, "start": 6382.08, "end": 6383.28, "text": " that goes to layer two, right?", "tokens": [51672, 300, 1709, 281, 4583, 732, 11, 558, 30, 51732], "temperature": 0.0, "avg_logprob": -0.1944242429101106, "compression_ratio": 1.7555555555555555, "no_speech_prob": 0.0004896061727777123}, {"id": 1793, "seek": 45592, "start": 6383.28, "end": 6384.6, "text": " And then layer two does the same thing.", "tokens": [51732, 400, 550, 4583, 732, 775, 264, 912, 551, 13, 51798], "temperature": 0.0, "avg_logprob": -0.1944242429101106, "compression_ratio": 1.7555555555555555, "no_speech_prob": 0.0004896061727777123}, {"id": 1794, "seek": 48460, "start": 6384.92, "end": 6388.4, "text": " And each layer is just kind of like adding its own contribution", "tokens": [50380, 400, 1184, 4583, 307, 445, 733, 295, 411, 5127, 1080, 1065, 13150, 50554], "temperature": 0.0, "avg_logprob": -0.15073184531275965, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0003166214155498892}, {"id": 1795, "seek": 48460, "start": 6388.4, "end": 6393.4, "text": " in this very uniform accumulation of information down the line.", "tokens": [50554, 294, 341, 588, 9452, 35647, 295, 1589, 760, 264, 1622, 13, 50804], "temperature": 0.0, "avg_logprob": -0.15073184531275965, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0003166214155498892}, {"id": 1796, "seek": 48460, "start": 6393.88, "end": 6397.08, "text": " And there's no sense in which layer three", "tokens": [50828, 400, 456, 311, 572, 2020, 294, 597, 4583, 1045, 50988], "temperature": 0.0, "avg_logprob": -0.15073184531275965, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0003166214155498892}, {"id": 1797, "seek": 48460, "start": 6397.08, "end": 6399.56, "text": " is any more important than layer 17.", "tokens": [50988, 307, 604, 544, 1021, 813, 4583, 3282, 13, 51112], "temperature": 0.0, "avg_logprob": -0.15073184531275965, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0003166214155498892}, {"id": 1798, "seek": 48460, "start": 6399.56, "end": 6402.12, "text": " Now the thing is in some cases, that will be the case, right?", "tokens": [51112, 823, 264, 551, 307, 294, 512, 3331, 11, 300, 486, 312, 264, 1389, 11, 558, 30, 51240], "temperature": 0.0, "avg_logprob": -0.15073184531275965, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0003166214155498892}, {"id": 1799, "seek": 48460, "start": 6402.12, "end": 6405.04, "text": " You will have cases where one layer is actually more important", "tokens": [51240, 509, 486, 362, 3331, 689, 472, 4583, 307, 767, 544, 1021, 51386], "temperature": 0.0, "avg_logprob": -0.15073184531275965, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0003166214155498892}, {"id": 1800, "seek": 48460, "start": 6405.04, "end": 6407.64, "text": " for this particular prompt than another, right?", "tokens": [51386, 337, 341, 1729, 12391, 813, 1071, 11, 558, 30, 51516], "temperature": 0.0, "avg_logprob": -0.15073184531275965, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0003166214155498892}, {"id": 1801, "seek": 48460, "start": 6407.64, "end": 6409.76, "text": " It's just like, you know, maybe this is a layer", "tokens": [51516, 467, 311, 445, 411, 11, 291, 458, 11, 1310, 341, 307, 257, 4583, 51622], "temperature": 0.0, "avg_logprob": -0.15073184531275965, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0003166214155498892}, {"id": 1802, "seek": 48460, "start": 6409.76, "end": 6412.36, "text": " that tends to worry about grammar rules or syntax,", "tokens": [51622, 300, 12258, 281, 3292, 466, 22317, 4474, 420, 28431, 11, 51752], "temperature": 0.0, "avg_logprob": -0.15073184531275965, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0003166214155498892}, {"id": 1803, "seek": 51236, "start": 6412.36, "end": 6415.32, "text": " something very basic that you would tend to find in an earlier layer", "tokens": [50364, 746, 588, 3875, 300, 291, 576, 3928, 281, 915, 294, 364, 3071, 4583, 50512], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1804, "seek": 51236, "start": 6415.32, "end": 6418.0, "text": " and maybe this is, you know, you're on a token", "tokens": [50512, 293, 1310, 341, 307, 11, 291, 458, 11, 291, 434, 322, 257, 14862, 50646], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1805, "seek": 51236, "start": 6418.0, "end": 6420.8, "text": " that involves pluralization or some grammar rule, right?", "tokens": [50646, 300, 11626, 25377, 2144, 420, 512, 22317, 4978, 11, 558, 30, 50786], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1806, "seek": 51236, "start": 6420.8, "end": 6421.72, "text": " Where it's really relevant.", "tokens": [50786, 2305, 309, 311, 534, 7340, 13, 50832], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1807, "seek": 51236, "start": 6421.72, "end": 6423.84, "text": " So you'd want to be, you would almost want to be able", "tokens": [50832, 407, 291, 1116, 528, 281, 312, 11, 291, 576, 1920, 528, 281, 312, 1075, 50938], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1808, "seek": 51236, "start": 6423.84, "end": 6426.4400000000005, "text": " to pay more attention to, that's the hint,", "tokens": [50938, 281, 1689, 544, 3202, 281, 11, 300, 311, 264, 12075, 11, 51068], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1809, "seek": 51236, "start": 6426.4400000000005, "end": 6429.76, "text": " attend more to a given layer than another.", "tokens": [51068, 6888, 544, 281, 257, 2212, 4583, 813, 1071, 13, 51234], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1810, "seek": 51236, "start": 6429.76, "end": 6433.88, "text": " And transformers in this or standard residual connection sense", "tokens": [51234, 400, 4088, 433, 294, 341, 420, 3832, 27980, 4984, 2020, 51440], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1811, "seek": 51236, "start": 6433.88, "end": 6434.96, "text": " don't do that, right?", "tokens": [51440, 500, 380, 360, 300, 11, 558, 30, 51494], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1812, "seek": 51236, "start": 6434.96, "end": 6437.4400000000005, "text": " Again, they just kind of all, you know, take their input,", "tokens": [51494, 3764, 11, 436, 445, 733, 295, 439, 11, 291, 458, 11, 747, 641, 4846, 11, 51618], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1813, "seek": 51236, "start": 6437.4400000000005, "end": 6439.36, "text": " they pass it on plus the input times", "tokens": [51618, 436, 1320, 309, 322, 1804, 264, 4846, 1413, 51714], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1814, "seek": 51236, "start": 6439.36, "end": 6441.8, "text": " whatever their contribution is and they pass it down a line.", "tokens": [51714, 2035, 641, 13150, 307, 293, 436, 1320, 309, 760, 257, 1622, 13, 51836], "temperature": 0.0, "avg_logprob": -0.16646984411824134, "compression_ratio": 1.7659574468085106, "no_speech_prob": 0.004098682664334774}, {"id": 1815, "seek": 54180, "start": 6442.12, "end": 6443.8, "text": " There's no, there's no kind of difference", "tokens": [50380, 821, 311, 572, 11, 456, 311, 572, 733, 295, 2649, 50464], "temperature": 0.0, "avg_logprob": -0.11581167689076176, "compression_ratio": 1.796551724137931, "no_speech_prob": 0.0007398673333227634}, {"id": 1816, "seek": 54180, "start": 6443.8, "end": 6447.28, "text": " between the sort of impact of any given layer.", "tokens": [50464, 1296, 264, 1333, 295, 2712, 295, 604, 2212, 4583, 13, 50638], "temperature": 0.0, "avg_logprob": -0.11581167689076176, "compression_ratio": 1.796551724137931, "no_speech_prob": 0.0007398673333227634}, {"id": 1817, "seek": 54180, "start": 6447.28, "end": 6449.2, "text": " There's certainly no intelligent difference.", "tokens": [50638, 821, 311, 3297, 572, 13232, 2649, 13, 50734], "temperature": 0.0, "avg_logprob": -0.11581167689076176, "compression_ratio": 1.796551724137931, "no_speech_prob": 0.0007398673333227634}, {"id": 1818, "seek": 54180, "start": 6449.2, "end": 6451.84, "text": " And this is what this paper is going to try to change, right?", "tokens": [50734, 400, 341, 307, 437, 341, 3035, 307, 516, 281, 853, 281, 1319, 11, 558, 30, 50866], "temperature": 0.0, "avg_logprob": -0.11581167689076176, "compression_ratio": 1.796551724137931, "no_speech_prob": 0.0007398673333227634}, {"id": 1819, "seek": 54180, "start": 6451.84, "end": 6453.8, "text": " They call it attention residuals", "tokens": [50866, 814, 818, 309, 3202, 27980, 82, 50964], "temperature": 0.0, "avg_logprob": -0.11581167689076176, "compression_ratio": 1.796551724137931, "no_speech_prob": 0.0007398673333227634}, {"id": 1820, "seek": 54180, "start": 6453.8, "end": 6456.64, "text": " and they're going to replace this whole fixed accumulation", "tokens": [50964, 293, 436, 434, 516, 281, 7406, 341, 1379, 6806, 35647, 51106], "temperature": 0.0, "avg_logprob": -0.11581167689076176, "compression_ratio": 1.796551724137931, "no_speech_prob": 0.0007398673333227634}, {"id": 1821, "seek": 54180, "start": 6456.64, "end": 6458.96, "text": " of information with softmax attention.", "tokens": [51106, 295, 1589, 365, 2787, 41167, 3202, 13, 51222], "temperature": 0.0, "avg_logprob": -0.11581167689076176, "compression_ratio": 1.796551724137931, "no_speech_prob": 0.0007398673333227634}, {"id": 1822, "seek": 54180, "start": 6458.96, "end": 6461.32, "text": " It's basically just a way of saying, you know,", "tokens": [51222, 467, 311, 1936, 445, 257, 636, 295, 1566, 11, 291, 458, 11, 51340], "temperature": 0.0, "avg_logprob": -0.11581167689076176, "compression_ratio": 1.796551724137931, "no_speech_prob": 0.0007398673333227634}, {"id": 1823, "seek": 54180, "start": 6461.32, "end": 6465.36, "text": " we're going to have a model essentially", "tokens": [51340, 321, 434, 516, 281, 362, 257, 2316, 4476, 51542], "temperature": 0.0, "avg_logprob": -0.11581167689076176, "compression_ratio": 1.796551724137931, "no_speech_prob": 0.0007398673333227634}, {"id": 1824, "seek": 54180, "start": 6465.36, "end": 6469.04, "text": " that looks at our layers and has a kind of attention operation", "tokens": [51542, 300, 1542, 412, 527, 7914, 293, 575, 257, 733, 295, 3202, 6916, 51726], "temperature": 0.0, "avg_logprob": -0.11581167689076176, "compression_ratio": 1.796551724137931, "no_speech_prob": 0.0007398673333227634}, {"id": 1825, "seek": 54180, "start": 6469.04, "end": 6471.24, "text": " that it's going to do to say, hey, you know,", "tokens": [51726, 300, 309, 311, 516, 281, 360, 281, 584, 11, 4177, 11, 291, 458, 11, 51836], "temperature": 0.0, "avg_logprob": -0.11581167689076176, "compression_ratio": 1.796551724137931, "no_speech_prob": 0.0007398673333227634}, {"id": 1826, "seek": 57124, "start": 6471.24, "end": 6473.6, "text": " you should be attending more to this layer than this layer.", "tokens": [50364, 291, 820, 312, 15862, 544, 281, 341, 4583, 813, 341, 4583, 13, 50482], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1827, "seek": 57124, "start": 6473.6, "end": 6476.32, "text": " That's the kind of 30,000 foot view.", "tokens": [50482, 663, 311, 264, 733, 295, 2217, 11, 1360, 2671, 1910, 13, 50618], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1828, "seek": 57124, "start": 6476.32, "end": 6479.2, "text": " There's a bunch of details that fall out of this.", "tokens": [50618, 821, 311, 257, 3840, 295, 4365, 300, 2100, 484, 295, 341, 13, 50762], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1829, "seek": 57124, "start": 6479.2, "end": 6480.8, "text": " The math is actually quite simple.", "tokens": [50762, 440, 5221, 307, 767, 1596, 2199, 13, 50842], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1830, "seek": 57124, "start": 6480.8, "end": 6481.88, "text": " I recommend checking it out.", "tokens": [50842, 286, 2748, 8568, 309, 484, 13, 50896], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1831, "seek": 57124, "start": 6481.88, "end": 6484.36, "text": " I'm, by the way, give me feedback on this guys, by the way,", "tokens": [50896, 286, 478, 11, 538, 264, 636, 11, 976, 385, 5824, 322, 341, 1074, 11, 538, 264, 636, 11, 51020], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1832, "seek": 57124, "start": 6484.36, "end": 6486.8, "text": " because I'm deliberately trying to be a little more concise", "tokens": [51020, 570, 286, 478, 23506, 1382, 281, 312, 257, 707, 544, 44882, 51142], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1833, "seek": 57124, "start": 6486.8, "end": 6489.0, "text": " than I was last episode to not give you", "tokens": [51142, 813, 286, 390, 1036, 3500, 281, 406, 976, 291, 51252], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1834, "seek": 57124, "start": 6489.0, "end": 6490.8, "text": " an hour-long summary of this paper.", "tokens": [51252, 364, 1773, 12, 13025, 12691, 295, 341, 3035, 13, 51342], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1835, "seek": 57124, "start": 6490.8, "end": 6492.24, "text": " But roughly speaking, that's it.", "tokens": [51342, 583, 9810, 4124, 11, 300, 311, 309, 13, 51414], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1836, "seek": 57124, "start": 6492.24, "end": 6493.92, "text": " Could go into more of the math if that's useful.", "tokens": [51414, 7497, 352, 666, 544, 295, 264, 5221, 498, 300, 311, 4420, 13, 51498], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1837, "seek": 57124, "start": 6493.92, "end": 6494.76, "text": " Give me that feedback.", "tokens": [51498, 5303, 385, 300, 5824, 13, 51540], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1838, "seek": 57124, "start": 6494.76, "end": 6497.08, "text": " But one of the challenges that you get", "tokens": [51540, 583, 472, 295, 264, 4759, 300, 291, 483, 51656], "temperature": 0.0, "avg_logprob": -0.13566025317861483, "compression_ratio": 1.729559748427673, "no_speech_prob": 0.0010198201052844524}, {"id": 1839, "seek": 59708, "start": 6497.08, "end": 6501.84, "text": " with the sort of full attention residual strategy", "tokens": [50364, 365, 264, 1333, 295, 1577, 3202, 27980, 5206, 50602], "temperature": 0.0, "avg_logprob": -0.17871018000475064, "compression_ratio": 1.8243727598566308, "no_speech_prob": 0.009230206720530987}, {"id": 1840, "seek": 59708, "start": 6501.84, "end": 6503.84, "text": " that I've just described, where you try to attend more", "tokens": [50602, 300, 286, 600, 445, 7619, 11, 689, 291, 853, 281, 6888, 544, 50702], "temperature": 0.0, "avg_logprob": -0.17871018000475064, "compression_ratio": 1.8243727598566308, "no_speech_prob": 0.009230206720530987}, {"id": 1841, "seek": 59708, "start": 6503.84, "end": 6506.92, "text": " to one layer to the another, is that it's super memory", "tokens": [50702, 281, 472, 4583, 281, 264, 1071, 11, 307, 300, 309, 311, 1687, 4675, 50856], "temperature": 0.0, "avg_logprob": -0.17871018000475064, "compression_ratio": 1.8243727598566308, "no_speech_prob": 0.009230206720530987}, {"id": 1842, "seek": 59708, "start": 6506.92, "end": 6509.76, "text": " hungry at scale, because you have to keep all", "tokens": [50856, 8067, 412, 4373, 11, 570, 291, 362, 281, 1066, 439, 50998], "temperature": 0.0, "avg_logprob": -0.17871018000475064, "compression_ratio": 1.8243727598566308, "no_speech_prob": 0.009230206720530987}, {"id": 1843, "seek": 59708, "start": 6509.76, "end": 6514.2, "text": " of the layer outputs alive in memory simultaneously.", "tokens": [50998, 295, 264, 4583, 23930, 5465, 294, 4675, 16561, 13, 51220], "temperature": 0.0, "avg_logprob": -0.17871018000475064, "compression_ratio": 1.8243727598566308, "no_speech_prob": 0.009230206720530987}, {"id": 1844, "seek": 59708, "start": 6514.2, "end": 6516.92, "text": " So, you know, you compute like the output of layer one", "tokens": [51220, 407, 11, 291, 458, 11, 291, 14722, 411, 264, 5598, 295, 4583, 472, 51356], "temperature": 0.0, "avg_logprob": -0.17871018000475064, "compression_ratio": 1.8243727598566308, "no_speech_prob": 0.009230206720530987}, {"id": 1845, "seek": 59708, "start": 6516.92, "end": 6519.08, "text": " and then the upper layer two and so on.", "tokens": [51356, 293, 550, 264, 6597, 4583, 732, 293, 370, 322, 13, 51464], "temperature": 0.0, "avg_logprob": -0.17871018000475064, "compression_ratio": 1.8243727598566308, "no_speech_prob": 0.009230206720530987}, {"id": 1846, "seek": 59708, "start": 6519.08, "end": 6521.04, "text": " And you're going to do attention over all those layers,", "tokens": [51464, 400, 291, 434, 516, 281, 360, 3202, 670, 439, 729, 7914, 11, 51562], "temperature": 0.0, "avg_logprob": -0.17871018000475064, "compression_ratio": 1.8243727598566308, "no_speech_prob": 0.009230206720530987}, {"id": 1847, "seek": 59708, "start": 6521.04, "end": 6524.4400000000005, "text": " which means you need to keep those outputs in memory", "tokens": [51562, 597, 1355, 291, 643, 281, 1066, 729, 23930, 294, 4675, 51732], "temperature": 0.0, "avg_logprob": -0.17871018000475064, "compression_ratio": 1.8243727598566308, "no_speech_prob": 0.009230206720530987}, {"id": 1848, "seek": 59708, "start": 6524.4400000000005, "end": 6526.32, "text": " so that you can run your attention calculation", "tokens": [51732, 370, 300, 291, 393, 1190, 428, 3202, 17108, 51826], "temperature": 0.0, "avg_logprob": -0.17871018000475064, "compression_ratio": 1.8243727598566308, "no_speech_prob": 0.009230206720530987}, {"id": 1849, "seek": 62632, "start": 6526.36, "end": 6529.88, "text": " and decide how much more or less to wait one layer than another.", "tokens": [50366, 293, 4536, 577, 709, 544, 420, 1570, 281, 1699, 472, 4583, 813, 1071, 13, 50542], "temperature": 0.0, "avg_logprob": -0.13755460754036902, "compression_ratio": 1.753787878787879, "no_speech_prob": 0.0028222952969372272}, {"id": 1850, "seek": 62632, "start": 6529.88, "end": 6534.2, "text": " And so, you end up with this really big sort of memory cost", "tokens": [50542, 400, 370, 11, 291, 917, 493, 365, 341, 534, 955, 1333, 295, 4675, 2063, 50758], "temperature": 0.0, "avg_logprob": -0.13755460754036902, "compression_ratio": 1.753787878787879, "no_speech_prob": 0.0028222952969372272}, {"id": 1851, "seek": 62632, "start": 6534.2, "end": 6537.4400000000005, "text": " and that scales with like the number of layers.", "tokens": [50758, 293, 300, 17408, 365, 411, 264, 1230, 295, 7914, 13, 50920], "temperature": 0.0, "avg_logprob": -0.13755460754036902, "compression_ratio": 1.753787878787879, "no_speech_prob": 0.0028222952969372272}, {"id": 1852, "seek": 62632, "start": 6537.4400000000005, "end": 6540.52, "text": " And for large models, especially when you have pipeline", "tokens": [50920, 400, 337, 2416, 5245, 11, 2318, 562, 291, 362, 15517, 51074], "temperature": 0.0, "avg_logprob": -0.13755460754036902, "compression_ratio": 1.753787878787879, "no_speech_prob": 0.0028222952969372272}, {"id": 1853, "seek": 62632, "start": 6540.52, "end": 6543.88, "text": " parallelism, that's just like super prohibitive.", "tokens": [51074, 8952, 1434, 11, 300, 311, 445, 411, 1687, 16015, 2187, 13, 51242], "temperature": 0.0, "avg_logprob": -0.13755460754036902, "compression_ratio": 1.753787878787879, "no_speech_prob": 0.0028222952969372272}, {"id": 1854, "seek": 62632, "start": 6543.88, "end": 6545.8, "text": " Pipeline parallelism is this idea where you basically", "tokens": [51242, 35396, 5440, 8952, 1434, 307, 341, 1558, 689, 291, 1936, 51338], "temperature": 0.0, "avg_logprob": -0.13755460754036902, "compression_ratio": 1.753787878787879, "no_speech_prob": 0.0028222952969372272}, {"id": 1855, "seek": 62632, "start": 6545.8, "end": 6549.08, "text": " break your model into chunks, where layers one to three", "tokens": [51338, 1821, 428, 2316, 666, 24004, 11, 689, 7914, 472, 281, 1045, 51502], "temperature": 0.0, "avg_logprob": -0.13755460754036902, "compression_ratio": 1.753787878787879, "no_speech_prob": 0.0028222952969372272}, {"id": 1856, "seek": 62632, "start": 6549.08, "end": 6552.6, "text": " sit on this GPU, layers four to six sit on this GPU", "tokens": [51502, 1394, 322, 341, 18407, 11, 7914, 1451, 281, 2309, 1394, 322, 341, 18407, 51678], "temperature": 0.0, "avg_logprob": -0.13755460754036902, "compression_ratio": 1.753787878787879, "no_speech_prob": 0.0028222952969372272}, {"id": 1857, "seek": 62632, "start": 6552.6, "end": 6553.96, "text": " and so on and so forth.", "tokens": [51678, 293, 370, 322, 293, 370, 5220, 13, 51746], "temperature": 0.0, "avg_logprob": -0.13755460754036902, "compression_ratio": 1.753787878787879, "no_speech_prob": 0.0028222952969372272}, {"id": 1858, "seek": 65396, "start": 6553.96, "end": 6555.88, "text": " For various reasons, this creates a bunch", "tokens": [50364, 1171, 3683, 4112, 11, 341, 7829, 257, 3840, 50460], "temperature": 0.0, "avg_logprob": -0.18381178379058838, "compression_ratio": 1.8345323741007193, "no_speech_prob": 0.0012687352718785405}, {"id": 1859, "seek": 65396, "start": 6555.88, "end": 6557.28, "text": " of communication bottlenecks.", "tokens": [50460, 295, 6101, 44641, 2761, 13, 50530], "temperature": 0.0, "avg_logprob": -0.18381178379058838, "compression_ratio": 1.8345323741007193, "no_speech_prob": 0.0012687352718785405}, {"id": 1860, "seek": 65396, "start": 6557.28, "end": 6561.12, "text": " So, they work on solving for those communication bottlenecks", "tokens": [50530, 407, 11, 436, 589, 322, 12606, 337, 729, 6101, 44641, 2761, 50722], "temperature": 0.0, "avg_logprob": -0.18381178379058838, "compression_ratio": 1.8345323741007193, "no_speech_prob": 0.0012687352718785405}, {"id": 1861, "seek": 65396, "start": 6561.12, "end": 6564.8, "text": " and their solution is called block attention residuals.", "tokens": [50722, 293, 641, 3827, 307, 1219, 3461, 3202, 27980, 82, 13, 50906], "temperature": 0.0, "avg_logprob": -0.18381178379058838, "compression_ratio": 1.8345323741007193, "no_speech_prob": 0.0012687352718785405}, {"id": 1862, "seek": 65396, "start": 6564.8, "end": 6566.84, "text": " And basically what they do here is,", "tokens": [50906, 400, 1936, 437, 436, 360, 510, 307, 11, 51008], "temperature": 0.0, "avg_logprob": -0.18381178379058838, "compression_ratio": 1.8345323741007193, "no_speech_prob": 0.0012687352718785405}, {"id": 1863, "seek": 65396, "start": 6566.84, "end": 6569.28, "text": " they will take a group of layers.", "tokens": [51008, 436, 486, 747, 257, 1594, 295, 7914, 13, 51130], "temperature": 0.0, "avg_logprob": -0.18381178379058838, "compression_ratio": 1.8345323741007193, "no_speech_prob": 0.0012687352718785405}, {"id": 1864, "seek": 65396, "start": 6569.28, "end": 6570.88, "text": " So, they'll basically break up the model", "tokens": [51130, 407, 11, 436, 603, 1936, 1821, 493, 264, 2316, 51210], "temperature": 0.0, "avg_logprob": -0.18381178379058838, "compression_ratio": 1.8345323741007193, "no_speech_prob": 0.0012687352718785405}, {"id": 1865, "seek": 65396, "start": 6570.88, "end": 6574.76, "text": " into n blocks of layers and say, you know, you've got like,", "tokens": [51210, 666, 297, 8474, 295, 7914, 293, 584, 11, 291, 458, 11, 291, 600, 658, 411, 11, 51404], "temperature": 0.0, "avg_logprob": -0.18381178379058838, "compression_ratio": 1.8345323741007193, "no_speech_prob": 0.0012687352718785405}, {"id": 1866, "seek": 65396, "start": 6574.76, "end": 6576.68, "text": " say eight blocks in total or something.", "tokens": [51404, 584, 3180, 8474, 294, 3217, 420, 746, 13, 51500], "temperature": 0.0, "avg_logprob": -0.18381178379058838, "compression_ratio": 1.8345323741007193, "no_speech_prob": 0.0012687352718785405}, {"id": 1867, "seek": 65396, "start": 6576.68, "end": 6579.6, "text": " And then they're actually going to compress each block", "tokens": [51500, 400, 550, 436, 434, 767, 516, 281, 14778, 1184, 3461, 51646], "temperature": 0.0, "avg_logprob": -0.18381178379058838, "compression_ratio": 1.8345323741007193, "no_speech_prob": 0.0012687352718785405}, {"id": 1868, "seek": 65396, "start": 6579.6, "end": 6583.24, "text": " into a single summary vector and then they'll basically", "tokens": [51646, 666, 257, 2167, 12691, 8062, 293, 550, 436, 603, 1936, 51828], "temperature": 0.0, "avg_logprob": -0.18381178379058838, "compression_ratio": 1.8345323741007193, "no_speech_prob": 0.0012687352718785405}, {"id": 1869, "seek": 68324, "start": 6583.24, "end": 6586.48, "text": " apply attention only on the n block level summaries.", "tokens": [50364, 3079, 3202, 787, 322, 264, 297, 3461, 1496, 8367, 4889, 13, 50526], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1870, "seek": 68324, "start": 6586.48, "end": 6589.88, "text": " And then that drops the memory overhead to basically", "tokens": [50526, 400, 550, 300, 11438, 264, 4675, 19922, 281, 1936, 50696], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1871, "seek": 68324, "start": 6589.88, "end": 6591.2, "text": " the number of blocks.", "tokens": [50696, 264, 1230, 295, 8474, 13, 50762], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1872, "seek": 68324, "start": 6591.2, "end": 6593.12, "text": " It scales the number of blocks rather than the number of layers.", "tokens": [50762, 467, 17408, 264, 1230, 295, 8474, 2831, 813, 264, 1230, 295, 7914, 13, 50858], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1873, "seek": 68324, "start": 6593.12, "end": 6594.04, "text": " So, it gives you more control there.", "tokens": [50858, 407, 11, 309, 2709, 291, 544, 1969, 456, 13, 50904], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1874, "seek": 68324, "start": 6594.04, "end": 6595.88, "text": " Okay, bunch more details.", "tokens": [50904, 1033, 11, 3840, 544, 4365, 13, 50996], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1875, "seek": 68324, "start": 6595.88, "end": 6597.88, "text": " This is one of those really important papers to look at", "tokens": [50996, 639, 307, 472, 295, 729, 534, 1021, 10577, 281, 574, 412, 51096], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1876, "seek": 68324, "start": 6597.88, "end": 6601.16, "text": " if you care about how data flows through chips", "tokens": [51096, 498, 291, 1127, 466, 577, 1412, 12867, 807, 11583, 51260], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1877, "seek": 68324, "start": 6601.16, "end": 6603.84, "text": " in a data center, for example, how, yeah, I mean,", "tokens": [51260, 294, 257, 1412, 3056, 11, 337, 1365, 11, 577, 11, 1338, 11, 286, 914, 11, 51394], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1878, "seek": 68324, "start": 6603.84, "end": 6605.0, "text": " what scales and what doesn't.", "tokens": [51394, 437, 17408, 293, 437, 1177, 380, 13, 51452], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1879, "seek": 68324, "start": 6605.0, "end": 6606.4400000000005, "text": " This is actually a really, really important", "tokens": [51452, 639, 307, 767, 257, 534, 11, 534, 1021, 51524], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1880, "seek": 68324, "start": 6606.4400000000005, "end": 6607.84, "text": " and I think interesting paper.", "tokens": [51524, 293, 286, 519, 1880, 3035, 13, 51594], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1881, "seek": 68324, "start": 6607.84, "end": 6609.28, "text": " I'll park it there, but hopefully that", "tokens": [51594, 286, 603, 3884, 309, 456, 11, 457, 4696, 300, 51666], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1882, "seek": 68324, "start": 6609.28, "end": 6611.96, "text": " what's your appetite to check it out if that's your thing.", "tokens": [51666, 437, 311, 428, 23996, 281, 1520, 309, 484, 498, 300, 311, 428, 551, 13, 51800], "temperature": 0.0, "avg_logprob": -0.15973405264042043, "compression_ratio": 1.8130563798219586, "no_speech_prob": 0.0014983630971983075}, {"id": 1883, "seek": 71196, "start": 6611.96, "end": 6614.8, "text": " And finally, we're looking at the mamba three paper.", "tokens": [50364, 400, 2721, 11, 321, 434, 1237, 412, 264, 275, 23337, 1045, 3035, 13, 50506], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1884, "seek": 71196, "start": 6614.8, "end": 6616.68, "text": " So we have mamba three, right?", "tokens": [50506, 407, 321, 362, 275, 23337, 1045, 11, 558, 30, 50600], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1885, "seek": 71196, "start": 6616.68, "end": 6619.36, "text": " We were stuck on mamba two for a little while there.", "tokens": [50600, 492, 645, 5541, 322, 275, 23337, 732, 337, 257, 707, 1339, 456, 13, 50734], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1886, "seek": 71196, "start": 6619.36, "end": 6621.08, "text": " So mamba three improved sequence modeling", "tokens": [50734, 407, 275, 23337, 1045, 9689, 8310, 15983, 50820], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1887, "seek": 71196, "start": 6621.08, "end": 6623.16, "text": " using state space principles.", "tokens": [50820, 1228, 1785, 1901, 9156, 13, 50924], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1888, "seek": 71196, "start": 6623.16, "end": 6625.0, "text": " State space principles.", "tokens": [50924, 4533, 1901, 9156, 13, 51016], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1889, "seek": 71196, "start": 6625.0, "end": 6627.12, "text": " That word principle is actually really important.", "tokens": [51016, 663, 1349, 8665, 307, 767, 534, 1021, 13, 51122], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1890, "seek": 71196, "start": 6627.12, "end": 6630.92, "text": " This, among other things, is an attempt to ground", "tokens": [51122, 639, 11, 3654, 661, 721, 11, 307, 364, 5217, 281, 2727, 51312], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1891, "seek": 71196, "start": 6630.92, "end": 6635.0, "text": " the mamba approach in a more like theoretically", "tokens": [51312, 264, 275, 23337, 3109, 294, 257, 544, 411, 29400, 51516], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1892, "seek": 71196, "start": 6635.0, "end": 6636.92, "text": " robust foundation.", "tokens": [51516, 13956, 7030, 13, 51612], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1893, "seek": 71196, "start": 6636.92, "end": 6638.8, "text": " It'll be clear in a minute what I mean by that.", "tokens": [51612, 467, 603, 312, 1850, 294, 257, 3456, 437, 286, 914, 538, 300, 13, 51706], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1894, "seek": 71196, "start": 6638.8, "end": 6641.04, "text": " But the way to think about the mamba papers in general,", "tokens": [51706, 583, 264, 636, 281, 519, 466, 264, 275, 23337, 10577, 294, 2674, 11, 51818], "temperature": 0.0, "avg_logprob": -0.16404618200526308, "compression_ratio": 1.7587412587412588, "no_speech_prob": 0.0005286998930387199}, {"id": 1895, "seek": 74104, "start": 6641.04, "end": 6644.04, "text": " they are dense, they are hardware aware,", "tokens": [50364, 436, 366, 18011, 11, 436, 366, 8837, 3650, 11, 50514], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1896, "seek": 74104, "start": 6644.04, "end": 6645.8, "text": " which is always, I mean, I find it fun,", "tokens": [50514, 597, 307, 1009, 11, 286, 914, 11, 286, 915, 309, 1019, 11, 50602], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1897, "seek": 74104, "start": 6645.8, "end": 6647.96, "text": " but it means that there's a lot of complexity", "tokens": [50602, 457, 309, 1355, 300, 456, 311, 257, 688, 295, 14024, 50710], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1898, "seek": 74104, "start": 6647.96, "end": 6649.0, "text": " and mathematical complexity.", "tokens": [50710, 293, 18894, 14024, 13, 50762], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1899, "seek": 74104, "start": 6649.0, "end": 6652.6, "text": " This one is a lot of kind of integral calculus", "tokens": [50762, 639, 472, 307, 257, 688, 295, 733, 295, 11573, 33400, 50942], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1900, "seek": 74104, "start": 6652.6, "end": 6655.84, "text": " and finding kind of principled ways to represent", "tokens": [50942, 293, 5006, 733, 295, 3681, 15551, 2098, 281, 2906, 51104], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1901, "seek": 74104, "start": 6655.84, "end": 6658.68, "text": " state transformations that sort of like reflect", "tokens": [51104, 1785, 34852, 300, 1333, 295, 411, 5031, 51246], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1902, "seek": 74104, "start": 6658.68, "end": 6661.08, "text": " in a way that the physics of how information", "tokens": [51246, 294, 257, 636, 300, 264, 10649, 295, 577, 1589, 51366], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1903, "seek": 74104, "start": 6661.08, "end": 6661.92, "text": " should evolve in the system.", "tokens": [51366, 820, 16693, 294, 264, 1185, 13, 51408], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1904, "seek": 74104, "start": 6661.92, "end": 6663.72, "text": " So I'm going to get more concrete now", "tokens": [51408, 407, 286, 478, 516, 281, 483, 544, 9859, 586, 51498], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1905, "seek": 74104, "start": 6663.72, "end": 6665.24, "text": " because that was kind of kind of big.", "tokens": [51498, 570, 300, 390, 733, 295, 733, 295, 955, 13, 51574], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1906, "seek": 74104, "start": 6665.24, "end": 6668.4, "text": " So when you think about a state space model, right?", "tokens": [51574, 407, 562, 291, 519, 466, 257, 1785, 1901, 2316, 11, 558, 30, 51732], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1907, "seek": 74104, "start": 6668.4, "end": 6669.92, "text": " What is a state space model?", "tokens": [51732, 708, 307, 257, 1785, 1901, 2316, 30, 51808], "temperature": 0.0, "avg_logprob": -0.13797982787409574, "compression_ratio": 1.7966101694915255, "no_speech_prob": 0.0013972708256915212}, {"id": 1908, "seek": 76992, "start": 6669.92, "end": 6672.88, "text": " I mean, to caricature it is a vector, right?", "tokens": [50364, 286, 914, 11, 281, 45732, 1503, 309, 307, 257, 8062, 11, 558, 30, 50512], "temperature": 0.0, "avg_logprob": -0.15365231656111203, "compression_ratio": 1.7943548387096775, "no_speech_prob": 0.0008832134190015495}, {"id": 1909, "seek": 76992, "start": 6672.88, "end": 6674.360000000001, "text": " So it's a list of numbers.", "tokens": [50512, 407, 309, 311, 257, 1329, 295, 3547, 13, 50586], "temperature": 0.0, "avg_logprob": -0.15365231656111203, "compression_ratio": 1.7943548387096775, "no_speech_prob": 0.0008832134190015495}, {"id": 1910, "seek": 76992, "start": 6674.360000000001, "end": 6678.6, "text": " And as your model scans over a sequence,", "tokens": [50586, 400, 382, 428, 2316, 35116, 670, 257, 8310, 11, 50798], "temperature": 0.0, "avg_logprob": -0.15365231656111203, "compression_ratio": 1.7943548387096775, "no_speech_prob": 0.0008832134190015495}, {"id": 1911, "seek": 76992, "start": 6678.6, "end": 6682.4, "text": " say a sequence of text, you're going to evolve", "tokens": [50798, 584, 257, 8310, 295, 2487, 11, 291, 434, 516, 281, 16693, 50988], "temperature": 0.0, "avg_logprob": -0.15365231656111203, "compression_ratio": 1.7943548387096775, "no_speech_prob": 0.0008832134190015495}, {"id": 1912, "seek": 76992, "start": 6682.4, "end": 6685.4400000000005, "text": " the values in that vector, in that list of numbers.", "tokens": [50988, 264, 4190, 294, 300, 8062, 11, 294, 300, 1329, 295, 3547, 13, 51140], "temperature": 0.0, "avg_logprob": -0.15365231656111203, "compression_ratio": 1.7943548387096775, "no_speech_prob": 0.0008832134190015495}, {"id": 1913, "seek": 76992, "start": 6685.4400000000005, "end": 6687.88, "text": " And those values are going to represent,", "tokens": [51140, 400, 729, 4190, 366, 516, 281, 2906, 11, 51262], "temperature": 0.0, "avg_logprob": -0.15365231656111203, "compression_ratio": 1.7943548387096775, "no_speech_prob": 0.0008832134190015495}, {"id": 1914, "seek": 76992, "start": 6687.88, "end": 6690.16, "text": " capture the meaning of what you have read,", "tokens": [51262, 7983, 264, 3620, 295, 437, 291, 362, 1401, 11, 51376], "temperature": 0.0, "avg_logprob": -0.15365231656111203, "compression_ratio": 1.7943548387096775, "no_speech_prob": 0.0008832134190015495}, {"id": 1915, "seek": 76992, "start": 6690.16, "end": 6693.52, "text": " of what you've scanned over, or what the model scanned over.", "tokens": [51376, 295, 437, 291, 600, 45089, 670, 11, 420, 437, 264, 2316, 45089, 670, 13, 51544], "temperature": 0.0, "avg_logprob": -0.15365231656111203, "compression_ratio": 1.7943548387096775, "no_speech_prob": 0.0008832134190015495}, {"id": 1916, "seek": 76992, "start": 6693.52, "end": 6695.12, "text": " Okay, so now there's this question of like,", "tokens": [51544, 1033, 11, 370, 586, 456, 311, 341, 1168, 295, 411, 11, 51624], "temperature": 0.0, "avg_logprob": -0.15365231656111203, "compression_ratio": 1.7943548387096775, "no_speech_prob": 0.0008832134190015495}, {"id": 1917, "seek": 76992, "start": 6695.12, "end": 6697.360000000001, "text": " all right, well, if that's the general gist,", "tokens": [51624, 439, 558, 11, 731, 11, 498, 300, 311, 264, 2674, 290, 468, 11, 51736], "temperature": 0.0, "avg_logprob": -0.15365231656111203, "compression_ratio": 1.7943548387096775, "no_speech_prob": 0.0008832134190015495}, {"id": 1918, "seek": 79736, "start": 6697.36, "end": 6699.88, "text": " we need some kind of uptake rule, right,", "tokens": [50364, 321, 643, 512, 733, 295, 493, 27612, 4978, 11, 558, 11, 50490], "temperature": 0.0, "avg_logprob": -0.14927221823782816, "compression_ratio": 1.7096774193548387, "no_speech_prob": 0.0008955675875768065}, {"id": 1919, "seek": 79736, "start": 6699.88, "end": 6702.6, "text": " for that vector for that list of numbers.", "tokens": [50490, 337, 300, 8062, 337, 300, 1329, 295, 3547, 13, 50626], "temperature": 0.0, "avg_logprob": -0.14927221823782816, "compression_ratio": 1.7096774193548387, "no_speech_prob": 0.0008955675875768065}, {"id": 1920, "seek": 79736, "start": 6702.6, "end": 6704.72, "text": " And well, if we look back at physics,", "tokens": [50626, 400, 731, 11, 498, 321, 574, 646, 412, 10649, 11, 50732], "temperature": 0.0, "avg_logprob": -0.14927221823782816, "compression_ratio": 1.7096774193548387, "no_speech_prob": 0.0008955675875768065}, {"id": 1921, "seek": 79736, "start": 6704.72, "end": 6707.72, "text": " if we look at how we think about describing, you know,", "tokens": [50732, 498, 321, 574, 412, 577, 321, 519, 466, 16141, 11, 291, 458, 11, 50882], "temperature": 0.0, "avg_logprob": -0.14927221823782816, "compression_ratio": 1.7096774193548387, "no_speech_prob": 0.0008955675875768065}, {"id": 1922, "seek": 79736, "start": 6707.72, "end": 6710.0, "text": " like a pendulum or an electrical circuit", "tokens": [50882, 411, 257, 44103, 420, 364, 12147, 9048, 50996], "temperature": 0.0, "avg_logprob": -0.14927221823782816, "compression_ratio": 1.7096774193548387, "no_speech_prob": 0.0008955675875768065}, {"id": 1923, "seek": 79736, "start": 6710.0, "end": 6711.56, "text": " or water flowing through pipes,", "tokens": [50996, 420, 1281, 13974, 807, 21882, 11, 51074], "temperature": 0.0, "avg_logprob": -0.14927221823782816, "compression_ratio": 1.7096774193548387, "no_speech_prob": 0.0008955675875768065}, {"id": 1924, "seek": 79736, "start": 6711.56, "end": 6713.68, "text": " or you know, these sorts of problems,", "tokens": [51074, 420, 291, 458, 11, 613, 7527, 295, 2740, 11, 51180], "temperature": 0.0, "avg_logprob": -0.14927221823782816, "compression_ratio": 1.7096774193548387, "no_speech_prob": 0.0008955675875768065}, {"id": 1925, "seek": 79736, "start": 6713.68, "end": 6716.72, "text": " like what is the form, the kind of equation", "tokens": [51180, 411, 437, 307, 264, 1254, 11, 264, 733, 295, 5367, 51332], "temperature": 0.0, "avg_logprob": -0.14927221823782816, "compression_ratio": 1.7096774193548387, "no_speech_prob": 0.0008955675875768065}, {"id": 1926, "seek": 79736, "start": 6716.72, "end": 6719.68, "text": " that you use to govern that dynamics, right?", "tokens": [51332, 300, 291, 764, 281, 1980, 300, 15679, 11, 558, 30, 51480], "temperature": 0.0, "avg_logprob": -0.14927221823782816, "compression_ratio": 1.7096774193548387, "no_speech_prob": 0.0008955675875768065}, {"id": 1927, "seek": 79736, "start": 6719.68, "end": 6723.24, "text": " Well, you'll typically have a state, right?", "tokens": [51480, 1042, 11, 291, 603, 5850, 362, 257, 1785, 11, 558, 30, 51658], "temperature": 0.0, "avg_logprob": -0.14927221823782816, "compression_ratio": 1.7096774193548387, "no_speech_prob": 0.0008955675875768065}, {"id": 1928, "seek": 79736, "start": 6723.24, "end": 6727.2, "text": " So in this case, H of T, think of that as like the state.", "tokens": [51658, 407, 294, 341, 1389, 11, 389, 295, 314, 11, 519, 295, 300, 382, 411, 264, 1785, 13, 51856], "temperature": 0.0, "avg_logprob": -0.14927221823782816, "compression_ratio": 1.7096774193548387, "no_speech_prob": 0.0008955675875768065}, {"id": 1929, "seek": 82720, "start": 6727.2, "end": 6730.64, "text": " And the rate of change, so the derivative,", "tokens": [50364, 400, 264, 3314, 295, 1319, 11, 370, 264, 13760, 11, 50536], "temperature": 0.0, "avg_logprob": -0.16817440581126292, "compression_ratio": 1.794871794871795, "no_speech_prob": 0.0007032835274003446}, {"id": 1930, "seek": 82720, "start": 6730.64, "end": 6731.96, "text": " in mathematical terms, but like,", "tokens": [50536, 294, 18894, 2115, 11, 457, 411, 11, 50602], "temperature": 0.0, "avg_logprob": -0.16817440581126292, "compression_ratio": 1.794871794871795, "no_speech_prob": 0.0007032835274003446}, {"id": 1931, "seek": 82720, "start": 6731.96, "end": 6735.2, "text": " basically how that state changes over time", "tokens": [50602, 1936, 577, 300, 1785, 2962, 670, 565, 50764], "temperature": 0.0, "avg_logprob": -0.16817440581126292, "compression_ratio": 1.794871794871795, "no_speech_prob": 0.0007032835274003446}, {"id": 1932, "seek": 82720, "start": 6735.2, "end": 6739.4, "text": " is gonna be equal to that state,", "tokens": [50764, 307, 799, 312, 2681, 281, 300, 1785, 11, 50974], "temperature": 0.0, "avg_logprob": -0.16817440581126292, "compression_ratio": 1.794871794871795, "no_speech_prob": 0.0007032835274003446}, {"id": 1933, "seek": 82720, "start": 6739.4, "end": 6741.2, "text": " maybe modified in some way.", "tokens": [50974, 1310, 15873, 294, 512, 636, 13, 51064], "temperature": 0.0, "avg_logprob": -0.16817440581126292, "compression_ratio": 1.794871794871795, "no_speech_prob": 0.0007032835274003446}, {"id": 1934, "seek": 82720, "start": 6741.2, "end": 6744.68, "text": " So, but in all that means is the evolution of your state", "tokens": [51064, 407, 11, 457, 294, 439, 300, 1355, 307, 264, 9303, 295, 428, 1785, 51238], "temperature": 0.0, "avg_logprob": -0.16817440581126292, "compression_ratio": 1.794871794871795, "no_speech_prob": 0.0007032835274003446}, {"id": 1935, "seek": 82720, "start": 6744.68, "end": 6746.68, "text": " is a function of the state.", "tokens": [51238, 307, 257, 2445, 295, 264, 1785, 13, 51338], "temperature": 0.0, "avg_logprob": -0.16817440581126292, "compression_ratio": 1.794871794871795, "no_speech_prob": 0.0007032835274003446}, {"id": 1936, "seek": 82720, "start": 6746.68, "end": 6749.6, "text": " So where you are going to be in a minute", "tokens": [51338, 407, 689, 291, 366, 516, 281, 312, 294, 257, 3456, 51484], "temperature": 0.0, "avg_logprob": -0.16817440581126292, "compression_ratio": 1.794871794871795, "no_speech_prob": 0.0007032835274003446}, {"id": 1937, "seek": 82720, "start": 6749.6, "end": 6751.92, "text": " is a function of where you are right now.", "tokens": [51484, 307, 257, 2445, 295, 689, 291, 366, 558, 586, 13, 51600], "temperature": 0.0, "avg_logprob": -0.16817440581126292, "compression_ratio": 1.794871794871795, "no_speech_prob": 0.0007032835274003446}, {"id": 1938, "seek": 82720, "start": 6751.92, "end": 6753.2, "text": " And this is pretty intuitive.", "tokens": [51600, 400, 341, 307, 1238, 21769, 13, 51664], "temperature": 0.0, "avg_logprob": -0.16817440581126292, "compression_ratio": 1.794871794871795, "no_speech_prob": 0.0007032835274003446}, {"id": 1939, "seek": 82720, "start": 6753.2, "end": 6755.76, "text": " I mean, like if you, you know, if you see,", "tokens": [51664, 286, 914, 11, 411, 498, 291, 11, 291, 458, 11, 498, 291, 536, 11, 51792], "temperature": 0.0, "avg_logprob": -0.16817440581126292, "compression_ratio": 1.794871794871795, "no_speech_prob": 0.0007032835274003446}, {"id": 1940, "seek": 85576, "start": 6755.76, "end": 6758.48, "text": " what are like a coyote or road runner suspended", "tokens": [50364, 437, 366, 411, 257, 41485, 1370, 420, 3060, 24376, 23437, 50500], "temperature": 0.0, "avg_logprob": -0.18266571671874435, "compression_ratio": 1.8178438661710037, "no_speech_prob": 0.000985326711088419}, {"id": 1941, "seek": 85576, "start": 6758.48, "end": 6761.8, "text": " in mid-air with no, you know, nothing underneath them", "tokens": [50500, 294, 2062, 12, 1246, 365, 572, 11, 291, 458, 11, 1825, 7223, 552, 50666], "temperature": 0.0, "avg_logprob": -0.18266571671874435, "compression_ratio": 1.8178438661710037, "no_speech_prob": 0.000985326711088419}, {"id": 1942, "seek": 85576, "start": 6761.8, "end": 6763.96, "text": " to hold them up, like, yeah, he's gonna fall.", "tokens": [50666, 281, 1797, 552, 493, 11, 411, 11, 1338, 11, 415, 311, 799, 2100, 13, 50774], "temperature": 0.0, "avg_logprob": -0.18266571671874435, "compression_ratio": 1.8178438661710037, "no_speech_prob": 0.000985326711088419}, {"id": 1943, "seek": 85576, "start": 6763.96, "end": 6765.72, "text": " And the fact that he's falling is a function", "tokens": [50774, 400, 264, 1186, 300, 415, 311, 7440, 307, 257, 2445, 50862], "temperature": 0.0, "avg_logprob": -0.18266571671874435, "compression_ratio": 1.8178438661710037, "no_speech_prob": 0.000985326711088419}, {"id": 1944, "seek": 85576, "start": 6765.72, "end": 6766.96, "text": " of where he was before, right?", "tokens": [50862, 295, 689, 415, 390, 949, 11, 558, 30, 50924], "temperature": 0.0, "avg_logprob": -0.18266571671874435, "compression_ratio": 1.8178438661710037, "no_speech_prob": 0.000985326711088419}, {"id": 1945, "seek": 85576, "start": 6766.96, "end": 6769.84, "text": " So in this sense, you know, the rate of change", "tokens": [50924, 407, 294, 341, 2020, 11, 291, 458, 11, 264, 3314, 295, 1319, 51068], "temperature": 0.0, "avg_logprob": -0.18266571671874435, "compression_ratio": 1.8178438661710037, "no_speech_prob": 0.000985326711088419}, {"id": 1946, "seek": 85576, "start": 6769.84, "end": 6773.5599999999995, "text": " in that state or the future state of that system", "tokens": [51068, 294, 300, 1785, 420, 264, 2027, 1785, 295, 300, 1185, 51254], "temperature": 0.0, "avg_logprob": -0.18266571671874435, "compression_ratio": 1.8178438661710037, "no_speech_prob": 0.000985326711088419}, {"id": 1947, "seek": 85576, "start": 6773.5599999999995, "end": 6775.88, "text": " is a function of the state.", "tokens": [51254, 307, 257, 2445, 295, 264, 1785, 13, 51370], "temperature": 0.0, "avg_logprob": -0.18266571671874435, "compression_ratio": 1.8178438661710037, "no_speech_prob": 0.000985326711088419}, {"id": 1948, "seek": 85576, "start": 6775.88, "end": 6779.16, "text": " Time some multiplying factor that does a function of time,", "tokens": [51370, 6161, 512, 30955, 5952, 300, 775, 257, 2445, 295, 565, 11, 51534], "temperature": 0.0, "avg_logprob": -0.18266571671874435, "compression_ratio": 1.8178438661710037, "no_speech_prob": 0.000985326711088419}, {"id": 1949, "seek": 85576, "start": 6779.16, "end": 6782.84, "text": " whatever, and then plus some additional function", "tokens": [51534, 2035, 11, 293, 550, 1804, 512, 4497, 2445, 51718], "temperature": 0.0, "avg_logprob": -0.18266571671874435, "compression_ratio": 1.8178438661710037, "no_speech_prob": 0.000985326711088419}, {"id": 1950, "seek": 85576, "start": 6782.84, "end": 6784.5599999999995, "text": " of like the inputs to the system,", "tokens": [51718, 295, 411, 264, 15743, 281, 264, 1185, 11, 51804], "temperature": 0.0, "avg_logprob": -0.18266571671874435, "compression_ratio": 1.8178438661710037, "no_speech_prob": 0.000985326711088419}, {"id": 1951, "seek": 88456, "start": 6784.56, "end": 6787.16, "text": " the current state, the current input to the system.", "tokens": [50364, 264, 2190, 1785, 11, 264, 2190, 4846, 281, 264, 1185, 13, 50494], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1952, "seek": 88456, "start": 6787.16, "end": 6789.8, "text": " So the rate of change of the hidden state", "tokens": [50494, 407, 264, 3314, 295, 1319, 295, 264, 7633, 1785, 50626], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1953, "seek": 88456, "start": 6789.8, "end": 6793.12, "text": " in a state space model is gonna be determined", "tokens": [50626, 294, 257, 1785, 1901, 2316, 307, 799, 312, 9540, 50792], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1954, "seek": 88456, "start": 6793.12, "end": 6794.8, "text": " by the current state of the model", "tokens": [50792, 538, 264, 2190, 1785, 295, 264, 2316, 50876], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1955, "seek": 88456, "start": 6794.8, "end": 6796.92, "text": " and its current input.", "tokens": [50876, 293, 1080, 2190, 4846, 13, 50982], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1956, "seek": 88456, "start": 6796.92, "end": 6799.4400000000005, "text": " So the thing that in a sense perturbs that state,", "tokens": [50982, 407, 264, 551, 300, 294, 257, 2020, 13269, 374, 929, 300, 1785, 11, 51108], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1957, "seek": 88456, "start": 6799.4400000000005, "end": 6801.4, "text": " the new piece of information that you're seeing.", "tokens": [51108, 264, 777, 2522, 295, 1589, 300, 291, 434, 2577, 13, 51206], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1958, "seek": 88456, "start": 6801.4, "end": 6805.24, "text": " So my next state space vector is going to be a function", "tokens": [51206, 407, 452, 958, 1785, 1901, 8062, 307, 516, 281, 312, 257, 2445, 51398], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1959, "seek": 88456, "start": 6805.24, "end": 6806.28, "text": " of what I've read so far,", "tokens": [51398, 295, 437, 286, 600, 1401, 370, 1400, 11, 51450], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1960, "seek": 88456, "start": 6806.28, "end": 6809.6, "text": " plus some modified form of the next token", "tokens": [51450, 1804, 512, 15873, 1254, 295, 264, 958, 14862, 51616], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1961, "seek": 88456, "start": 6809.6, "end": 6810.4400000000005, "text": " that I'm reading, right?", "tokens": [51616, 300, 286, 478, 3760, 11, 558, 30, 51658], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1962, "seek": 88456, "start": 6810.4400000000005, "end": 6812.52, "text": " This is all kind of trying to build that intuition.", "tokens": [51658, 639, 307, 439, 733, 295, 1382, 281, 1322, 300, 24002, 13, 51762], "temperature": 0.0, "avg_logprob": -0.11961750412250266, "compression_ratio": 1.9003831417624522, "no_speech_prob": 0.002097361022606492}, {"id": 1963, "seek": 91252, "start": 6812.52, "end": 6814.92, "text": " Now that's, that would be true,", "tokens": [50364, 823, 300, 311, 11, 300, 576, 312, 2074, 11, 50484], "temperature": 0.0, "avg_logprob": -0.14436388649046422, "compression_ratio": 1.6875, "no_speech_prob": 0.00047538423677906394}, {"id": 1964, "seek": 91252, "start": 6814.92, "end": 6817.2, "text": " or you can describe that mathematically with a derivative", "tokens": [50484, 420, 291, 393, 6786, 300, 44003, 365, 257, 13760, 50598], "temperature": 0.0, "avg_logprob": -0.14436388649046422, "compression_ratio": 1.6875, "no_speech_prob": 0.00047538423677906394}, {"id": 1965, "seek": 91252, "start": 6817.2, "end": 6819.12, "text": " with that idea of like the evolution of a system", "tokens": [50598, 365, 300, 1558, 295, 411, 264, 9303, 295, 257, 1185, 50694], "temperature": 0.0, "avg_logprob": -0.14436388649046422, "compression_ratio": 1.6875, "no_speech_prob": 0.00047538423677906394}, {"id": 1966, "seek": 91252, "start": 6819.12, "end": 6823.36, "text": " over time, smoothly, if you're working with time,", "tokens": [50694, 670, 565, 11, 19565, 11, 498, 291, 434, 1364, 365, 565, 11, 50906], "temperature": 0.0, "avg_logprob": -0.14436388649046422, "compression_ratio": 1.6875, "no_speech_prob": 0.00047538423677906394}, {"id": 1967, "seek": 91252, "start": 6823.36, "end": 6825.28, "text": " which is a continuous variable.", "tokens": [50906, 597, 307, 257, 10957, 7006, 13, 51002], "temperature": 0.0, "avg_logprob": -0.14436388649046422, "compression_ratio": 1.6875, "no_speech_prob": 0.00047538423677906394}, {"id": 1968, "seek": 91252, "start": 6825.28, "end": 6827.92, "text": " But the math kind of becomes harder.", "tokens": [51002, 583, 264, 5221, 733, 295, 3643, 6081, 13, 51134], "temperature": 0.0, "avg_logprob": -0.14436388649046422, "compression_ratio": 1.6875, "no_speech_prob": 0.00047538423677906394}, {"id": 1969, "seek": 91252, "start": 6827.92, "end": 6829.24, "text": " I won't take quite breaks down,", "tokens": [51134, 286, 1582, 380, 747, 1596, 9857, 760, 11, 51200], "temperature": 0.0, "avg_logprob": -0.14436388649046422, "compression_ratio": 1.6875, "no_speech_prob": 0.00047538423677906394}, {"id": 1970, "seek": 91252, "start": 6829.24, "end": 6832.28, "text": " but it becomes harder when we move into language.", "tokens": [51200, 457, 309, 3643, 6081, 562, 321, 1286, 666, 2856, 13, 51352], "temperature": 0.0, "avg_logprob": -0.14436388649046422, "compression_ratio": 1.6875, "no_speech_prob": 0.00047538423677906394}, {"id": 1971, "seek": 91252, "start": 6832.28, "end": 6834.4, "text": " Because language models, they don't receive", "tokens": [51352, 1436, 2856, 5245, 11, 436, 500, 380, 4774, 51458], "temperature": 0.0, "avg_logprob": -0.14436388649046422, "compression_ratio": 1.6875, "no_speech_prob": 0.00047538423677906394}, {"id": 1972, "seek": 91252, "start": 6834.4, "end": 6836.92, "text": " a continuous stream of input.", "tokens": [51458, 257, 10957, 4309, 295, 4846, 13, 51584], "temperature": 0.0, "avg_logprob": -0.14436388649046422, "compression_ratio": 1.6875, "no_speech_prob": 0.00047538423677906394}, {"id": 1973, "seek": 91252, "start": 6836.92, "end": 6841.52, "text": " You can't model them as flowing through time.", "tokens": [51584, 509, 393, 380, 2316, 552, 382, 13974, 807, 565, 13, 51814], "temperature": 0.0, "avg_logprob": -0.14436388649046422, "compression_ratio": 1.6875, "no_speech_prob": 0.00047538423677906394}, {"id": 1974, "seek": 94152, "start": 6841.52, "end": 6843.4, "text": " Instead, they receive these discrete tokens,", "tokens": [50364, 7156, 11, 436, 4774, 613, 27706, 22667, 11, 50458], "temperature": 0.0, "avg_logprob": -0.11770677158975193, "compression_ratio": 1.7230769230769232, "no_speech_prob": 0.001272436580620706}, {"id": 1975, "seek": 94152, "start": 6843.4, "end": 6845.4, "text": " like word one, word two, word three.", "tokens": [50458, 411, 1349, 472, 11, 1349, 732, 11, 1349, 1045, 13, 50558], "temperature": 0.0, "avg_logprob": -0.11770677158975193, "compression_ratio": 1.7230769230769232, "no_speech_prob": 0.001272436580620706}, {"id": 1976, "seek": 94152, "start": 6845.4, "end": 6849.6, "text": " There's no in between tokens two and three, for example, right?", "tokens": [50558, 821, 311, 572, 294, 1296, 22667, 732, 293, 1045, 11, 337, 1365, 11, 558, 30, 50768], "temperature": 0.0, "avg_logprob": -0.11770677158975193, "compression_ratio": 1.7230769230769232, "no_speech_prob": 0.001272436580620706}, {"id": 1977, "seek": 94152, "start": 6849.6, "end": 6852.96, "text": " So you have to mathematically find a way to convert", "tokens": [50768, 407, 291, 362, 281, 44003, 915, 257, 636, 281, 7620, 50936], "temperature": 0.0, "avg_logprob": -0.11770677158975193, "compression_ratio": 1.7230769230769232, "no_speech_prob": 0.001272436580620706}, {"id": 1978, "seek": 94152, "start": 6852.96, "end": 6855.32, "text": " this continuous time equation", "tokens": [50936, 341, 10957, 565, 5367, 51054], "temperature": 0.0, "avg_logprob": -0.11770677158975193, "compression_ratio": 1.7230769230769232, "no_speech_prob": 0.001272436580620706}, {"id": 1979, "seek": 94152, "start": 6855.32, "end": 6859.04, "text": " into a discrete recurrence equation.", "tokens": [51054, 666, 257, 27706, 18680, 10760, 5367, 13, 51240], "temperature": 0.0, "avg_logprob": -0.11770677158975193, "compression_ratio": 1.7230769230769232, "no_speech_prob": 0.001272436580620706}, {"id": 1980, "seek": 94152, "start": 6859.04, "end": 6861.28, "text": " And that's gonna generally look similar,", "tokens": [51240, 400, 300, 311, 799, 5101, 574, 2531, 11, 51352], "temperature": 0.0, "avg_logprob": -0.11770677158975193, "compression_ratio": 1.7230769230769232, "no_speech_prob": 0.001272436580620706}, {"id": 1981, "seek": 94152, "start": 6861.28, "end": 6863.5599999999995, "text": " like you're gonna have some sort of new state", "tokens": [51352, 411, 291, 434, 799, 362, 512, 1333, 295, 777, 1785, 51466], "temperature": 0.0, "avg_logprob": -0.11770677158975193, "compression_ratio": 1.7230769230769232, "no_speech_prob": 0.001272436580620706}, {"id": 1982, "seek": 94152, "start": 6863.5599999999995, "end": 6865.76, "text": " that has to be a function of the old state,", "tokens": [51466, 300, 575, 281, 312, 257, 2445, 295, 264, 1331, 1785, 11, 51576], "temperature": 0.0, "avg_logprob": -0.11770677158975193, "compression_ratio": 1.7230769230769232, "no_speech_prob": 0.001272436580620706}, {"id": 1983, "seek": 94152, "start": 6865.76, "end": 6868.76, "text": " plus a function of the input, the most recent input.", "tokens": [51576, 1804, 257, 2445, 295, 264, 4846, 11, 264, 881, 5162, 4846, 13, 51726], "temperature": 0.0, "avg_logprob": -0.11770677158975193, "compression_ratio": 1.7230769230769232, "no_speech_prob": 0.001272436580620706}, {"id": 1984, "seek": 96876, "start": 6868.84, "end": 6871.84, "text": " And that conversion process is called discretization, right?", "tokens": [50368, 400, 300, 14298, 1399, 307, 1219, 25656, 2144, 11, 558, 30, 50518], "temperature": 0.0, "avg_logprob": -0.18882677835576675, "compression_ratio": 1.6870229007633588, "no_speech_prob": 0.004160813521593809}, {"id": 1985, "seek": 96876, "start": 6871.84, "end": 6873.0, "text": " So it's a very common thing.", "tokens": [50518, 407, 309, 311, 257, 588, 2689, 551, 13, 50576], "temperature": 0.0, "avg_logprob": -0.18882677835576675, "compression_ratio": 1.6870229007633588, "no_speech_prob": 0.004160813521593809}, {"id": 1986, "seek": 96876, "start": 6873.0, "end": 6876.72, "text": " You see it in a lot of context, in quantum physics,", "tokens": [50576, 509, 536, 309, 294, 257, 688, 295, 4319, 11, 294, 13018, 10649, 11, 50762], "temperature": 0.0, "avg_logprob": -0.18882677835576675, "compression_ratio": 1.6870229007633588, "no_speech_prob": 0.004160813521593809}, {"id": 1987, "seek": 96876, "start": 6876.72, "end": 6878.88, "text": " sometimes there's a variant of this", "tokens": [50762, 2171, 456, 311, 257, 17501, 295, 341, 50870], "temperature": 0.0, "avg_logprob": -0.18882677835576675, "compression_ratio": 1.6870229007633588, "no_speech_prob": 0.004160813521593809}, {"id": 1988, "seek": 96876, "start": 6878.88, "end": 6881.32, "text": " that you think that's sometimes called quantization,", "tokens": [50870, 300, 291, 519, 300, 311, 2171, 1219, 4426, 2144, 11, 50992], "temperature": 0.0, "avg_logprob": -0.18882677835576675, "compression_ratio": 1.6870229007633588, "no_speech_prob": 0.004160813521593809}, {"id": 1989, "seek": 96876, "start": 6881.32, "end": 6882.88, "text": " but this kind of thing happens a lot", "tokens": [50992, 457, 341, 733, 295, 551, 2314, 257, 688, 51070], "temperature": 0.0, "avg_logprob": -0.18882677835576675, "compression_ratio": 1.6870229007633588, "no_speech_prob": 0.004160813521593809}, {"id": 1990, "seek": 96876, "start": 6882.88, "end": 6886.44, "text": " where you take a flowing function, a smooth function", "tokens": [51070, 689, 291, 747, 257, 13974, 2445, 11, 257, 5508, 2445, 51248], "temperature": 0.0, "avg_logprob": -0.18882677835576675, "compression_ratio": 1.6870229007633588, "no_speech_prob": 0.004160813521593809}, {"id": 1991, "seek": 96876, "start": 6886.44, "end": 6890.5599999999995, "text": " that's defined over these, the real numbers basically,", "tokens": [51248, 300, 311, 7642, 670, 613, 11, 264, 957, 3547, 1936, 11, 51454], "temperature": 0.0, "avg_logprob": -0.18882677835576675, "compression_ratio": 1.6870229007633588, "no_speech_prob": 0.004160813521593809}, {"id": 1992, "seek": 96876, "start": 6890.5599999999995, "end": 6895.5599999999995, "text": " like you could have 0.001s and so on.", "tokens": [51454, 411, 291, 727, 362, 1958, 13, 628, 16, 82, 293, 370, 322, 13, 51704], "temperature": 0.0, "avg_logprob": -0.18882677835576675, "compression_ratio": 1.6870229007633588, "no_speech_prob": 0.004160813521593809}, {"id": 1993, "seek": 96876, "start": 6895.5599999999995, "end": 6896.8, "text": " And you have to convert that", "tokens": [51704, 400, 291, 362, 281, 7620, 300, 51766], "temperature": 0.0, "avg_logprob": -0.18882677835576675, "compression_ratio": 1.6870229007633588, "no_speech_prob": 0.004160813521593809}, {"id": 1994, "seek": 99680, "start": 6896.84, "end": 6899.92, "text": " so that you map it onto a discrete x-axis,", "tokens": [50366, 370, 300, 291, 4471, 309, 3911, 257, 27706, 2031, 12, 24633, 11, 50520], "temperature": 0.0, "avg_logprob": -0.1802538906721244, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.014116606675088406}, {"id": 1995, "seek": 99680, "start": 6899.92, "end": 6901.6, "text": " where you have token one, token two,", "tokens": [50520, 689, 291, 362, 14862, 472, 11, 14862, 732, 11, 50604], "temperature": 0.0, "avg_logprob": -0.1802538906721244, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.014116606675088406}, {"id": 1996, "seek": 99680, "start": 6901.6, "end": 6903.08, "text": " and there's no in between, right?", "tokens": [50604, 293, 456, 311, 572, 294, 1296, 11, 558, 30, 50678], "temperature": 0.0, "avg_logprob": -0.1802538906721244, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.014116606675088406}, {"id": 1997, "seek": 99680, "start": 6903.08, "end": 6905.0, "text": " So the core question here is gonna be,", "tokens": [50678, 407, 264, 4965, 1168, 510, 307, 799, 312, 11, 50774], "temperature": 0.0, "avg_logprob": -0.1802538906721244, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.014116606675088406}, {"id": 1998, "seek": 99680, "start": 6905.0, "end": 6910.0, "text": " how do you evolve the hidden states from token one to token two?", "tokens": [50774, 577, 360, 291, 16693, 264, 7633, 4368, 490, 14862, 472, 281, 14862, 732, 30, 51024], "temperature": 0.0, "avg_logprob": -0.1802538906721244, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.014116606675088406}, {"id": 1999, "seek": 99680, "start": 6910.6, "end": 6912.84, "text": " And how do you do it in a mathematically principled way?", "tokens": [51054, 400, 577, 360, 291, 360, 309, 294, 257, 44003, 3681, 15551, 636, 30, 51166], "temperature": 0.0, "avg_logprob": -0.1802538906721244, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.014116606675088406}, {"id": 2000, "seek": 99680, "start": 6912.84, "end": 6915.56, "text": " And integral calculus, enter zenithes,", "tokens": [51166, 400, 11573, 33400, 11, 3242, 37097, 355, 279, 11, 51302], "temperature": 0.0, "avg_logprob": -0.1802538906721244, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.014116606675088406}, {"id": 2001, "seek": 99680, "start": 6915.56, "end": 6917.8, "text": " I'm not gonna get into the weeds too too much here,", "tokens": [51302, 286, 478, 406, 799, 483, 666, 264, 26370, 886, 886, 709, 510, 11, 51414], "temperature": 0.0, "avg_logprob": -0.1802538906721244, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.014116606675088406}, {"id": 2002, "seek": 99680, "start": 6917.8, "end": 6920.96, "text": " other than to say that if you're gonna do that,", "tokens": [51414, 661, 813, 281, 584, 300, 498, 291, 434, 799, 360, 300, 11, 51572], "temperature": 0.0, "avg_logprob": -0.1802538906721244, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.014116606675088406}, {"id": 2003, "seek": 99680, "start": 6920.96, "end": 6924.32, "text": " as you might imagine, if you wanna like discretize", "tokens": [51572, 382, 291, 1062, 3811, 11, 498, 291, 1948, 411, 25656, 1125, 51740], "temperature": 0.0, "avg_logprob": -0.1802538906721244, "compression_ratio": 1.7121771217712176, "no_speech_prob": 0.014116606675088406}, {"id": 2004, "seek": 102432, "start": 6924.32, "end": 6926.8, "text": " a smooth function, and in other words,", "tokens": [50364, 257, 5508, 2445, 11, 293, 294, 661, 2283, 11, 50488], "temperature": 0.0, "avg_logprob": -0.1515504472395953, "compression_ratio": 2.0681818181818183, "no_speech_prob": 0.0054213120602071285}, {"id": 2005, "seek": 102432, "start": 6926.8, "end": 6931.32, "text": " just basically chunk it up into these kind of discrete pieces.", "tokens": [50488, 445, 1936, 16635, 309, 493, 666, 613, 733, 295, 27706, 3755, 13, 50714], "temperature": 0.0, "avg_logprob": -0.1515504472395953, "compression_ratio": 2.0681818181818183, "no_speech_prob": 0.0054213120602071285}, {"id": 2006, "seek": 102432, "start": 6931.32, "end": 6934.48, "text": " You kind of have a choice, a token one,", "tokens": [50714, 509, 733, 295, 362, 257, 3922, 11, 257, 14862, 472, 11, 50872], "temperature": 0.0, "avg_logprob": -0.1515504472395953, "compression_ratio": 2.0681818181818183, "no_speech_prob": 0.0054213120602071285}, {"id": 2007, "seek": 102432, "start": 6934.48, "end": 6937.48, "text": " you could sort of choose, roughly speaking,", "tokens": [50872, 291, 727, 1333, 295, 2826, 11, 9810, 4124, 11, 51022], "temperature": 0.0, "avg_logprob": -0.1515504472395953, "compression_ratio": 2.0681818181818183, "no_speech_prob": 0.0054213120602071285}, {"id": 2008, "seek": 102432, "start": 6937.48, "end": 6941.24, "text": " the sort of leftmost limit of the smooth function,", "tokens": [51022, 264, 1333, 295, 1411, 1761, 4948, 295, 264, 5508, 2445, 11, 51210], "temperature": 0.0, "avg_logprob": -0.1515504472395953, "compression_ratio": 2.0681818181818183, "no_speech_prob": 0.0054213120602071285}, {"id": 2009, "seek": 102432, "start": 6941.24, "end": 6943.32, "text": " the value of the smooth function that would have been there,", "tokens": [51210, 264, 2158, 295, 264, 5508, 2445, 300, 576, 362, 668, 456, 11, 51314], "temperature": 0.0, "avg_logprob": -0.1515504472395953, "compression_ratio": 2.0681818181818183, "no_speech_prob": 0.0054213120602071285}, {"id": 2010, "seek": 102432, "start": 6943.32, "end": 6945.8, "text": " you can sort of choose that to approximate", "tokens": [51314, 291, 393, 1333, 295, 2826, 300, 281, 30874, 51438], "temperature": 0.0, "avg_logprob": -0.1515504472395953, "compression_ratio": 2.0681818181818183, "no_speech_prob": 0.0054213120602071285}, {"id": 2011, "seek": 102432, "start": 6945.8, "end": 6946.8, "text": " the value of token one.", "tokens": [51438, 264, 2158, 295, 14862, 472, 13, 51488], "temperature": 0.0, "avg_logprob": -0.1515504472395953, "compression_ratio": 2.0681818181818183, "no_speech_prob": 0.0054213120602071285}, {"id": 2012, "seek": 102432, "start": 6946.8, "end": 6948.88, "text": " You could choose the rightmost limit", "tokens": [51488, 509, 727, 2826, 264, 558, 1761, 4948, 51592], "temperature": 0.0, "avg_logprob": -0.1515504472395953, "compression_ratio": 2.0681818181818183, "no_speech_prob": 0.0054213120602071285}, {"id": 2013, "seek": 102432, "start": 6948.88, "end": 6953.08, "text": " of your sort of discrete bar as the kind of the value", "tokens": [51592, 295, 428, 1333, 295, 27706, 2159, 382, 264, 733, 295, 264, 2158, 51802], "temperature": 0.0, "avg_logprob": -0.1515504472395953, "compression_ratio": 2.0681818181818183, "no_speech_prob": 0.0054213120602071285}, {"id": 2014, "seek": 105308, "start": 6953.08, "end": 6954.2, "text": " that you described token one,", "tokens": [50364, 300, 291, 7619, 14862, 472, 11, 50420], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2015, "seek": 105308, "start": 6954.2, "end": 6956.68, "text": " or you could sort of average the two together.", "tokens": [50420, 420, 291, 727, 1333, 295, 4274, 264, 732, 1214, 13, 50544], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2016, "seek": 105308, "start": 6956.68, "end": 6959.36, "text": " Historically, people have used this exponential", "tokens": [50544, 25108, 984, 11, 561, 362, 1143, 341, 21510, 50678], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2017, "seek": 105308, "start": 6959.36, "end": 6961.8, "text": " Euler way, this was the Mamba two way of doing it,", "tokens": [50678, 462, 26318, 636, 11, 341, 390, 264, 376, 23337, 732, 636, 295, 884, 309, 11, 50800], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2018, "seek": 105308, "start": 6961.8, "end": 6963.36, "text": " where they basically just like,", "tokens": [50800, 689, 436, 1936, 445, 411, 11, 50878], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2019, "seek": 105308, "start": 6963.36, "end": 6965.48, "text": " they do the very, very first method,", "tokens": [50878, 436, 360, 264, 588, 11, 588, 700, 3170, 11, 50984], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2020, "seek": 105308, "start": 6965.48, "end": 6967.639999999999, "text": " so they basically give it the right endpoint,", "tokens": [50984, 370, 436, 1936, 976, 309, 264, 558, 35795, 11, 51092], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2021, "seek": 105308, "start": 6967.639999999999, "end": 6970.88, "text": " so they assume that the input value is, anyway,", "tokens": [51092, 370, 436, 6552, 300, 264, 4846, 2158, 307, 11, 4033, 11, 51254], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2022, "seek": 105308, "start": 6970.88, "end": 6971.96, "text": " the details don't really matter,", "tokens": [51254, 264, 4365, 500, 380, 534, 1871, 11, 51308], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2023, "seek": 105308, "start": 6971.96, "end": 6974.68, "text": " but the input value is constant across this whole interval,", "tokens": [51308, 457, 264, 4846, 2158, 307, 5754, 2108, 341, 1379, 15035, 11, 51444], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2024, "seek": 105308, "start": 6974.68, "end": 6977.0, "text": " and equal to its value at the right endpoint,", "tokens": [51444, 293, 2681, 281, 1080, 2158, 412, 264, 558, 35795, 11, 51560], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2025, "seek": 105308, "start": 6977.0, "end": 6980.04, "text": " and that basically simplifies a bunch of math,", "tokens": [51560, 293, 300, 1936, 6883, 11221, 257, 3840, 295, 5221, 11, 51712], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2026, "seek": 105308, "start": 6980.04, "end": 6982.24, "text": " and it makes it possible for them to define", "tokens": [51712, 293, 309, 1669, 309, 1944, 337, 552, 281, 6964, 51822], "temperature": 0.0, "avg_logprob": -0.16606251745416015, "compression_ratio": 1.8501628664495113, "no_speech_prob": 0.0022659164387732744}, {"id": 2027, "seek": 108224, "start": 6982.32, "end": 6985.64, "text": " their update rule, but there's a better way, basically,", "tokens": [50368, 641, 5623, 4978, 11, 457, 456, 311, 257, 1101, 636, 11, 1936, 11, 50534], "temperature": 0.0, "avg_logprob": -0.18458588738910486, "compression_ratio": 1.7380073800738007, "no_speech_prob": 0.0023616126272827387}, {"id": 2028, "seek": 108224, "start": 6985.64, "end": 6988.04, "text": " and it involves accounting for both the right", "tokens": [50534, 293, 309, 11626, 19163, 337, 1293, 264, 558, 50654], "temperature": 0.0, "avg_logprob": -0.18458588738910486, "compression_ratio": 1.7380073800738007, "no_speech_prob": 0.0023616126272827387}, {"id": 2029, "seek": 108224, "start": 6988.04, "end": 6991.64, "text": " and the left limit, doing a weighted combination of the two,", "tokens": [50654, 293, 264, 1411, 4948, 11, 884, 257, 32807, 6562, 295, 264, 732, 11, 50834], "temperature": 0.0, "avg_logprob": -0.18458588738910486, "compression_ratio": 1.7380073800738007, "no_speech_prob": 0.0023616126272827387}, {"id": 2030, "seek": 108224, "start": 6991.64, "end": 6994.4400000000005, "text": " so that you're not just like saying, okay,", "tokens": [50834, 370, 300, 291, 434, 406, 445, 411, 1566, 11, 1392, 11, 50974], "temperature": 0.0, "avg_logprob": -0.18458588738910486, "compression_ratio": 1.7380073800738007, "no_speech_prob": 0.0023616126272827387}, {"id": 2031, "seek": 108224, "start": 6994.4400000000005, "end": 6996.76, "text": " for this interval, a corresponds to token one,", "tokens": [50974, 337, 341, 15035, 11, 257, 23249, 281, 14862, 472, 11, 51090], "temperature": 0.0, "avg_logprob": -0.18458588738910486, "compression_ratio": 1.7380073800738007, "no_speech_prob": 0.0023616126272827387}, {"id": 2032, "seek": 108224, "start": 6996.76, "end": 7000.28, "text": " I'm just gonna go with whatever the rightmost fringe", "tokens": [51090, 286, 478, 445, 799, 352, 365, 2035, 264, 558, 1761, 38764, 51266], "temperature": 0.0, "avg_logprob": -0.18458588738910486, "compression_ratio": 1.7380073800738007, "no_speech_prob": 0.0023616126272827387}, {"id": 2033, "seek": 108224, "start": 7000.28, "end": 7003.96, "text": " of the token one time boundary, yeah,", "tokens": [51266, 295, 264, 14862, 472, 565, 12866, 11, 1338, 11, 51450], "temperature": 0.0, "avg_logprob": -0.18458588738910486, "compression_ratio": 1.7380073800738007, "no_speech_prob": 0.0023616126272827387}, {"id": 2034, "seek": 108224, "start": 7003.96, "end": 7005.72, "text": " mapping onto the continuous function would be,", "tokens": [51450, 18350, 3911, 264, 10957, 2445, 576, 312, 11, 51538], "temperature": 0.0, "avg_logprob": -0.18458588738910486, "compression_ratio": 1.7380073800738007, "no_speech_prob": 0.0023616126272827387}, {"id": 2035, "seek": 108224, "start": 7005.72, "end": 7008.12, "text": " instead you balance the right and the left limits", "tokens": [51538, 2602, 291, 4772, 264, 558, 293, 264, 1411, 10406, 51658], "temperature": 0.0, "avg_logprob": -0.18458588738910486, "compression_ratio": 1.7380073800738007, "no_speech_prob": 0.0023616126272827387}, {"id": 2036, "seek": 108224, "start": 7008.12, "end": 7009.84, "text": " of that bar to get your value,", "tokens": [51658, 295, 300, 2159, 281, 483, 428, 2158, 11, 51744], "temperature": 0.0, "avg_logprob": -0.18458588738910486, "compression_ratio": 1.7380073800738007, "no_speech_prob": 0.0023616126272827387}, {"id": 2037, "seek": 110984, "start": 7009.84, "end": 7012.4, "text": " and that's what Mamba 3 is doing, fundamentally,", "tokens": [50364, 293, 300, 311, 437, 376, 23337, 805, 307, 884, 11, 17879, 11, 50492], "temperature": 0.0, "avg_logprob": -0.16551098359369598, "compression_ratio": 1.8855932203389831, "no_speech_prob": 0.0014764635125175118}, {"id": 2038, "seek": 110984, "start": 7012.4, "end": 7014.92, "text": " it's anyway one of the big changes,", "tokens": [50492, 309, 311, 4033, 472, 295, 264, 955, 2962, 11, 50618], "temperature": 0.0, "avg_logprob": -0.16551098359369598, "compression_ratio": 1.8855932203389831, "no_speech_prob": 0.0014764635125175118}, {"id": 2039, "seek": 110984, "start": 7014.92, "end": 7018.2, "text": " another one is parity, so previously,", "tokens": [50618, 1071, 472, 307, 44747, 11, 370, 8046, 11, 50782], "temperature": 0.0, "avg_logprob": -0.16551098359369598, "compression_ratio": 1.8855932203389831, "no_speech_prob": 0.0014764635125175118}, {"id": 2040, "seek": 110984, "start": 7018.2, "end": 7022.92, "text": " Mamba models could only represent their internal states", "tokens": [50782, 376, 23337, 5245, 727, 787, 2906, 641, 6920, 4368, 51018], "temperature": 0.0, "avg_logprob": -0.16551098359369598, "compression_ratio": 1.8855932203389831, "no_speech_prob": 0.0014764635125175118}, {"id": 2041, "seek": 110984, "start": 7022.92, "end": 7026.44, "text": " using real numbers, and so real numbers", "tokens": [51018, 1228, 957, 3547, 11, 293, 370, 957, 3547, 51194], "temperature": 0.0, "avg_logprob": -0.16551098359369598, "compression_ratio": 1.8855932203389831, "no_speech_prob": 0.0014764635125175118}, {"id": 2042, "seek": 110984, "start": 7026.44, "end": 7030.12, "text": " are one kind of number, there's also imaginary numbers,", "tokens": [51194, 366, 472, 733, 295, 1230, 11, 456, 311, 611, 26164, 3547, 11, 51378], "temperature": 0.0, "avg_logprob": -0.16551098359369598, "compression_ratio": 1.8855932203389831, "no_speech_prob": 0.0014764635125175118}, {"id": 2043, "seek": 110984, "start": 7030.12, "end": 7034.68, "text": " and imaginary numbers, like I is the square root of negative one,", "tokens": [51378, 293, 26164, 3547, 11, 411, 286, 307, 264, 3732, 5593, 295, 3671, 472, 11, 51606], "temperature": 0.0, "avg_logprob": -0.16551098359369598, "compression_ratio": 1.8855932203389831, "no_speech_prob": 0.0014764635125175118}, {"id": 2044, "seek": 110984, "start": 7034.68, "end": 7038.24, "text": " they're also multiple of the square root of negative one,", "tokens": [51606, 436, 434, 611, 3866, 295, 264, 3732, 5593, 295, 3671, 472, 11, 51784], "temperature": 0.0, "avg_logprob": -0.16551098359369598, "compression_ratio": 1.8855932203389831, "no_speech_prob": 0.0014764635125175118}, {"id": 2045, "seek": 110984, "start": 7038.24, "end": 7039.68, "text": " if you're not familiar with imaginary numbers,", "tokens": [51784, 498, 291, 434, 406, 4963, 365, 26164, 3547, 11, 51856], "temperature": 0.0, "avg_logprob": -0.16551098359369598, "compression_ratio": 1.8855932203389831, "no_speech_prob": 0.0014764635125175118}, {"id": 2046, "seek": 113968, "start": 7039.72, "end": 7041.92, "text": " this is probably not the place to learn about it,", "tokens": [50366, 341, 307, 1391, 406, 264, 1081, 281, 1466, 466, 309, 11, 50476], "temperature": 0.0, "avg_logprob": -0.12313822021178149, "compression_ratio": 2.0214592274678114, "no_speech_prob": 0.0005624069599434733}, {"id": 2047, "seek": 113968, "start": 7041.92, "end": 7044.08, "text": " but there's this deep and intimate connection", "tokens": [50476, 457, 456, 311, 341, 2452, 293, 20215, 4984, 50584], "temperature": 0.0, "avg_logprob": -0.12313822021178149, "compression_ratio": 2.0214592274678114, "no_speech_prob": 0.0005624069599434733}, {"id": 2048, "seek": 113968, "start": 7044.08, "end": 7049.04, "text": " between imaginary numbers and the concept of rotating stuff,", "tokens": [50584, 1296, 26164, 3547, 293, 264, 3410, 295, 19627, 1507, 11, 50832], "temperature": 0.0, "avg_logprob": -0.12313822021178149, "compression_ratio": 2.0214592274678114, "no_speech_prob": 0.0005624069599434733}, {"id": 2049, "seek": 113968, "start": 7049.04, "end": 7052.04, "text": " and essentially what this allows the model to do,", "tokens": [50832, 293, 4476, 437, 341, 4045, 264, 2316, 281, 360, 11, 50982], "temperature": 0.0, "avg_logprob": -0.12313822021178149, "compression_ratio": 2.0214592274678114, "no_speech_prob": 0.0005624069599434733}, {"id": 2050, "seek": 113968, "start": 7052.04, "end": 7055.2, "text": " so what they're gonna do is use imaginary numbers", "tokens": [50982, 370, 437, 436, 434, 799, 360, 307, 764, 26164, 3547, 51140], "temperature": 0.0, "avg_logprob": -0.12313822021178149, "compression_ratio": 2.0214592274678114, "no_speech_prob": 0.0005624069599434733}, {"id": 2051, "seek": 113968, "start": 7055.2, "end": 7057.6, "text": " and real numbers, sometimes referred to collectively", "tokens": [51140, 293, 957, 3547, 11, 2171, 10839, 281, 24341, 51260], "temperature": 0.0, "avg_logprob": -0.12313822021178149, "compression_ratio": 2.0214592274678114, "no_speech_prob": 0.0005624069599434733}, {"id": 2052, "seek": 113968, "start": 7057.6, "end": 7060.360000000001, "text": " as complex numbers, they're gonna use imaginary", "tokens": [51260, 382, 3997, 3547, 11, 436, 434, 799, 764, 26164, 51398], "temperature": 0.0, "avg_logprob": -0.12313822021178149, "compression_ratio": 2.0214592274678114, "no_speech_prob": 0.0005624069599434733}, {"id": 2053, "seek": 113968, "start": 7060.360000000001, "end": 7062.72, "text": " and real numbers together and allow the model", "tokens": [51398, 293, 957, 3547, 1214, 293, 2089, 264, 2316, 51516], "temperature": 0.0, "avg_logprob": -0.12313822021178149, "compression_ratio": 2.0214592274678114, "no_speech_prob": 0.0005624069599434733}, {"id": 2054, "seek": 113968, "start": 7062.72, "end": 7065.8, "text": " to represent imaginary and real numbers,", "tokens": [51516, 281, 2906, 26164, 293, 957, 3547, 11, 51670], "temperature": 0.0, "avg_logprob": -0.12313822021178149, "compression_ratio": 2.0214592274678114, "no_speech_prob": 0.0005624069599434733}, {"id": 2055, "seek": 113968, "start": 7065.8, "end": 7067.6, "text": " and in its internal state,", "tokens": [51670, 293, 294, 1080, 6920, 1785, 11, 51760], "temperature": 0.0, "avg_logprob": -0.12313822021178149, "compression_ratio": 2.0214592274678114, "no_speech_prob": 0.0005624069599434733}, {"id": 2056, "seek": 116760, "start": 7067.6, "end": 7069.76, "text": " and for interesting mathematical reasons,", "tokens": [50364, 293, 337, 1880, 18894, 4112, 11, 50472], "temperature": 0.0, "avg_logprob": -0.12536366357870982, "compression_ratio": 1.795221843003413, "no_speech_prob": 0.00048407033318653703}, {"id": 2057, "seek": 116760, "start": 7069.76, "end": 7073.56, "text": " this makes it possible for the model to actively track", "tokens": [50472, 341, 1669, 309, 1944, 337, 264, 2316, 281, 13022, 2837, 50662], "temperature": 0.0, "avg_logprob": -0.12536366357870982, "compression_ratio": 1.795221843003413, "no_speech_prob": 0.00048407033318653703}, {"id": 2058, "seek": 116760, "start": 7073.56, "end": 7075.8, "text": " property called parity, basically,", "tokens": [50662, 4707, 1219, 44747, 11, 1936, 11, 50774], "temperature": 0.0, "avg_logprob": -0.12536366357870982, "compression_ratio": 1.795221843003413, "no_speech_prob": 0.00048407033318653703}, {"id": 2059, "seek": 116760, "start": 7075.8, "end": 7079.76, "text": " if you see a sequence of zeros and ones", "tokens": [50774, 498, 291, 536, 257, 8310, 295, 35193, 293, 2306, 50972], "temperature": 0.0, "avg_logprob": -0.12536366357870982, "compression_ratio": 1.795221843003413, "no_speech_prob": 0.00048407033318653703}, {"id": 2060, "seek": 116760, "start": 7079.76, "end": 7082.12, "text": " that you feed to the model, and you ask the model,", "tokens": [50972, 300, 291, 3154, 281, 264, 2316, 11, 293, 291, 1029, 264, 2316, 11, 51090], "temperature": 0.0, "avg_logprob": -0.12536366357870982, "compression_ratio": 1.795221843003413, "no_speech_prob": 0.00048407033318653703}, {"id": 2061, "seek": 116760, "start": 7082.12, "end": 7084.4, "text": " hey, if you add up all these numbers,", "tokens": [51090, 4177, 11, 498, 291, 909, 493, 439, 613, 3547, 11, 51204], "temperature": 0.0, "avg_logprob": -0.12536366357870982, "compression_ratio": 1.795221843003413, "no_speech_prob": 0.00048407033318653703}, {"id": 2062, "seek": 116760, "start": 7084.4, "end": 7086.0, "text": " are they even or odd?", "tokens": [51204, 366, 436, 754, 420, 7401, 30, 51284], "temperature": 0.0, "avg_logprob": -0.12536366357870982, "compression_ratio": 1.795221843003413, "no_speech_prob": 0.00048407033318653703}, {"id": 2063, "seek": 116760, "start": 7086.0, "end": 7088.88, "text": " Mamba 2 would fail, because it wouldn't be able to,", "tokens": [51284, 376, 23337, 568, 576, 3061, 11, 570, 309, 2759, 380, 312, 1075, 281, 11, 51428], "temperature": 0.0, "avg_logprob": -0.12536366357870982, "compression_ratio": 1.795221843003413, "no_speech_prob": 0.00048407033318653703}, {"id": 2064, "seek": 116760, "start": 7088.88, "end": 7091.0, "text": " basically, do this rotation operation", "tokens": [51428, 1936, 11, 360, 341, 12447, 6916, 51534], "temperature": 0.0, "avg_logprob": -0.12536366357870982, "compression_ratio": 1.795221843003413, "no_speech_prob": 0.00048407033318653703}, {"id": 2065, "seek": 0, "start": 7091.0, "end": 7094.6, "text": " that's required to flip the parity as you count because really that's all you're going", "tokens": [50914, 300, 311, 4739, 281, 7929, 264, 44747, 382, 291, 1207, 570, 534, 300, 311, 439, 291, 434, 516, 51094], "temperature": 0.0, "avg_logprob": -0.19746015845118342, "compression_ratio": 1.7666666666666666, "no_speech_prob": 0.1461373120546341}, {"id": 2066, "seek": 0, "start": 7094.6, "end": 7097.76, "text": " to do when you're trying to figure out if the number is even or odd and you're going", "tokens": [51094, 281, 360, 562, 291, 434, 1382, 281, 2573, 484, 498, 264, 1230, 307, 754, 420, 7401, 293, 291, 434, 516, 51252], "temperature": 0.0, "avg_logprob": -0.19746015845118342, "compression_ratio": 1.7666666666666666, "no_speech_prob": 0.1461373120546341}, {"id": 2067, "seek": 0, "start": 7097.76, "end": 7102.52, "text": " to go, okay, well, like, you know, as I go, I flip every time I see a one and a zero doesn't", "tokens": [51252, 281, 352, 11, 1392, 11, 731, 11, 411, 11, 291, 458, 11, 382, 286, 352, 11, 286, 7929, 633, 565, 286, 536, 257, 472, 293, 257, 4018, 1177, 380, 51490], "temperature": 0.0, "avg_logprob": -0.19746015845118342, "compression_ratio": 1.7666666666666666, "no_speech_prob": 0.1461373120546341}, {"id": 2068, "seek": 0, "start": 7102.52, "end": 7103.52, "text": " do anything.", "tokens": [51490, 360, 1340, 13, 51540], "temperature": 0.0, "avg_logprob": -0.19746015845118342, "compression_ratio": 1.7666666666666666, "no_speech_prob": 0.1461373120546341}, {"id": 2069, "seek": 0, "start": 7103.52, "end": 7106.52, "text": " Anyway, I'm going to just say the details don't matter.", "tokens": [51540, 5684, 11, 286, 478, 516, 281, 445, 584, 264, 4365, 500, 380, 1871, 13, 51690], "temperature": 0.0, "avg_logprob": -0.19746015845118342, "compression_ratio": 1.7666666666666666, "no_speech_prob": 0.1461373120546341}, {"id": 2070, "seek": 2652, "start": 7106.52, "end": 7113.12, "text": " You can hopefully see this is a mathematically very interesting and elegant paper consistent", "tokens": [50364, 509, 393, 4696, 536, 341, 307, 257, 44003, 588, 1880, 293, 21117, 3035, 8398, 50694], "temperature": 0.0, "avg_logprob": -0.18299750190588734, "compression_ratio": 1.61875, "no_speech_prob": 0.009302576072514057}, {"id": 2071, "seek": 2652, "start": 7113.12, "end": 7117.44, "text": " with previous iterations of Mamba, but a much more principled one and the results are really", "tokens": [50694, 365, 3894, 36540, 295, 376, 23337, 11, 457, 257, 709, 544, 3681, 15551, 472, 293, 264, 3542, 366, 534, 50910], "temperature": 0.0, "avg_logprob": -0.18299750190588734, "compression_ratio": 1.61875, "no_speech_prob": 0.009302576072514057}, {"id": 2072, "seek": 2652, "start": 7117.44, "end": 7118.44, "text": " impressive.", "tokens": [50910, 8992, 13, 50960], "temperature": 0.0, "avg_logprob": -0.18299750190588734, "compression_ratio": 1.61875, "no_speech_prob": 0.009302576072514057}, {"id": 2073, "seek": 2652, "start": 7118.44, "end": 7122.56, "text": " It beats transformers by over two points on average on downstream accuracy across a whole", "tokens": [50960, 467, 16447, 4088, 433, 538, 670, 732, 2793, 322, 4274, 322, 30621, 14170, 2108, 257, 1379, 51166], "temperature": 0.0, "avg_logprob": -0.18299750190588734, "compression_ratio": 1.61875, "no_speech_prob": 0.009302576072514057}, {"id": 2074, "seek": 2652, "start": 7122.56, "end": 7124.36, "text": " bunch of benchmarks.", "tokens": [51166, 3840, 295, 43751, 13, 51256], "temperature": 0.0, "avg_logprob": -0.18299750190588734, "compression_ratio": 1.61875, "no_speech_prob": 0.009302576072514057}, {"id": 2075, "seek": 2652, "start": 7124.36, "end": 7128.56, "text": " They beat Mamba 2 by 1.9 points on those benchmarks and the same perplexity is Mamba", "tokens": [51256, 814, 4224, 376, 23337, 568, 538, 502, 13, 24, 2793, 322, 729, 43751, 293, 264, 912, 680, 18945, 507, 307, 376, 23337, 51466], "temperature": 0.0, "avg_logprob": -0.18299750190588734, "compression_ratio": 1.61875, "no_speech_prob": 0.009302576072514057}, {"id": 2076, "seek": 2652, "start": 7128.56, "end": 7131.32, "text": " 2 with half the state size, right?", "tokens": [51466, 568, 365, 1922, 264, 1785, 2744, 11, 558, 30, 51604], "temperature": 0.0, "avg_logprob": -0.18299750190588734, "compression_ratio": 1.61875, "no_speech_prob": 0.009302576072514057}, {"id": 2077, "seek": 2652, "start": 7131.32, "end": 7136.36, "text": " So way way faster, I mean, that means it's twice as fast at inference for like equivalent", "tokens": [51604, 407, 636, 636, 4663, 11, 286, 914, 11, 300, 1355, 309, 311, 6091, 382, 2370, 412, 38253, 337, 411, 10344, 51856], "temperature": 0.0, "avg_logprob": -0.18299750190588734, "compression_ratio": 1.61875, "no_speech_prob": 0.009302576072514057}, {"id": 2078, "seek": 5636, "start": 7137.28, "end": 7141.08, "text": " and it has this property of solving all these parity and modular arithmetic tasks that", "tokens": [50410, 293, 309, 575, 341, 4707, 295, 12606, 439, 613, 44747, 293, 31111, 42973, 9608, 300, 50600], "temperature": 0.0, "avg_logprob": -0.15120975022177094, "compression_ratio": 1.6133828996282529, "no_speech_prob": 0.013901620171964169}, {"id": 2079, "seek": 5636, "start": 7141.08, "end": 7145.28, "text": " we just talked about that may have been very poorly explained, but you kind of at a certain", "tokens": [50600, 321, 445, 2825, 466, 300, 815, 362, 668, 588, 22271, 8825, 11, 457, 291, 733, 295, 412, 257, 1629, 50810], "temperature": 0.0, "avg_logprob": -0.15120975022177094, "compression_ratio": 1.6133828996282529, "no_speech_prob": 0.013901620171964169}, {"id": 2080, "seek": 5636, "start": 7145.28, "end": 7151.28, "text": " point, it goes into, yeah, you just have to be happy with complex numbers and stuff.", "tokens": [50810, 935, 11, 309, 1709, 666, 11, 1338, 11, 291, 445, 362, 281, 312, 2055, 365, 3997, 3547, 293, 1507, 13, 51110], "temperature": 0.0, "avg_logprob": -0.15120975022177094, "compression_ratio": 1.6133828996282529, "no_speech_prob": 0.013901620171964169}, {"id": 2081, "seek": 5636, "start": 7151.28, "end": 7154.04, "text": " Bottom line is this is an interesting, interesting development.", "tokens": [51110, 38289, 1622, 307, 341, 307, 364, 1880, 11, 1880, 3250, 13, 51248], "temperature": 0.0, "avg_logprob": -0.15120975022177094, "compression_ratio": 1.6133828996282529, "no_speech_prob": 0.013901620171964169}, {"id": 2082, "seek": 5636, "start": 7154.04, "end": 7157.04, "text": " It does come with a sort of optimization.", "tokens": [51248, 467, 775, 808, 365, 257, 1333, 295, 19618, 13, 51398], "temperature": 0.0, "avg_logprob": -0.15120975022177094, "compression_ratio": 1.6133828996282529, "no_speech_prob": 0.013901620171964169}, {"id": 2083, "seek": 5636, "start": 7157.04, "end": 7161.92, "text": " So, you know, previously Mamba used single input, single output.", "tokens": [51398, 407, 11, 291, 458, 11, 8046, 376, 23337, 1143, 2167, 4846, 11, 2167, 5598, 13, 51642], "temperature": 0.0, "avg_logprob": -0.15120975022177094, "compression_ratio": 1.6133828996282529, "no_speech_prob": 0.013901620171964169}, {"id": 2084, "seek": 8192, "start": 7161.92, "end": 7169.72, "text": " There's also an optimization that Mamba 3 does called Mimo multi input, multi output.", "tokens": [50364, 821, 311, 611, 364, 19618, 300, 376, 23337, 805, 775, 1219, 376, 6934, 4825, 4846, 11, 4825, 5598, 13, 50754], "temperature": 0.0, "avg_logprob": -0.21910027046313232, "compression_ratio": 1.6422018348623852, "no_speech_prob": 0.004321310203522444}, {"id": 2085, "seek": 8192, "start": 7169.72, "end": 7177.6, "text": " This is basically an approach that helps you paralyze some of the work that the Mamba", "tokens": [50754, 639, 307, 1936, 364, 3109, 300, 3665, 291, 32645, 1381, 512, 295, 264, 589, 300, 264, 376, 23337, 51148], "temperature": 0.0, "avg_logprob": -0.21910027046313232, "compression_ratio": 1.6422018348623852, "no_speech_prob": 0.004321310203522444}, {"id": 2086, "seek": 8192, "start": 7177.6, "end": 7178.6, "text": " algorithm is going to do.", "tokens": [51148, 9284, 307, 516, 281, 360, 13, 51198], "temperature": 0.0, "avg_logprob": -0.21910027046313232, "compression_ratio": 1.6422018348623852, "no_speech_prob": 0.004321310203522444}, {"id": 2087, "seek": 8192, "start": 7178.6, "end": 7185.04, "text": " So standard Mamba uses the single input, single output approach where the state update,", "tokens": [51198, 407, 3832, 376, 23337, 4960, 264, 2167, 4846, 11, 2167, 5598, 3109, 689, 264, 1785, 5623, 11, 51520], "temperature": 0.0, "avg_logprob": -0.21910027046313232, "compression_ratio": 1.6422018348623852, "no_speech_prob": 0.004321310203522444}, {"id": 2088, "seek": 8192, "start": 7185.04, "end": 7189.24, "text": " well, it's updated kind of fairly inefficiently, hardware inefficiently.", "tokens": [51520, 731, 11, 309, 311, 10588, 733, 295, 6457, 43495, 356, 11, 8837, 43495, 356, 13, 51730], "temperature": 0.0, "avg_logprob": -0.21910027046313232, "compression_ratio": 1.6422018348623852, "no_speech_prob": 0.004321310203522444}, {"id": 2089, "seek": 10924, "start": 7189.24, "end": 7194.44, "text": " The GPU mostly sits idle during the decoding phase, whereas Mimo, this like multiple input,", "tokens": [50364, 440, 18407, 5240, 12696, 30650, 1830, 264, 979, 8616, 5574, 11, 9735, 376, 6934, 11, 341, 411, 3866, 4846, 11, 50624], "temperature": 0.0, "avg_logprob": -0.1773407437779882, "compression_ratio": 1.6890459363957597, "no_speech_prob": 0.03750726208090782}, {"id": 2090, "seek": 10924, "start": 7194.44, "end": 7196.52, "text": " multiple output, generalizes it.", "tokens": [50624, 3866, 5598, 11, 2674, 5660, 309, 13, 50728], "temperature": 0.0, "avg_logprob": -0.1773407437779882, "compression_ratio": 1.6890459363957597, "no_speech_prob": 0.03750726208090782}, {"id": 2091, "seek": 10924, "start": 7196.52, "end": 7202.12, "text": " So instead of like processing only one input and producing one output at a time, each", "tokens": [50728, 407, 2602, 295, 411, 9007, 787, 472, 4846, 293, 10501, 472, 5598, 412, 257, 565, 11, 1184, 51008], "temperature": 0.0, "avg_logprob": -0.1773407437779882, "compression_ratio": 1.6890459363957597, "no_speech_prob": 0.03750726208090782}, {"id": 2092, "seek": 10924, "start": 7202.12, "end": 7207.76, "text": " layer is going to process a bunch of inputs and a bunch of outputs simultaneously using", "tokens": [51008, 4583, 307, 516, 281, 1399, 257, 3840, 295, 15743, 293, 257, 3840, 295, 23930, 16561, 1228, 51290], "temperature": 0.0, "avg_logprob": -0.1773407437779882, "compression_ratio": 1.6890459363957597, "no_speech_prob": 0.03750726208090782}, {"id": 2093, "seek": 10924, "start": 7207.76, "end": 7210.36, "text": " matrix multiplication way more GPU friendly.", "tokens": [51290, 8141, 27290, 636, 544, 18407, 9208, 13, 51420], "temperature": 0.0, "avg_logprob": -0.1773407437779882, "compression_ratio": 1.6890459363957597, "no_speech_prob": 0.03750726208090782}, {"id": 2094, "seek": 10924, "start": 7210.36, "end": 7214.52, "text": " And the core thing here is it increases your GPU utilization.", "tokens": [51420, 400, 264, 4965, 551, 510, 307, 309, 8637, 428, 18407, 37074, 13, 51628], "temperature": 0.0, "avg_logprob": -0.1773407437779882, "compression_ratio": 1.6890459363957597, "no_speech_prob": 0.03750726208090782}, {"id": 2095, "seek": 10924, "start": 7214.52, "end": 7218.8, "text": " And you know, from a data center standpoint, that matters hugely, right?", "tokens": [51628, 400, 291, 458, 11, 490, 257, 1412, 3056, 15827, 11, 300, 7001, 27417, 11, 558, 30, 51842], "temperature": 0.0, "avg_logprob": -0.1773407437779882, "compression_ratio": 1.6890459363957597, "no_speech_prob": 0.03750726208090782}, {"id": 2096, "seek": 13880, "start": 7218.8, "end": 7222.28, "text": " Because you're basically all your GPUs are a fleet of workers.", "tokens": [50364, 1436, 291, 434, 1936, 439, 428, 18407, 82, 366, 257, 19396, 295, 5600, 13, 50538], "temperature": 0.0, "avg_logprob": -0.17859861627221107, "compression_ratio": 1.7435064935064934, "no_speech_prob": 0.016024839133024216}, {"id": 2097, "seek": 13880, "start": 7222.28, "end": 7226.52, "text": " And if you're not keeping your workers busy, it is literally the same thing from an", "tokens": [50538, 400, 498, 291, 434, 406, 5145, 428, 5600, 5856, 11, 309, 307, 3736, 264, 912, 551, 490, 364, 50750], "temperature": 0.0, "avg_logprob": -0.17859861627221107, "compression_ratio": 1.7435064935064934, "no_speech_prob": 0.016024839133024216}, {"id": 2098, "seek": 13880, "start": 7226.52, "end": 7232.52, "text": " op-ex standpoint as just like having a bunch of like employees at your company, like taking", "tokens": [50750, 999, 12, 3121, 15827, 382, 445, 411, 1419, 257, 3840, 295, 411, 6619, 412, 428, 2237, 11, 411, 1940, 51050], "temperature": 0.0, "avg_logprob": -0.17859861627221107, "compression_ratio": 1.7435064935064934, "no_speech_prob": 0.016024839133024216}, {"id": 2099, "seek": 13880, "start": 7232.52, "end": 7234.0, "text": " a coffee break all day.", "tokens": [51050, 257, 4982, 1821, 439, 786, 13, 51124], "temperature": 0.0, "avg_logprob": -0.17859861627221107, "compression_ratio": 1.7435064935064934, "no_speech_prob": 0.016024839133024216}, {"id": 2100, "seek": 13880, "start": 7234.0, "end": 7238.12, "text": " Like if they're not being utilized, then you're basically burning money like just by having", "tokens": [51124, 1743, 498, 436, 434, 406, 885, 28158, 11, 550, 291, 434, 1936, 9488, 1460, 411, 445, 538, 1419, 51330], "temperature": 0.0, "avg_logprob": -0.17859861627221107, "compression_ratio": 1.7435064935064934, "no_speech_prob": 0.016024839133024216}, {"id": 2101, "seek": 13880, "start": 7238.12, "end": 7239.12, "text": " them sit there.", "tokens": [51330, 552, 1394, 456, 13, 51380], "temperature": 0.0, "avg_logprob": -0.17859861627221107, "compression_ratio": 1.7435064935064934, "no_speech_prob": 0.016024839133024216}, {"id": 2102, "seek": 13880, "start": 7239.12, "end": 7243.2, "text": " And so the fact that they're able to bump up in this case up to four times more flops", "tokens": [51380, 400, 370, 264, 1186, 300, 436, 434, 1075, 281, 9961, 493, 294, 341, 1389, 493, 281, 1451, 1413, 544, 932, 3370, 51584], "temperature": 0.0, "avg_logprob": -0.17859861627221107, "compression_ratio": 1.7435064935064934, "no_speech_prob": 0.016024839133024216}, {"id": 2103, "seek": 13880, "start": 7243.2, "end": 7247.24, "text": " during decoding during inference with no meaningful increase in wall clock time.", "tokens": [51584, 1830, 979, 8616, 1830, 38253, 365, 572, 10995, 3488, 294, 2929, 7830, 565, 13, 51786], "temperature": 0.0, "avg_logprob": -0.17859861627221107, "compression_ratio": 1.7435064935064934, "no_speech_prob": 0.016024839133024216}, {"id": 2104, "seek": 16724, "start": 7247.24, "end": 7251.68, "text": " So this doesn't actually like delay increase latency, for example, for the user, and it", "tokens": [50364, 407, 341, 1177, 380, 767, 411, 8577, 3488, 27043, 11, 337, 1365, 11, 337, 264, 4195, 11, 293, 309, 50586], "temperature": 0.0, "avg_logprob": -0.2246688491354386, "compression_ratio": 1.7110389610389611, "no_speech_prob": 0.0023873846512287855}, {"id": 2105, "seek": 16724, "start": 7251.68, "end": 7253.12, "text": " also leads to better model quality.", "tokens": [50586, 611, 6689, 281, 1101, 2316, 3125, 13, 50658], "temperature": 0.0, "avg_logprob": -0.2246688491354386, "compression_ratio": 1.7110389610389611, "no_speech_prob": 0.0023873846512287855}, {"id": 2106, "seek": 16724, "start": 7253.12, "end": 7257.8, "text": " So this is an important development from an efficiency standpoint, from a cost of running", "tokens": [50658, 407, 341, 307, 364, 1021, 3250, 490, 364, 10493, 15827, 11, 490, 257, 2063, 295, 2614, 50892], "temperature": 0.0, "avg_logprob": -0.2246688491354386, "compression_ratio": 1.7110389610389611, "no_speech_prob": 0.0023873846512287855}, {"id": 2107, "seek": 16724, "start": 7257.8, "end": 7259.84, "text": " this model standpoint as well.", "tokens": [50892, 341, 2316, 15827, 382, 731, 13, 50994], "temperature": 0.0, "avg_logprob": -0.2246688491354386, "compression_ratio": 1.7110389610389611, "no_speech_prob": 0.0023873846512287855}, {"id": 2108, "seek": 16724, "start": 7259.84, "end": 7265.04, "text": " So you know, we're seeing tons of hybrid models right now popping up with Mamba 2 and transform", "tokens": [50994, 407, 291, 458, 11, 321, 434, 2577, 9131, 295, 13051, 5245, 558, 586, 18374, 493, 365, 376, 23337, 568, 293, 4088, 51254], "temperature": 0.0, "avg_logprob": -0.2246688491354386, "compression_ratio": 1.7110389610389611, "no_speech_prob": 0.0023873846512287855}, {"id": 2109, "seek": 16724, "start": 7265.04, "end": 7268.76, "text": " our architectures typically merged together and a whole bunch of variants, you know, sometimes", "tokens": [51254, 527, 6331, 1303, 5850, 36427, 1214, 293, 257, 1379, 3840, 295, 21669, 11, 291, 458, 11, 2171, 51440], "temperature": 0.0, "avg_logprob": -0.2246688491354386, "compression_ratio": 1.7110389610389611, "no_speech_prob": 0.0023873846512287855}, {"id": 2110, "seek": 16724, "start": 7268.76, "end": 7273.12, "text": " you've got like like Mamba and and attention heads in the same layer, sometimes alternating", "tokens": [51440, 291, 600, 658, 411, 411, 376, 23337, 293, 293, 3202, 8050, 294, 264, 912, 4583, 11, 2171, 40062, 51658], "temperature": 0.0, "avg_logprob": -0.2246688491354386, "compression_ratio": 1.7110389610389611, "no_speech_prob": 0.0023873846512287855}, {"id": 2111, "seek": 19312, "start": 7273.12, "end": 7277.92, "text": " Mamba attention, Mamba attention, all kinds of variants, you know, expect Mamba 3 to start", "tokens": [50364, 376, 23337, 3202, 11, 376, 23337, 3202, 11, 439, 3685, 295, 21669, 11, 291, 458, 11, 2066, 376, 23337, 805, 281, 722, 50604], "temperature": 0.0, "avg_logprob": -0.21475483803716425, "compression_ratio": 1.6341463414634145, "no_speech_prob": 0.017840342596173286}, {"id": 2112, "seek": 19312, "start": 7277.92, "end": 7280.88, "text": " getting slotted in in that whole mix.", "tokens": [50604, 1242, 1061, 11252, 294, 294, 300, 1379, 2890, 13, 50752], "temperature": 0.0, "avg_logprob": -0.21475483803716425, "compression_ratio": 1.6341463414634145, "no_speech_prob": 0.017840342596173286}, {"id": 2113, "seek": 19312, "start": 7280.88, "end": 7285.48, "text": " I mean, this is a really interesting development with some important new efficiency gains for", "tokens": [50752, 286, 914, 11, 341, 307, 257, 534, 1880, 3250, 365, 512, 1021, 777, 10493, 16823, 337, 50982], "temperature": 0.0, "avg_logprob": -0.21475483803716425, "compression_ratio": 1.6341463414634145, "no_speech_prob": 0.017840342596173286}, {"id": 2114, "seek": 19312, "start": 7285.48, "end": 7286.96, "text": " anybody who wants to run it.", "tokens": [50982, 4472, 567, 2738, 281, 1190, 309, 13, 51056], "temperature": 0.0, "avg_logprob": -0.21475483803716425, "compression_ratio": 1.6341463414634145, "no_speech_prob": 0.017840342596173286}, {"id": 2115, "seek": 19312, "start": 7286.96, "end": 7291.24, "text": " I would expect that this will start to get taken up pretty quickly and worth keeping an", "tokens": [51056, 286, 576, 2066, 300, 341, 486, 722, 281, 483, 2726, 493, 1238, 2661, 293, 3163, 5145, 364, 51270], "temperature": 0.0, "avg_logprob": -0.21475483803716425, "compression_ratio": 1.6341463414634145, "no_speech_prob": 0.017840342596173286}, {"id": 2116, "seek": 19312, "start": 7291.24, "end": 7292.24, "text": " eye on.", "tokens": [51270, 3313, 322, 13, 51320], "temperature": 0.0, "avg_logprob": -0.21475483803716425, "compression_ratio": 1.6341463414634145, "no_speech_prob": 0.017840342596173286}, {"id": 2117, "seek": 19312, "start": 7292.24, "end": 7293.24, "text": " So there we have it.", "tokens": [51320, 407, 456, 321, 362, 309, 13, 51370], "temperature": 0.0, "avg_logprob": -0.21475483803716425, "compression_ratio": 1.6341463414634145, "no_speech_prob": 0.017840342596173286}, {"id": 2118, "seek": 19312, "start": 7293.24, "end": 7295.8, "text": " That's the last of the two papers I wanted to cover.", "tokens": [51370, 663, 311, 264, 1036, 295, 264, 732, 10577, 286, 1415, 281, 2060, 13, 51498], "temperature": 0.0, "avg_logprob": -0.21475483803716425, "compression_ratio": 1.6341463414634145, "no_speech_prob": 0.017840342596173286}, {"id": 2119, "seek": 19312, "start": 7295.8, "end": 7297.48, "text": " Hopefully I haven't boredied a tears.", "tokens": [51498, 10429, 286, 2378, 380, 13521, 1091, 257, 10462, 13, 51582], "temperature": 0.0, "avg_logprob": -0.21475483803716425, "compression_ratio": 1.6341463414634145, "no_speech_prob": 0.017840342596173286}, {"id": 2120, "seek": 19312, "start": 7297.48, "end": 7299.12, "text": " It was pretty damn technical.", "tokens": [51582, 467, 390, 1238, 8151, 6191, 13, 51664], "temperature": 0.0, "avg_logprob": -0.21475483803716425, "compression_ratio": 1.6341463414634145, "no_speech_prob": 0.017840342596173286}, {"id": 2121, "seek": 19312, "start": 7299.12, "end": 7302.48, "text": " So we'll let, I guess, let Andre take it away.", "tokens": [51664, 407, 321, 603, 718, 11, 286, 2041, 11, 718, 20667, 747, 309, 1314, 13, 51832], "temperature": 0.0, "avg_logprob": -0.21475483803716425, "compression_ratio": 1.6341463414634145, "no_speech_prob": 0.017840342596173286}, {"id": 2122, "seek": 22248, "start": 7302.48, "end": 7306.48, "text": " Thank you so much for listening to this week's episode of last week in AI.", "tokens": [50364, 1044, 291, 370, 709, 337, 4764, 281, 341, 1243, 311, 3500, 295, 1036, 1243, 294, 7318, 13, 50564], "temperature": 0.0, "avg_logprob": -0.25424315340547676, "compression_ratio": 1.527027027027027, "no_speech_prob": 0.12824097275733948}, {"id": 2123, "seek": 22248, "start": 7306.48, "end": 7311.04, "text": " You can find the articles we discussed here today and subscribe to the newsletter at last", "tokens": [50564, 509, 393, 915, 264, 11290, 321, 7152, 510, 965, 293, 3022, 281, 264, 26469, 412, 1036, 50792], "temperature": 0.0, "avg_logprob": -0.25424315340547676, "compression_ratio": 1.527027027027027, "no_speech_prob": 0.12824097275733948}, {"id": 2124, "seek": 22248, "start": 7311.04, "end": 7313.16, "text": " week in that AI.", "tokens": [50792, 1243, 294, 300, 7318, 13, 50898], "temperature": 0.0, "avg_logprob": -0.25424315340547676, "compression_ratio": 1.527027027027027, "no_speech_prob": 0.12824097275733948}, {"id": 2125, "seek": 22248, "start": 7313.16, "end": 7318.96, "text": " We always appreciate you commenting or viewing us on Apple podcasts, share it with your", "tokens": [50898, 492, 1009, 4449, 291, 29590, 420, 17480, 505, 322, 6373, 24045, 11, 2073, 309, 365, 428, 51188], "temperature": 0.0, "avg_logprob": -0.25424315340547676, "compression_ratio": 1.527027027027027, "no_speech_prob": 0.12824097275733948}, {"id": 2126, "seek": 22248, "start": 7318.96, "end": 7320.28, "text": " friends.", "tokens": [51188, 1855, 13, 51254], "temperature": 0.0, "avg_logprob": -0.25424315340547676, "compression_ratio": 1.527027027027027, "no_speech_prob": 0.12824097275733948}, {"id": 2127, "seek": 22248, "start": 7320.28, "end": 7324.68, "text": " But more of anything, please do keep tuning in week to week.", "tokens": [51254, 583, 544, 295, 1340, 11, 1767, 360, 1066, 15164, 294, 1243, 281, 1243, 13, 51474], "temperature": 0.0, "avg_logprob": -0.25424315340547676, "compression_ratio": 1.527027027027027, "no_speech_prob": 0.12824097275733948}, {"id": 2128, "seek": 25248, "start": 7332.48, "end": 7352.12, "text": " Do you need, do you need, when the AI begins, begins, begins, it's time to break.", "tokens": [50364, 1144, 291, 643, 11, 360, 291, 643, 11, 562, 264, 7318, 7338, 11, 7338, 11, 7338, 11, 309, 311, 565, 281, 1821, 13, 51346], "temperature": 0.0, "avg_logprob": -0.6759747036716395, "compression_ratio": 1.4330708661417322, "no_speech_prob": 0.8241891860961914}, {"id": 2129, "seek": 25248, "start": 7352.12, "end": 7353.12, "text": " Break it down.", "tokens": [51346, 16925, 309, 760, 13, 51396], "temperature": 0.0, "avg_logprob": -0.6759747036716395, "compression_ratio": 1.4330708661417322, "no_speech_prob": 0.8241891860961914}, {"id": 2130, "seek": 25248, "start": 7353.12, "end": 7360.64, "text": " Last week in AI coming take a ride, get the load down on tech, and let it slide, last", "tokens": [51396, 5264, 1243, 294, 7318, 1348, 747, 257, 5077, 11, 483, 264, 3677, 760, 322, 7553, 11, 293, 718, 309, 4137, 11, 1036, 51772], "temperature": 0.0, "avg_logprob": -0.6759747036716395, "compression_ratio": 1.4330708661417322, "no_speech_prob": 0.8241891860961914}, {"id": 2131, "seek": 28064, "start": 7360.64, "end": 7367.2, "text": " week in AI coming take a ride, all the ads for the street, the ads reaching high,", "tokens": [50364, 1243, 294, 7318, 1348, 747, 257, 5077, 11, 439, 264, 10342, 337, 264, 4838, 11, 264, 10342, 9906, 1090, 11, 50692], "temperature": 0.6, "avg_logprob": -0.8603197019927356, "compression_ratio": 1.5517241379310345, "no_speech_prob": 0.7835240364074707}, {"id": 2132, "seek": 28064, "start": 7367.2, "end": 7373.52, "text": " glue attack emergent, purchase surgeon flight, from the ads to the streets, the ads reaching", "tokens": [50692, 8998, 2690, 4345, 6930, 11, 8110, 22913, 7018, 11, 490, 264, 10342, 281, 264, 8481, 11, 264, 10342, 9906, 51008], "temperature": 0.6, "avg_logprob": -0.8603197019927356, "compression_ratio": 1.5517241379310345, "no_speech_prob": 0.7835240364074707}, {"id": 2133, "seek": 28064, "start": 7373.52, "end": 7374.52, "text": " high.", "tokens": [51008, 1090, 13, 51058], "temperature": 0.6, "avg_logprob": -0.8603197019927356, "compression_ratio": 1.5517241379310345, "no_speech_prob": 0.7835240364074707}, {"id": 2134, "seek": 29452, "start": 7374.52, "end": 7380.68, "text": " with the shipment of the future fees, building up, building up your latest release,", "tokens": [50364, 365, 264, 49991, 295, 264, 2027, 13370, 11, 2390, 493, 11, 2390, 493, 428, 6792, 4374, 11, 50672], "temperature": 0.0, "avg_logprob": -0.8638331864742522, "compression_ratio": 1.4180327868852458, "no_speech_prob": 0.6912768483161926}, {"id": 2135, "seek": 29452, "start": 7380.68, "end": 7387.72, "text": " no clasps can pay eye comments that go right, get the low down on tech, and let it slide.", "tokens": [50672, 572, 596, 296, 1878, 393, 1689, 3313, 3053, 300, 352, 558, 11, 483, 264, 2295, 760, 322, 7553, 11, 293, 718, 309, 4137, 13, 51024], "temperature": 0.0, "avg_logprob": -0.8638331864742522, "compression_ratio": 1.4180327868852458, "no_speech_prob": 0.6912768483161926}, {"id": 2136, "seek": 30772, "start": 7387.72, "end": 7393.4, "text": " As we pay eye comments that go right, let it slide through the streets,", "tokens": [50364, 1018, 321, 1689, 3313, 3053, 300, 352, 558, 11, 718, 309, 4137, 807, 264, 8481, 11, 50648], "temperature": 0.0, "avg_logprob": -0.7570478672883949, "compression_ratio": 1.3153846153846154, "no_speech_prob": 0.38400211930274963}, {"id": 2137, "seek": 30772, "start": 7393.4, "end": 7403.4, "text": " ay, I've reached in high.", "tokens": [50648, 7494, 11, 286, 600, 6488, 294, 1090, 13, 51148], "temperature": 0.0, "avg_logprob": -0.7570478672883949, "compression_ratio": 1.3153846153846154, "no_speech_prob": 0.38400211930274963}, {"id": 2138, "seek": 30772, "start": 7408.68, "end": 7414.12, "text": " From the drone that's to robot, the headlines pop, made in driven dreams,", "tokens": [51412, 3358, 264, 13852, 300, 311, 281, 7881, 11, 264, 23867, 1665, 11, 1027, 294, 9555, 7505, 11, 51684], "temperature": 0.0, "avg_logprob": -0.7570478672883949, "compression_ratio": 1.3153846153846154, "no_speech_prob": 0.38400211930274963}, {"id": 2139, "seek": 33412, "start": 7414.12, "end": 7420.76, "text": " they just don't stop, every breakthrough, every code unwritten, on the edge of change,", "tokens": [50364, 436, 445, 500, 380, 1590, 11, 633, 22397, 11, 633, 3089, 517, 26859, 11, 322, 264, 4691, 295, 1319, 11, 50696], "temperature": 0.0, "avg_logprob": -0.3379925489425659, "compression_ratio": 1.4428571428571428, "no_speech_prob": 0.18082870543003082}, {"id": 2140, "seek": 33412, "start": 7420.76, "end": 7425.8, "text": " we're excited, we're smitten, from machine learning marvels to coding kings,", "tokens": [50696, 321, 434, 2919, 11, 321, 434, 899, 2987, 11, 490, 3479, 2539, 23893, 82, 281, 17720, 21581, 11, 50948], "temperature": 0.0, "avg_logprob": -0.3379925489425659, "compression_ratio": 1.4428571428571428, "no_speech_prob": 0.18082870543003082}, {"id": 2141, "seek": 33412, "start": 7426.68, "end": 7435.48, "text": " futures unfolding, see what it brings.", "tokens": [50992, 26071, 44586, 11, 536, 437, 309, 5607, 13, 51432], "temperature": 0.0, "avg_logprob": -0.3379925489425659, "compression_ratio": 1.4428571428571428, "no_speech_prob": 0.18082870543003082}], "language": "en"}