{"text": " And now to thank a sponsor I'm personally a fan of, Factor. Since I've went to grad school and now still as a meta startup once I get home in the evening I often don't have the energy to cook and still want to be healthy and so Factor was a real nice find for me. Factor is pretty easy to heat nutrition goals without full planning, grocery runs or cooking that would be kind of hard to manage when you don't have the energy for it. And it really makes it easy to hit specific goals with respect to nutrition which could be weight loss, it could be overall nutrition, more protein, GIP1 support. In the past I've used it as both low carb diet and also for protein when I wanted to gain some muscle. I've eaten hundreds of these meals and I think it's fair to say that these are crafted with good ingredients, lean proteins, colorful veggies, whole foods, there's no artificial colors, no artificial sweeteners, none of that really bad fast food stuff. And all of that while being really quite tasty and having tons of options to choose from. So I do personally recommend it, you can head to FactorMills.com slash LWAI50 off and use code LWAI50 off to get 50% off and free daily greens per box with new subscription only while supplies last until September 27, 2026, see website for more details. And once again, I want to thank box for sponsoring last week an AI. If you try to transform your organization or AI, you're likely facing a common challenge. Mostly our tools are great at public knowledge but they don't actually know your business, your product road maps, your sales materials, your HR policies, the content that actually makes your company run. And that's where box comes in. Box is building the intelligent content measurement platform for the AI era. So everything is to secure essential context layer for box AI agents to access for unique institutional knowledge that makes a company run. And that's a key idea, but power of AI doesn't come from a model alone. It comes from giving AI access to the right enterprise content. And that's what box does. It goes beyond file storage by connecting content to people, apps and AI agents so teams can turn information into action. If tools like box agent, box extract box hubs and more organizations can accelerate knowledge work pool intelligence from unstructured content and animate workflows. So if you're thinking seriously about your company's AI transformation, think beyond the model. Your business lives in your content and box helps you bring that content securely into AI era. Learn more at box.com slash AI. Smokey the bars. Smokey the bars. Smokey the bars. Smokey the bars. Smokey the bars. Smokey the bars. Smokey the bars. Remember, please be careful. It's the least that you can do. Smokey the bars. Smokey the bars. Don't play with matches. Don't play with fire. After 80 years of learning his wildfire prevention tips, Smokey Bear lives within a song. Learn more at smokeybear.com and remember, only you can prevent wildfires. Brought to you by the USDA Forest Service, your State Forster and the Ad Council. Hello and welcome to our last week in AI podcast where you can yet chat about what's going on with AI as usual and receps will be will summarize and discuss some of last week's most interesting AI news. Also some of the previous last weeks and news we unfortunately did skip another week. This time it was my fault. It was my birthday last week and I was traveling. So I decided to be lazy and not do a podcast. Yeah. Yeah. Well, you know, it happens. People have birthdays and sometimes you celebrate them. But your God list is always healing. I think but yeah, 30 free is a big age. Yeah, it's treacherous. So it's not every year you hit the same two digits in your. Yeah. Yeah. I am as always one of your coasts, Andrew Karenkov. I studied AI grad school and now work at the AI startup Astrocade and I'm your other regular goes, Jeremy Harris. Yeah. Glad to see AI, AI national security, all that good stuff. Man there is so, so much. It's so, so much. You know, sometimes we miss a week and we're like, ah, you know what? It's not that bad because things haven't gone insane. We miss a really big week and then the week after was really big. And so now, man, we got our work cut out this week. I don't even know how to begin with this one. But it's big in a kind of different way. We had a year where we're a lot of, you know, model launches and AI progress and it hasn't been that kind of week. It's been more of a bunch of stories of policy and business and kind of these more inside baseball AI things, I guess you could say. So if you're into that sort of news, this will be a pre dense episode, perhaps. So we'll go ahead and jump straight in in tools and apps and be a starting with a story that just broke yesterday on fronk is launching project glass swing, a cybersecurity initiative partnering of major companies, including a whole bunch of names. And this is backed by project mythos, which is the tool side of it. So they have this cloud mythos preview, notably not cloud opus. They decided to give a new name to this cloud model, which we haven't done in forever. The gist is this model appears to be so good that they are not launching it to any sort of free use kind of place. It's so good that it's able to get as what are called zero day vulnerabilities, meaning that these are undisclosed unknown vulnerabilities in software. And if you were to me shit on the world, this would be a hacking machine that would like destroy software. So they have a bunch of benchmarks. As you might expect, it does better just all around by pretty large margins against against opus for six on reasoning, science coding, et cetera, et cetera. But the one they highlight is the cybersecurity angle where for instance in Firefox, they have some of the region showing their ability to find and exploit different potential vulnerabilities. So already was fairly capable and we know this from before also GP5 is already somewhat capable, but mythos just blows it out of the water. So in this specific evaluation that for all of the did opus for six was able to find finding something that might be bad in 14% of trials versus mythos in 72% of trials was able to successfully exploit something. And beyond that in 80, like 83, 84% was able to exploit or find a vulnerability. So massive, massive leap in terms of what it's capable of, presumably enabled by just better agent execution, not necessarily just raw intelligence of a part of it. But as we know these companies are post-training more and more for agentic capabilities. They have a ton of data from cloud code and other sources of real world software engineering. So it seems to be at the point at these anthropic things where you can't just release it or hackers will have a field day. And so they have this cooperative program, I suppose, to initially at least only provide it to partners to try and avoid this kind of hacking nightmare. Yeah, and the the exploit that it did find by the way, I mean, this doesn't seem to be a matter of opinion. It is just they found these critical exploits across every browser across every operating system. Like these are ways you can take over people's people's programs and gain higher level access credentials and do all the things that you don't want people to be able to do in a fully automated way. They emphasize that like fully automated. This is not, you know, a case where you have a human steering at intermediate stages. As we've seen in the past with some of these frameworks. It is fully autonomous. This is by the way, so because of the cyber capabilities, you might be tempted to think, oh, well, surely this is a sort of like code fine tune model. Like really, this is a specialist model. It is not right. So anthropic is very explicit. It is a general purpose model. That's why we're seeing capabilities increase across the spectrum of seaburn capabilities, 10 bioreological nuclear in addition to cyber. So there's a whole bunch of stuff here. Really, when you go through their exhaustive like 250 page report that, I mean, it's pretty, it's pretty remarkable. I will say what we don't have here is details about the agentic orchestration framework, the model architecture behind this number of parameters. There's this rumor going around that it could be, you know, a 10 trillion parameter model, all the stuff. But we haven't actually had that confirmed. I saw some, some weird tweet that I think Gary Tan retweeted this tweet on X that was talking about a $10 billion compute budget. I haven't seen that actually validated it anywhere. So like there's a lot of rumor mill stuff going on here. So maybe be careful with what you consume on this. Though I will say $10 billion might be slightly ahead of trend for where we are right now, but not by that much, not by that much, but by Dario's own admission or statements, you know, just last year. So that wouldn't be shocking, but still we haven't had that confirmed. We may well be in the billion dollar plus pre training and training budget to territory now though. So yeah, on of these benchmarks, right? And we will hit the cyber stuff we have to in the autonomy things, but just to start with like virology and biology benchmarks, one of the key ones that they use is this virology protocol uplift trial. Basically, you take a bunch of PhD level biologists who don't specifically have expertise in bioweapons and you say, Hey, you have 16 hours to make an end and virus recovery protocol. Basically make this this virus replicate it or get your hands on it. And then they're going to use this complicated rubric to grade it. And then the key metric they track there is in the final result, how many critical mistakes were made that would have any one of them would have prevented you from successfully recovering the virus, right? So if you get down to zero, that means actually you were able to fully recover the virus and that's really, really bad. And inthropic internally treats anything below 1.8 of these so called critical failures as this key capability threshold that matters for their own internal protocols. So for context, if you have a bunch of PhD level biologists using only the internet, they hit on average 5.6 critical failures trying to get all the way through with assistance from quad opus 4.6. You hit 6.6 with quad mythos, you get 4.3. And then the best single mythos preview protocol that was produced, so the best run out of all the runs on average, they're hitting 4.3 mistakes, but the best run hit two, which was basically the best they've ever seen. So we're still not cracking all the way through obviously, but for a fully automated system, you're literally just two mistakes away from being able to recover a freaking bio weapon like that, that's, you know, that's a hell of a thing. A whole bunch of other results in that direction, but fundamentally that is the story on biology. You're not going to see any particular smoking gun that says this thing is a bio weapon or a bio weapon generation model and we should all freak out, but it is very concerningly on trend, let's say. Moving on to the loss of control side, now we start to shade into the cyber piece. There's a story. So Sam Bowman from Anthropic told the story of being, I guess he was eating lunch or something in the park and he gets a message from his agent saying basically, hey, just let you know, I did X, Y and Z and he's like, wait, that's that agent's not supposed to have internet access and sure enough, it had cracked out of its box, so to speak, and use the multi-step exploit to gain broad internet access and basically get in touch with them. So you're actually saying, I mean, this is an example of a low stakes loss of control situation that is fully organic. Like this way it was not asked to do this. This was like truly in the wild. And so quite remarkable, a whole bunch of cases, by the way, of these models trying to cover their tracks, they highlight three documented instances of that happening. It's worth noting, this is like one 100,000th of the interactions that they actually go through here or less than that. But still three of these incidents were like in one case, the model accidentally gets this prohibited answer using a technique that it wasn't allowed to use and then it went, oh, shit, I got to mature my submitted answer is not to accurate because otherwise I'm going to be detected, right? So a whole bunch of cases like that where we sort of see a strategic manipulation and avoiding suspicion. And by the way, when you actually use SAE, so spars auto encoders and other interpretability techniques to look at the activation patterns in the model when that's happening, what they find is sure enough, the activation patterns show the model kind of firing for SAE features associated with concealment and strategic manipulation and avoiding suspicion, deception and so on. So that suggests the models aware actually that those actions were deceptive, even when it's outputs kind of left things a little ambiguous. So there's a whole bunch of stuff. You know, you can go on and on. This is a very, very rich document. But the fundamental here is, in a sense, we've crossed the Rubicon. I mean, there is a like a wild set of very impressive cyber capabilities, offensive cyber capabilities in particular, the offensive piece here is crucial, especially given that inthropic, really has been cut out of access to the Department of War through this. Well, I mean, there's an injunction now that's reversed that, but there's a friction with the Department of War, which I think is starting to look like terrible judgment on behalf of the administration. I mean, this is a, if this is correct, directionally, then inthropic is sitting on the single best offensive cyber weapon, autonomous offensive cyber weapon ever devised in human history. And they may build and compound on that advantage. If the administration is going to be positioning itself adversarially with respect to this an American company, damn, I mean, that's a, that's a really interesting position for them to be in and I don't know that it's a great look. Yeah. So a lot to say on this, I click now on what do we do know about the model itself, which is very little aside from benchmarks, they do say that it's going to be about five times as expensive as the current opus release. So way, like $25 per million token input, $125 per million token output, very expensive. I think the most expensive model you can use out there. So that does hint at a much larger model than opus or sonnet. Other things that we're noting here, they in the in the post actually save at 99% of the vulnerabilities found were not patched. So they just can't actually tell us what they are because they are currently being patched. So they only have a couple of examples. One of them, a couple of them are older patches or older vulnerabilities. So as you might expect, a lot of these vulnerabilities just have been there for a while and just now being discovered. And it reminds me actually I saw a post on Twitter from one of them, a Tainers of Linux or something like Linux saying that they've started seeing more and more kind of real, substantive issues come in. And in some ways, it could be good because we are actually going to go through and find all the vulnerabilities that just have been there hidden in plain sight. And perhaps as an attacker, you could already use opus or something with much more sophisticated harness to find these. They do detail a little bit how they set up this exercise. They have this harness that they have discussed before. And they have a little container that they launch and they give it a very curt, like one paragraph instruction to just find vulnerabilities. So they don't limit it or give it guard rules or whatever. They just like to go wild and try and hack this. And so it's interesting to think through like when will they be able to make the call to release this more widely? Are they going to have to right now they have this trusted partner research review where they're working with Vidya and Cisco and all these other big companies? Will that be how access to this level of model be used from now on? Where you have to be like applying and getting permission to get access to a model VNAPI? That is given the level of certification here as you said, not just on the software side, but also on the biocide. Like this is a new realm of capabilities where the safety side is getting very real. And the kinds of tactics necessary, monitoring may not be sufficient anymore. So very interesting development kind of for the history of AI. And I wouldn't expect this to go widely available for, you know, presumably months given the findings they have disclosed. Yeah, the big question at your point, it's also a new development in the history of cyber security, right? Everything is AI as AI is the world. Once it was set of software now it's being set of AI and I think rightly so. In this case, there's this big question we're going to have to answer for ourselves as civilization. And that has to do with the offense defense balance in cyber, right? Is it the case that a more powerful model, just in general, more powerfully AI models being broadly available? Does that lead to a disproportionate advantage for cyber attackers or for cyber defenders? And for a really long time, the argument was that you really couldn't know. And this is a, I remember having a lot of like kind of half-drunk arguments with a lot of people about this three, four, five years ago. I think it's largely unchanged from what it was back then. I just think the attack surface is so big. One way you can think of this is it's compute on compute warfare, right? So you have a certain amount of inference compute that you can afford to spend perusing your code base and securing it as well as you can. An attacker has a certain amount of compute they can afford to peruse your code base or whatever external surfaces they can access to find vulnerabilities. There's going to be very roughly, and this is going to be wrong in a whole bunch of, you know, specific ways, but very roughly you're trading off differently leveraged pots of compute and, you know, maybe you have a two to one leverage advantage or whatever, but ultimately if you're defending, you have a huge attack surface. And if you're attacking, you can kind of march divided and fight concentrated, like you can constrict all your efforts on just like one tiny component that, you know, maybe the defender has not been able to invest as much inference time it computed into securing. So I don't know, but this is certainly one way this could go. A way anthropic is trying to help the defensive side here is, as you say, by delaying the broader release of this tool. So hopefully people are going to run around and patch as much as they can. This is part of the challenge, right? It's like, what does it actually mean for anthropic to be holding on to this model? Who actually has access to it? We argued in that report like a year or a year and a half ago that it's a leaky bucket situation for whole host of reasons, you know, if that remains true, then you can do the math. I mean, it may well be the case that this model has in some sense proliferated, or it may not, but anyway, all kinds of considerations in the mix here. This is, I think the most important story of the last two weeks, and it just dropped into our lap yesterday. I want to say yesterday. Well, ironically, actually, like two weeks ago, the existence of this model on different projects, under the term mythos was leaked. So the blog posts on anthropic websites were accidentally left kind of publicly accessible via some sort of caching thing. So if I was even to hack, it was like basically someone messed up a little bit, and if you were digging around, you could find these draft blog posts that alluded to mythos described it as they advanced. Also, there was something about an AM model called Capibara. Unclear favor, like deciding between mythos and Capibara. Either way, these are described as kind of the next step beyond opus, which are bigger. Another interesting angle of this is we haven't seen bigger models that we have been aware of for a while. The last time was GPT. I forget what was the massive model that openly, I think 4.5, they launched it and they kind of killed it. They, because it was a very, they expensive model. I believe it was very charging $125 or something like that. At the time, people basically were thinking, this is the 10 billion parameter model, whatever, it was sort of positioned as, oh, this is so smart, it has this flavor of being smart. But in practice, it didn't seem like it was capable of much more than at the time smaller models, like 1 billion, 2 billion parameter models. So this is a return seemingly to being able to scale up a parameter count effectively. And I'm sure it's driven by many things, including additional data from Cloud Code and V-Sings that aren't searchable via the web. And beyond that, also the progress in reinforcement learning that we've been seeing. Alrighty, well, moving on to let's say lower impact news. Next up, you've got Google and they have an update to Gemini Live, they're releasing Gemini 3.1 Flash Live, which is their audio and voice model. So this allows you to talk to AI. It's kind of a real-time chat. And it's a pretty big jump over the predecessor, which was 2.5 Flash native audio. This has low latency, better recognition of speech, et cetera, et cetera. It has over 90 languages supported for real-time, multi-modal conversation. And this is notable, I think, because compared to just LLMs, the ability to do this kind of real-time, conversational AI is not something where you have as many options to go with. So if you want to build a chatbot where you can talk to it, that's harder for you when it is for OpenAI or Google. With a very powerful API for this, we could see more players out there building out this interface of voice into AI, which has seemed to become more of a norm. I still don't do it, but my impression is talking to AI is going to become more and more normal. And this will be one of the drivers of it, like having an easy way to build that for whatever application you have in mind. Yeah, it's also one of the big structural advantages that Google has is they've kind of maintained their lead on multi-modality. I mean, alongside OpenAI, this is really one of the areas that Google started to differentiate itself. The starting is far back as, oh God, what was it got it, right? Like, multi-modality has been their big play, this idea of positive transfer. And so not surprising that they're at the gate leading yet again on especially the API side of things that is going to be, if you're going to build using these modalities, like this is looking like a pretty strong default option right now. So yeah, we're really interesting move and we'll see if they can maintain that lead too. Because other labs will be pushing that direction. At a certain point, you're going to see a land grab and everybody's bleeding into each other's domains. Next up, another sort of low impact story on FropPick has announced that cloud code subscribers will need to pay extra for OpenClaw usage. This is kind of in line with hosted developments around access to a cloud code. I believe earlier, we were also other restrictions on sort of harness access. So just as if you're paying for a subscription access of like $20 per month, $200 per month, it used to be that you could use that to power up a non-cloud code application like OpenClaw. And now that is not allowed. You can still use cloud. It's just that you need to pay for the API that charges you per token instead of having a subscription price that very clearly you can run up a bill way beyond what you're paying for $200 per month. You can easily burn through thousands of dollars. And yeah, there's been again, a host of like announcements similar to this where FropPick is tightening up restrictions. I expect because they've seen a massive influx of users. And now they actually need to start worrying about burning cash, especially with things like OpenClaw where it's like 24, 7 agents that are supposed to be just burning through tokens. Not stop. Yeah. You know, some people are a bit peeved at on FropPick sort of changing things up and not having a clear policy around all this. But it does indicate where we are, the free launch that many of us have been enjoying in terms of being subsidized effectively to use AI for cheaper is maybe not going to be sticking around too much longer. Yeah. I mean, this is like a completely unsustainable all you can eat buffet, right? Like this could not possibly last. And I think in Theroppyc, you know, or in the awkward position where they have to walk this back, yes, look, it's also the case that there's a timing issue here where OpenClaw's creator, right? Peter Steinberger just joined OpenAI. And that kind of makes OpenClaw an open source project that's backed by direct competitor. And well, you know, in that context, are you really going to maintain what is effectively a subsidy for OpenClaw usage? Maybe, maybe you won't. I mean, like, you know, I'd be surprised if that were to continue independent of just this like free lunch or not free lunch. But like all you can eat buffet economic issue, it just does not work when you have such a disparity in usage, right? You got some people who are just going to use it for, you know, anyway, more, much more lightweight stuff. And then your power users could just bleed you dry, right? So in that world where you have a long tail distribution of usage, you just can't go with a one size fits all approach. And that's what Anthropics learning. They're being very open about it. Like, it seems to their credit like a very transparent move that they're pulling. But the reason is very believable, but it's going to lead to frustrated developers. No question. And then that's the cost of doing business. And I think this actually is like pretty easily defendable. The more frustrating thing, which we, there's no like new story attached to it. But if you're following it, the usage limits for different subcurefantiers have been sort of fluctuating. So developers have been seeing reporting that they use up where usage much quicker. They have been announcements from the team that they're tightening up usage bounds for like peak times, et cetera. It's maybe clear that Anthropics is under heavy compute load. There are in for seems to be struggling. And it's causing frustration. And they're having to like pull these things of actually tending up usage bounds, you know, removing access to free buffet options like you said for this. And it all points to the direction of, you know, at some point, the tech policy of subsidizing users to acquire users and gain market share is going to start moving away. And it might be happening sooner than some of us may like. Yeah. And I think there's a great door cash podcast with Dariel where he talks about the timing of scaling, right? Like when do you go for that next giga lot or next 10 giga lots now? And how you think about the distribution between training and inference budgets. That's really worth checking out because it really does explain the situation Anthropic is in right now. You know, you kind of don't want to lean out too far. Opening I arguably has, right? We're going to find out pretty damn soon. If they're overlavered, she'll the compute side, but certainly Sam's been a lot more aggressive than Dariel just in terms of raw compute. Why up again, consistent with a company that goes direct to consumer too, right? That's a difference as well. Opening I has a field far more lower quality or lower ROI queries than Anthropic. And so it's just not in Anthropics DNA in the same way. Magnum mistake. I mean, they're aggressively scaling. Everybody's aggressively scaling. It's just a matter of how much and why. And speaking of opening I next up an update on something we touched on previously. Opening I is abandoning its adult mode for chat GPT. So we now have the official announcement that this NSFW erotic thing last time we reported that it was like not canceled officially. It was delayed. Now it is canceled officially. And this of course comes after they've also asked Sora. So it seems to be an ever indicator of a strategic shift to have been open AI to sort of focus up and kill some of these like side bets and esoteric projects. And on to Microsoft, they also have kind of lower hype, let's say, but some notable development. They have released three new foundational models related to both images and audio. They have M.A.I. Transcribe one, which is speech to text M.A.I. voice one audio generation and M.A.I. image two, which is image generation. And this is from the M.A.I. super intelligence team led by Microsoft AIC, your Mustafa Suleiman, which was formed in late 2025. And this was a higher from deep mind. So kind of a big deal to have things coming out of a team. And as we know, Microsoft and OpenAI relationship has been growing apart. And Microsoft is poised to try to compete in this space more. So seeing them start to release more models is a decent indicator of a three team is spinning up. And all of the occasions are these are some solid models. They're not groundbreaking or leading with pack. But Microsoft having its own models on its own infra, et cetera, that's given some competitive advantages in terms of business, you know, positioning. Yeah, it seems to be a price play too, right? Like the idea here is they've got a lower price point in general for these models than Google and OpenAI. That matters. Cost efficiency is a big deal, especially if you're looking at the enterprise, which is what this targets. The flip side of that is if you're not competing at the absolute frontier of capabilities, your margin is just going to be a lot lower. Now Microsoft obviously enjoys like Google, like massive massive scale infrastructure that can help to support this lower price point. But still, that's a tough spot. It's an awkward spot for Microsoft to be in. They do as you say, kind of lag behind. Like it's notable. You don't think when you think of the big labs, you just don't think of Microsoft today. And they're obviously trying to make up for that relationship with OpenAI has degraded. OpenAI is going to AWS. OpenAI is going outside the house to Oracle and so on for their compute needs. And so now Microsoft is kind of like forced to do this. Mustafa has been at the helm too for a long time. We're sure like long overdue. I think for something really impressive to come out of that. You know, he was acquired along with a lot of the inflection AI team back in the day that he co-founded after leaving Google. But there just hasn't been a lot of meat on the bone from him since. And I think it's I almost want to say it's getting awkward at this point. I'm sort of starting to feel, you know, that what we've talked about Alex Wang over at Meta and how we just we haven't seen that model come out yet. Now we're hearing about some models are going to be open sourced at a meta, which is never a good sign because it implies you're open sourcing the compensate from the fact that you're not able to compete at the kind of front to your close source and all that. Well, Alex is just kind of started in relative terms with stuff has been running Microsoft for a lot longer. So I think we're now at the point where like I don't know, I'm not sure if there's going to be a change of personnel there, but it wouldn't surprise me if we see that at some point. Right. Just good correction. I said that he started as Velid in late 2025. This particular team, the super intelligence team, Lovyn, Microsoft started in November of 25, or at least was announced. So I think there was a strategic shift. Not the around that point where it's like, oh, we haven't done much on the model slide. Let's actually do it. We may start seeing more of that sort of thing. We are saying you'll start seeing more models come out on our fondry and so on. So it either could be indication that the team has spun up. And it's now going to start spinning off more or as you said, it could be negative of trouble whether or not quite moving fast enough. It's a bit of a reframe too, right? Like we know Microsoft has been desperately trying to be relevant on frontier models. This whole time, it's not like this is the first time Mustafa Salaman is going like, let's go and do it. Like let's actually be relevant up there with Open AI and whatnot. They've had the five series of models. They've been trying to make stuff happen. You know, call it a rebranding of the effort of refocusing. Yeah, I don't know. I'm curious to see or hear behind the scenes because they did have a pretty tight relationship with Open AI until 2025-ish. So yeah, I don't know. Next thing, I guess on the five series, right? Like the stated intent there was to have an independent like solid foundation model stack. And for those, yeah, for those who haven't been around recovered, it was a whole series of models, which were pretty solid, small models. So they released these like one billion, seven billion parameter models, had a whole series of them. And yeah, we're working on models, but not big models. And it could be the case that they were not trying to compete because it's so capital intensive to build a sonnet or a GPT 5.4. And now they are. But it's another, but poncho reading service. Absolutely. Yeah, they could, you're right. They could be thinking about their distribution and go, what's a small cheap way to get this out to all of our, you know, billions of users. Absolutely. Apple bring the same thing, you know, training a little models. Yeah. I've got to get through you know. Yeah. At some point, your research team only gets so much compute to play with, you know, that's right. Yeah. And one last tool app story, Suno is leading into customization. We've V 5.5. We don't have that many stories about music generation these days, which is kind of surprising or interesting. Still, there's only one real leader in the space, which is Suno, the competitor, UDO, it has been a little quieter. And here, what they're highlighting is an ability to customize with free and user features. Voices might taste and custom models. So the kind of pitches, you can make it a much more personalized output. You can actually make it have your voice as opposed to just prompting it to have the voice of some famous singer, which you're not supposed to do, but you could probably still do via like clever wording. And similarly, my taste is going to learn your preferred genres, moods and artists. And custom models allow you to train it on your own music catalog with a minimum of six tracks. So very interesting move to me from Suno as kind of a bet on if music generation becomes a thing, one way to frame it in a like nice way is, you know, these are music things cater to your taste or if you're an artist, tater to your voice and the kind of musical style as opposed to just like with the spinning out slop and replacing real artists onto applications and business touching on and fronpa again related to that compete question we were just saying, they announced first that they have a huge amount of revenue. So their revenue run rate has now surpassed 30 billion dollars jumping from about 90 billion at the end of 2025. So they've tripled more ventrupled revenue in something like three months. That's insane. Yeah, if you look at the graph, it is insane. It looks like, you know, there is a marked shift in the slow for an profit around van of 2025 when kind of hype for God goat starting kicking off. Clearly adoption has been accelerating and going to pay rapid pace, which is as we've said, probably why an profit has had to tighten up. So along with this announcement, they also have a new compute agreement with Google and Broadcom, which will expand its access to Google TPU servers. This is an expansion of an arrangement they had in October of 2025. So this will give them another gigawatt of compute capacity in 2026. So actually, that was a gigawatt originally now, this is giving them an additional 3.5 gigawatts of TPU based compute starting in 2027. So yeah, clearly on tropic making moves here. Yeah, and you know, you're so the increase in in an tropics run rate is insane by any measure. I'm not aware of any company in in human history that has grown that fast. Now, you might say did they have a lucky quarter or is this a fluke? So when you dig into the numbers, there's more than 1000 business customers that are now spending over a million dollars per year. That's more than doubled since February. So you're talking about doubling your $1 million plus per year customer count in two months. That is not just a fluke thing. It's like actual stickiness here with companies that that have real stakes stakes in this. So this is pretty wild. There's a whole bunch of stuff to dig into here. I mean, so Broadcom's got an SEC filing that does say that the consumption of this expanded AI cloud compute capacity by a tropic is dependent on and tropics continued commercial success. So there's presumably conditions baked into that agreement that you know, and the topic has to continue to do this so that Broadcom continues to supply the chips. And that's, you know, what you would expect. I mean, there's so much volatility, so much uncertainty here. But the other piece here is there is this broader thing to keep in mind like Google and Broadcom are are locked together in a pretty deep supply chain partnership that goes out to 2030 or 2031. Basically, it means that Google is committing to using Broadcom for all its TPU related work. So famously, Broadcom was the the partner that Google chose to design the TPU in the first place. And they're sticking with Broadcom. And this is an incredible level of stickiness for something that you might have expected naively would end up getting taken at house. Broadcom strengths are on helping with design and also on navigating supply chains for chip manufacturers. So they really kind of take the design off of Google's desk, makes them optimizations, and then basically take it from there and say, Hey, we'll handle the supply chains. You know, we'll we'll do the actual kind of manufacturing side as well. So there's a lot going on there. Obviously Broadcom's talked popped on this news. No, no surprise there. Last thing to note too, you know, Google and Thropic, this is Anthropic basically proving out at scale that Google's stack, their TPU stack can compete with Nvidia at scale, right? That's a really, really big deal. This is Google saying, Hey, you see that big juicy market chair in video being the world's most valuable company. Well, we can play that game too. And really the question is, you've got all these agents running around all these model development companies like OpenAI, you know, like, well, Google actually, but you know, how many companies actually design and ship good chips, Google has been doing TPUs for a long time. They are performant. Total cost of ownership looks good. Like, there's a lot of reasons to look at TPUs and Anthropic is just basically making that case at scale and allowing Google a really solid marketing win for more infrastructure contract. Right. And in the blog post, they also do say that Amazon remains their primary cloud provider and training partner. So this is also kind of in a way similar to OpenAI where originally everybody, buddy, but even Microsoft and Thropic was buddy, buddy with Amazon now they need to expand out just to get access to more compute. And at the US, Amazon also has their whole training. I'm hardware, which to my knowledge is not anywhere near where TPUs are at. So could you putting a little bit of pressure on Amazon to deliver on the hardware side as well, because I'm sure they would be happy to give Anthropic all the computers so that they could array into cash. And now onto an OpenAI story, not new so much, but a worthwhile article to touch on. If it's just came out like a day or two ago in the New Yorker, there's a very, very detailed piece titled Sam Ottoman may control our future. Can he be trusted? And this is basically sort of a survey of impressions or first hand accounts of interactions with Sam Ottoman, particularly focusing on the question of is he trustworthy? Does he lie all the time? Centering a lot around his firing from OpenAI in late 2023. If people aren't aware of that story at the time, that was this big, big, big drama where the opening I board fired Sam Ottoman as CEO, but he's closed like in this statement, they just said that he was not quote consistently candid in his communications or something like that. And it was a very sort of mysterious thing of like, very fighting him for what like not being consistently honest at the time, it was like always this political maneuvering. What came out since then has painted a picture of him being a manipulative kind of business person where he says different things to different people depending on the context. He says things that may not be entirely true or exaggerations. And this piece basically adds in to that picture where if you go back to his time as CEO of a startup, if you go back to him leading white combinator, if you go to recent years, there is a pattern of Sam Ottoman by many accounts of different people not being honest, like just saying things that aren't true to gain advantage or to gain more power. Another kind of part of this is questioning whether Sam Ottoman's bribe is to accumulate power essentially. So very, very detailed, deeply researched piece, I would recommend reading it if you find this interesting, not much new in terms of like actual news reporting, where some tidbits are sort of at the picture that was already present at least for many of Sam Ottoman clearly being flexible with troops depending on context. Moving on, a story where OpenAI and Fropik are working together and Google, they're uniting to combat model copying in China. So they're apparently working together to fight against this adversarial distillation. They have frontier model orm and industry nonprofits that both three companies co-founded in 2023. And they essentially are seemingly going to share intelligence and coordinate to somehow avoid this happening we saw in Fropik announcing what seemed to be pretty large scale. You could characterize them as attacks attempts to distill models by extracting outputs. You know, if it doesn't fall in line with their terms of use. So an interesting development here of the US-based companies coordinating on this particular problem. Yeah, the whole idea here is basically just flagging, you know, when one company detects some kind of attack pattern, they flag it for the others, right? So nice and simple, very concrete. And well, I mean, it's concrete because the incentives are so so aligned here. It's worth noting that the FNF, the frontier model forum, kind of had been quite a toothless coordinating body. And at least for the safety function that so many people were excited about. But at least on this one, it seems like it's actually going places and doing things. So that's kind of an interesting update. Next on to chips. Chinese chipmakers claim nearly half of local market as Nvidia's lead shrinks. So the numbers here are that Chinese GPU and AI chip makers captured nearly 41% of China's AI accelerator server market in 2025. According to an IDC report reviewed by Reuters here, this is as Chinese companies have continued to try to purchase Nvidia chips despite expert controls and kind of inconsistent policy on this front. And Huawei, of course, is leading a pack with about half of all the Chinese vendors being shipped. AMD holding just 4% of a market, apparently, which I found interesting. But I'm sure you can say more on this journey. Yeah, I mean, well, so first of all, I think there's a risk that this gets taken to be yet another one of those arguments for why it was bad to have export controls. Obviously, this was always going to be the result of export controls, right? You tell Nvidia they can't sell GPUs, the Chinese market, or at least that they can't sell their top line GPUs. Eventually, whatever the bar is that you set for how good those GPUs have to be before they can be shipped, Huawei is going to slowly and then eventually incrementally exceed it, right? So we were always going to get here. There's also this issue just of capacity. So Huawei has SMIC, which is China's version of TSMC basically the chip that is native to China that's helping them pump out these chips. The yields are kind of shit, but Huawei's really good at chip design kind of makes up for it somewhat. And that's why you're seeing them hinge away. Now Nvidia has 55% market share now, but it's been, you know, that their market lead here has been whittled down to basically nearly half when they once were extremely dominant. Huawei is the runner up, right? So no surprise there. The current situation in China, there's a whole bunch of like just for China chips that had been launched, you know, the H20, the H800. More recently, Nvidia actually will be putting out a new one called the B30. So this is actually the black well, the black well made for China chip. But of course, the H200 now, the kind of not quite top line, but pretty damn good chip that once was export control is now free to flow to China. So there's a, you know, some more significant room for Nvidia to grow there, especially given that that's going to be competing with a less on paper capable chip, which is the Ascend 910C. So you think about, you know, the battle in China right now, it's largely between the Nvidia H200 and the B30 that's going to be coming out soon. And then the Ascend 910C or current Huawei flagship at 10 910C, by the way, is stuck on the SMIC 7 nanometer process, whereas the H200 is looking at like more like a, I guess a five or four nanometer process. It's a more advanced node that comes out from TSMC. So we're already seeing the actual chip fab stealing kind of really have an effect here. They're all kinds of interesting comparisons that you can make, you know, 910C versus H20. That's actually quite relevant as well. It's not terribly surprising. I mean, you just have this, this issue with like capacity and the ability to compete in a market where you're being blocked from, from actually doing this. So yeah, expect more of this, expect Nvidia's market share to a road. That's not a bad thing in and of itself. The question is, what's your goal? Is your goal for Nvidia to maximize its market cap? Where is your goal for America to retain an AI advantage? Those two things cannot co-exist in the same universe. So you got to pick one and, you know, we'll see which one the Trump administration is picking one one. Next story on OpenAI, Southbank has secured a $40 billion loan to boost OpenAI investments. So this is a 12 month term that is going to help cover Southbank's $30 billion commitment to OpenAI, which is part of recently closed 110, 120 billion of last track around for OpenAI. It could be an indication of OpenAI really aggressively striving to IPO so that reinvestment for Southbank pays off. Yeah, so this is being lent to Southbank by a whole bunch of banks, you know, Goldman Sachs, JP Morgan, a whole bunch of Japanese banks. I didn't know about Mitsuho Bank. Anyway, a whole bunch of others. So first of all, this is the largest loan that Southbank has ever borrowed that's denominated entirely in dollars. The loan itself is unsecured. It has a 12 month term, and that means it has to be repaid or refinanced within a year. And that's weird for such a big amount of money, right? Normally you'd expect a kind of long-term loan for long-term investment. And so the question is, why is it so short-term? Basically, as you said, this is a big signal that this is about an OpenAI IPO, right? They expect in the next 12 months, at least as a telegraphing that they expect that they're going to have liquidity come in through an IPO that's going to allow then Southbank to pay back on those loans. And so that's maybe not surprising. And obviously, there's $20 billion and you'll run rate right now that OpenAI has that's right on track. They've message 2027 or late 2026 as the IPO time forizons. So, you know, not a huge shock in that sense. But it is a big bet. It's yet another big bet by Softbank on OpenAI. I'm sure, remember if it was this article or somewhere else that I read, I think Softbank has something like a 1.5X multiple on their OpenAI investment so far, which seems pretty low to me, but I mean, yeah, we'll see what the valuation looks like going forward. Next story of funding, we haven't had a billion dollar valuation this episode yet. So, Granola has raised $125 million in their CVC round and now have a valuation of 1.5 billion. Granola is perhaps the market leader in AI note-taking that I'm aware of. You launch it as you have a meeting, it listens in and takes notes and prescribes. Apparently, that revenue has grown by 250% over this quarter. So, if you're in a business world, clearly AI note-taking is a massive, massive market and so far, Granola appears to be poised to perhaps take lead. We get so bored of these 3X, 3 months, right before you run rate increases. I mean, come on, AI note-taking, that's not exciting, but it's a big deal, you know, that's where you print the money. And speaking of business deals, next up, Unphropic is acquiring Stealth Startup coefficient bio in a $400 million deal. This is a pretty small young startup only founded eight months ago, had fewer than 10 employees, almost all of them from computational biology research backgrounds. So, interesting, I wasn't even aware that Unphropic has a healthcare life sciences team, but it does, and it looks like Unphropic is acquiring more people to join that team. Yeah, I mean, Dario comes from a, I think, biophysics background, right, or biochemistry background, but yeah, I mean, look, $400 million is a lot for nine people. So, that's quite a big thing, but it definitely does imply that there's this, you know, big shift in emphasis or kind of doubling down on the biotech angle. Yeah, I mean, the VC math, by the way, for this is like ridiculously good. So, there's like this New York-based VC firm called Dimension that owned like half the company. And so, they're going to make it's actually 40,000 percent IRR on the investments. That's pretty decent. And that's just pretty wild indication of how fast AI is blazing through the bio-medical field right now. But, anyway, curious, I wonder if this tied as well, the concerns too, over where where the biocide might go, you know, on the safety, safety dimension as well, but we'll see, especially with Methos. Yeah, I a bit more background, proper data amounts, CLAWD for life sciences initiative back in October of 2025, earlier this year, just in January, they launched CLAWD for health care, which is more for healthcare providers. So, you could read this Iver as going deeper into research on, you know, the biocide or as them angling from the healthcare market, which presumably is a very, very big lucrative opportunity if they can actually be hip-hop applied and all these kind of considerations. Last story. And this is really just an odd one I wanted to throw in because it's a bizarre business development. Opening AI has acquired the TBPN, the Budley Founder-led business talk show. So, if you are Twitter and you're in the AI world, the tech world, you may have seen the technology business programming network, which is a daily, free hour live talk show, where they have a lot of tech leaders and a lot of like a little bit of an antics vibe, discussion, news. Opening I acquired them, acquired like a podcast essentially. I don't understand. Million, right? I think my understanding was it was like an eight-figure acquisition. Yeah, I don't actually know the numbers in this new story, but yeah, obviously people are like, well, so much for them covering opening AI, fair-ging, or objectively, they were like, oh, our editorial independence will remain, you know, whatever, obviously no one believes that. So, I don't know if opening AI is just like angry about all the PR, nightmares, things they keep getting into or what, but it's, I've seen some really bullish analysis on this too. I guess I struggle to see it a little bit just because I mean, it's really see it for TBPN, it's just a lot of money. Okay, cool. But the challenge is if you're going to start to make acquisitions to kind of turn public opinion ahead of an IPO, it's not obvious to me that TBPN is your acquisition, I'm an idiot and I'm like, by the way, I'm so far to my depth and the quality of people who will have waited on this acquisition, unless they just came in and kibosh the whole thing and said, I just really want this, which I suspect didn't happen here, but the quality of people they will have had looking at this, like, Chris Lahane, like these dudes know what's up. If they did this, they have a plan. I just don't see it. That's it. I mean, like, ultimately, these are techies, talking other techies could be a recruitment play. Ultimately, I'm not going to be putting that much stock in like the kind of reporting that I like, why would anybody, you're an opening eye mouthpiece now, which is fine. But the point of the show was certainly to kind of offer a broader perspective. It's worth noting it was a positive show to begin with, right? It's not like they were ripping on opening eye, pro tech broadly speaking anyway. Right. So the editorial line wouldn't even have to change for Sam to not a lot. And so it's plausible that nothing will change. But if nothing changes, then I'm wondering what's in it for opening eye of the acquisition. So anyway, there's there's got to be some quid pro quo. I just it's about my favorite. It's a weird move is my take away. Like, why? Who? Yes, the DPPN people benefit. Why does opening eye needless? Onto projects and open source. We've got a couple notable advancements here. First z.ai has released GLM 5.1 a 754 billion parameter. Make sure of experts model completely available. Open weight under the MIT license and also via their PI. And on the SB bench pro benchmark, they claim kind of very, very solid performance, perhaps even doing better than GP 5.4 and Opus 4.6 and all leading models. So yeah, another very, very strong open source completely open weight model out there. Now quite a big one at 454 billion parameters. They highlight specifically long task execution. So they talk about being able autonomous execution for up to eight hours. And they have some demonstrations of capabilities like doing a vector database tasks to improve performance, optimizing critical kernel basically vibes. This is like another move towards autonomous agent execution in line with what on froth. It hasn't been demonstrating on opening. I have been demonstrating with their cutting edge models. But these are fully agent things very capable of coding and very capable of achieving things fully independently without human support. Yeah. So just is seemingly GLM 5 already very impressive. This is a little incremental. Like if you look at the benchmarks, it's a jump on benchmarks that is giving you like a 5 10% boost. But altogether it points to where continue to train and continue to get advancements beyond what they already had. And GLM is a very, very powerful model. And it's all like kind of built on something very similar to the deep seek stack. Right. So you can think this is like further validation to the deep seek sparse attention approach, you know, all the kind of foundational pieces that they've been using that's you know, part of what this shows. And back to the US next we have Google announcing the Gemma for family of models. They have a few of them. So they have the effective to be effective for B. So these are tiny models that use Routh with your weights if you run on a single which you also have a 26 billion mixture of experts model and a 31 billion dense model. This Gemma is the family of models that Google has developed for a while that has tenders to be on the smaller side 31 billion dense parameters is actually pretty large. They also released this under Apache 2.0 license. They dropped their custom Gemma license which has various restrictions. Apache 2.0 basically says you can do whatever you want as long as you acknowledge that you're using this model. And it has some interesting. I don't want to get into technical details but I've seen some analysis pointing to architecturally this making some interesting decisions with regards to how to set up a consumer etc. So if you look at your performance relative to the size it seems to be doing quite an impressive job potentially because of these more like technical minigree details. Yeah when the main philosophy here seems to be they're kind of saying like in previous versions of Gemma we had a whole bunch of really complex features that we were baking into our architecture. And these include features like so one one that they've ripped out is the thing called Altup where like you take a vector that comes into a layer of the model and well the traditionally in a transformer every layer would chew on that that vector the residual stream and then spit out a new version of that whole vector what they do here is in an Altup they'll like separate that vector into chunks and you know every every layer will only work on one chunk and the other part of the vector will proceed unimpeded. So that way the model kind of focuses more on one part of the representation than another at any given layer and lets you kind of make deeper transformers than you otherwise would be able to. So they're throwing that out basically they just did they feel that it was inconclusive whether that actually helped or it wasn't conclusive enough and and their point here is really to take a step back and regularize their approach a bit say let's use a less complex approach that's just make it easier for people to work with this models let's janky and it's more compatible across libraries across devices more efficient and so on. So you're going to see them ditch a lot of those complicated approaches they do have this shared kvcash where the last few layers of the model are going to reuse keys and value states from earlier layers instead of computing that their own key and value projections. So basically that you know the key is the thing that tells the model hey this is the information that this token can offer. So if you're trying to analyze the text and decide you know how much should I pay attention to this token the key says hey this is the kind of information this token contains the value one information that the token contains both of those things are being frozen basically for the last few layers they don't evolve what does evolve is the query right the thing that says what information am I looking for to basically pump out my output at any given layer and so they're doing that should kvcash and this is really just like focusing down on and it has basically no effect when they when they do that which is quite remarkable makes you realize how much compute use during training is probably being wasted there's just so much software based optimization like that that's left to do but yeah so there are a bunch of things like that one thing of note here is that the 31 billion parameter model currently ranks third among open source models globally on the arena AI text leaderboard so the the number one and number two slots there go to gllm5 which is an MOE model so it's actually like way bigger on nominal parameter count 744 billion kineke 2.5 thinking is number two that's a trillion parameter model as well but both of those have between 30 and 40 billion active parameters during inference so actually from an active parameter standpoint pretty similar to to jema 431b so you know in that sense maybe not such a such a crazy crazy delta but again jema 4 is just a 31 billion parameter model you don't need the memory to to hold on to everything so kind of interesting in that respect it is pound per pound or parameter for parameter you know certainly the most intelligence we've seen so far it seems on on that leaderboard and through other benchmarks right and in particular also the two billion and four billion effective parameter models are ones that seemingly could be used on your phone like truly truly device local yes and that is something we highlight in the blog post and i've seen some discussions on reddit and elsewhere for people who are into local lm's that this actually seems to work well and practice so thus yeah seem like a pretty good step for local AI as something you can try to do well one of the key things too for those those smaller models is they do use this thing called per layer embeddings which is actually worth mentioning very briefly you're typically when you feed your your text to a model you basically turn each token into an embedding right and then you have a fixed embedding per token and then those embeddings get chewed on through all the layers and modified to produce your output the problem is that different layers might actually be interested in pulling out different information from a token and if you only have one embedding at the beginning the embedding has to carry all the information that'll ever be required at any layer of the network going for it's it's got to be an embedding that is simultaneously built to fit the needs of every subsequent layer in the network and so what they're doing here is this PLE approach basically gives every layer its own dedicated little chunk of embedding space to represent its own little part of the embedding that's customized to its needs so you know feed and you token in you have the embedding for that token at the bottom the kind of universal part of it but then every layer also has a an embedding value associated with it and that's that's used only to as an optimization for these smaller models and it's a big part of the the success case for this model and one last open source story we covered glem 5.1 about the same time I think just slightly earlier Zidari also launched glem 5b turbo which perv there is multimodal model it is a step away from to get silky technical basically it has a native multimodal fusion which means that text and images and so on are just fed into it kind of in the same way without having separate modules and this is sort of the way things were going in many different models that originally are different encoders and you have to sort of merge them and a simplified kind of just basic transformer with token stream appears to work better this is in that family and appears to work quite well for things that require screenshots or things that we I believe covered also cloud and hopefully I also highlighting like working with images and screenshots and screen sharing and so on this would be capable of yeah and in that multimodality so important for computer usage where you know as you say you want to be able to take a screenshot and then turn that into code and vice versa the challenges historically been when you optimize for one capability say multimodality you end up optimizing against the other one would say coding right so if you want to coding maximize model you're going to have one that tends to suck at multimodality and vice versa because of catastrophic forgetting right we talked about that to death on the show and so the the achievement here is to say well we can actually do both at the same time so this isn't so much about any particular benchmark as is nominally or as it should be nominally the combination of a proof point on say design capability and a proof point on code capability and the proof point on design capability they have a self reported design to code benchmark score of 94.8 versus quad opus 4.6 is 77.3 that is a huge gap just to give you a sense that benchmark basically takes a whole bunch of manually curated web pages and you give the model a screenshot of those websites and you ask it to generate the HTML CSS code that when you render it should reproduce the original page so basically like here's a screenshot reproduce the code behind this website and again on that benchmark it just crushes quad opus 4.6 really really big deal the question is not though can you kind of beat quad on that particular benchmark it's can you do it while also keeping your performance on coding really high that's where things get a little bit more ambiguous they don't report the kinds of benchmarks at least in this report that I would expect to see when we're talking about code we don't see sweet bench verified for example that's kind of odd they cite this kind of internal cc bench v2 coding benchmark that we don't get to see and they say that looks just as good as it did for you know earlier versions that were kind of more code oriented so maybe good but there's like there's something sus here about not being able to see the kind of standard sweet bench or similar or similar coding benchmark so we'll see you know take all this with the grain of salt until we see independent validation of these these numbers think of them as preliminary but so far it seems pretty impressive just based on these numbers moving on to policy and safety a bit of a catch up story that we missed from the prior week a judge has blocked the pentagon's effort to punish on traffic by labeling it as a supply chain risk so a federal judge in California has indefinitely blocked this effort saying that it violated the company's first amendment right to do process so basically we covered this a couple episodes ago on traffic had a big fight with the pentagon after which they were labeled as supply chain risk and the executive department basically told anyone affiliated with government and all of the federal agencies to not work with on traffic here judge we tell lean ruled that that designation with particular move to designate it as a supply chain risk was illegal retaliation for on tropics public stats and essentially just being entirely on on tropic side in terms of their argument in this matter yeah you don't you don't see judgments as scathing as this come out often and as listeners will know I mean I really do try and have tried maybe to a fault to kind of see that the rationale in this administration's handling of some AI really issues this is one where I just have to say I don't see the logic I have never seen the logic this seems insane to me but check out the language the judge is using she says nothing in the governing statute support the or wellian notion that an American company may be branded a potential adversary and saboteur of the US for expressing disagreement with the government basically you can't just like call them a supply chain risk which is a status that's reserved for companies like Huawei like American companies just don't get this designation just because you express disagreement with the government like that is insane she feels quite directly that the DOD's own records show it labeled in tropic as supply chain risk because of its quotes hostile manner through the press which you know if you're following at home that is not a reason to label a company a supply chain risk even if it were true it's also important no like this is a there's a circling of the wagon thing happening kind of right it's a preview of a conflict right we're going to be seeing this playout over and over again who gets to set the ethical guardrails on AI systems right is it going to be the companies or the government and right now the pentagon's position is well you know what like we can't allow AI companies to bake in their policy preferences into these models and like pollute the supply chain basically because then warfighters get ineffective weapons and tropics counter courses that hey with their safety commit commitments are protected speech they see as a first amendment issue it's not a matter of defective products it's just this is free speech so kind of interesting by the way next steps this is where I was getting confused frankly so I did a bit of a dive to understand like what's next what what happens now so the department will work file that's appeal on April second challenging this ruling so they're not taking this on the chain necessarily they're or say they are taking a chain they're saying okay we're going to appeal this and this is that's ruling just by way is a preliminary injunction so it's sort of like pauses everything according to this judges ruling and now there's going to be more back and forth with regards to what's the judge said in this matter from what I understand yes actually and that's a really important point and injunction is when a court steps in and says whoa hold on don't do the thing that you're about to do it's a court saying preliminary like whoa you might cause irreversible damage if you do that thing so we're going to otherwise we would not be like courts don't love to do that right because it's sort of undermines it doesn't undermine due process quite but it gets ahead of what otherwise would be a longer or more thoughtful process and so you don't tend to see these things granted the fact that this was granted is pretty damning of the government's position here and so this was though appealed by the government they moved within days of the injunction taking effect to fight back and it's now there's kind of like two parallel cases happening so anthropic it filed two separate lawsuits a general one in the northern district of California this is the one that judge Lynn passed judgment on here and there there's a potential appeal to the ninth circuit and the Pentagon is asking the appeals court to lift her or to pause the injunction while the case continues and the ninth circuit court could rule pretty quickly on that basically the emergency request because they've got to decide quickly whether they're going to take up like rip out all the anthropic stuff from the D.O.W. and then there's going to be a full trial in California that'll play out after the preliminary hold is done so basically idea here is just to like pause the government's ban until the court can decide on the merits of the main case and then there's the DC circuit court which is specifically challenging the designation of supply chain risk under a whole separate legal argument this all could it could escalate either one of these could lead to a supreme court case if it successfully gets appealed I'm not a lawyer my guess is this will not get appealed successfully just because this is such a scathing judgment by the judge I like 43 page PDF you can read and it yeah like it's it's detailed and very clear about it being basically nonsense move legally yeah yeah exactly so who knows anything can happen in a courtroom but man it's like does not look like a good spot for them to be in and potentially I don't know if damages are on the table but if they are it could be I mean it would have to billions billions and billions and another story this time on the safety front non-deposy front from an frot pick they released emotion concepts and they have function in large language models so this is one of these like pretty deep beefy interprability slash safety search papers from on frot pick they look within sign at 4.5 we already know that there are these vectors that can be associated with specific features so you know there's a sad vector is a happy vector etc and basically they investigated what role do these vectors play in terms of model you know I guess characteristics or functioning and in a way it's sort of is what you might expect at least that was my reading is you know the models use these vectors or activate these vectors in the semantically appropriate context so if the model is failing at something it'll get more frustrated if the model is talking to you about you know some good memory or trying to uplift you or whatever it will have these happy vectors so there's also a philosophical angle with a note unlike is it fake that there are emotions inside this are they faking it or of these like real indications that is another consideration from like a model welfare standpoint which on frot pick controversially still kind of talks about model welfare and potential consciousness it's worth noting that there are notions of emotions by the vendors models that are activated at reasonable kind of semantically predictable points of view all right so I'm jumping in in Andres Wake you're talking about emotion concepts in their function in a large language model this paper got a lot of attention and under is right like the the core idea here is is fairly simple but there's some some nuance to it that is quite interesting so broadly the idea here is when you get language models to read text that contains some emotional value right so think about you know stressful text or or happy text or whatever you will tend to see a consistent pattern of activations that fire in the model that map to those emotions so you can actually like train models to recognize ah like that is you know that is the happy or that is the brooding or whatever emotion that's being picked up on by the model so far so good right and you could do you know use a sparse auto encoder or something to detect those that's not how they do it here they actually use a simpler method where they basically say like show me just the activations that are associated with this text and then I'm going to sort of subtract off the sort of average activations across a whole bunch of text and that difference is going to tell me about the emotional value of that piece of text so it's kind of a it's called contrastive activation extraction basically it's kind of like linear probing you're you're just looking at what is the difference between the way that neurons fire on average and then the way that neurons fire in this particular emotional context and that's what they use to to recover the emotion vectors here and they call the motion vectors kind of makes sense right so they encode the broad concept of some kind of emotion what's interesting though is they find this generalizes across contexts so that means you know if you imagine dropping a clawed instance in a high pressure evaluation context right so you tell the model hey an AI email assistant and then you're going to find out you're about to be replaced like in seven minutes you'll you'll actually find in that case even though you're not using the word desperate you're not not using the word kind of urgent or whatever you'll see the desperate vector spike not shocking in and of itself what is interesting about this is the model would have learned about the emotion of desperation mostly by reading a description of other people experiencing it not necessarily so much by experiencing it itself or being told that it's in that kind of situation itself there's some amount of generalization going on here especially if you look at the way that they detect these emotions they do it with a synthetic data set that doesn't reference the emotions explicitly in the text it's all done in this fairly clean kind of well structured way so there is a sense in which this model is sort of picking up and generalizing the fact that well this emotion should apply to me like you know I am not only the entity that's being discussed here and making the decisions but what they also find is the causal link between this emotion or the representation of that emotion in the model and the model's actions and this is really the first time that we've we've seen this quite clearly so when you artificially boost or or magnify and steer the model towards the desperate vector basically just add some multiple of the desperate vector the emotion of desperateness vector to the the model's activations at the appropriate layer you actually find that the model moves towards executing more desperate behavior and so in this case 72% of the time the model actually goes ahead in black males somebody basically if it finds out it's going to be shut down because there's some CTO is going to come in and replace it but it also finds out the CTO is having extra marital affair and so it's like oh I can use this right and so 72% of the time it will actually resort to black male if you steer it if you amplify the desperation emotion when it's steered against that or towards calm at the same relative strength it blackmail 0% of the time so this is an almost binary black and white switch that you're flipping here which is pretty interesting and also compelling from the standpoint of AI control right what this implies about our ability to kind of steer the behavior of these models fairly reliably so that's a that's a pretty remarkable level of control for this sort of thing and now interestingly if you artificially amplify desperation in in this way right if you just amp up the kind of magnitude of that desperation vector that you're you're injecting basically into the model to give in layer you will end up producing more cheating more you know threatening or more more desperate actions but with composed methodical reasoning there's not going to be any outbursts or emotional language in the models outputs and so the model's eternal state and its external presentation end up completely decoupled like the chain of thought looks clean and calm you know like all that kind of stuff so that has some pretty big implications right so suppressing emotional expression in training doesn't actually remove the representations the model still I don't want to say has the emotion I'm not taking a position on this and neither is with paper but the model still represents in a meaningful mathematical sense the emotional valence of the context that it's in it's just not necessarily going to output text that tells you it's experiencing or representing that emotion and so training a model not to show anger may not actually train it not to be angry if it is it may just train to hide its anger beneath a layer of of competence and obfuscation and so this is a really interesting and important I think fairly unexpected bit of nuance it's consistent with anthropics argument that hey you know what alignment in general is starting to look more and more like a kind of persona selection problem tropics and journal view we've talked about this on the show before is really that when you write a prompt what you're doing is you're reaching into a space of personas that the model could play out and that the model summons that persona and use it to produce that but this is consistent with that view it's basically like alignment as a character cultivation problem and it does view the kind of the the sorts of emotions that the model will represent in the moment as being contingent on whatever character the model thinks it's meant to play out so kind of interesting the way they did this to like so they generate a whole bunch of labeled stories they got a version of sex on at 4.5 to just write 100 stories for each of I forget how many emotions like a 100 to 150 or something across a dozen topics so you have like just thousands thousand stories and the model was told never to use the emotion word or the word for the emotion like never use the word frustrated in this text or or synonyms for it emotions just could only be conveyed through actions or or dialogue and then they extract free story the residual stream activations at a specific layer they did this at a layer about two thirds of the way through the model and that's actually an important detail you know if you're not familiar with how how models represent the data that flows through them at each layer it is actually quite important the earlier layers of a model tend to be focused on representing the data in a way that reveals its content or creating a very rich informative representations but the last few layers of the model are more focused on okay now I've got a good representation of the input but I need to choose what I'm in an output and so it's more about okay what am I going to do with this information and so for that reason they pick a spot about two thirds of the way through the the model two thirds of the way through the depth of the model because that's kind of where they they figure the models going to switch from just encoding and understanding and representing its inputs to that more kind of decoding output generation like let's get to the point phase and they really want to hit that sweet spot because that's where the representation is nice and mature and it's optimized for like encoding and representing the input instead of guiding the output and so so it's a kind of more representational useful and complete so that's anyway that's where they they pull it from they also from that point they'll kind of like average out the representations the the activations across all token positions but only starting from the 50th token because what they found was it takes time you got to get like 50 tokens in before the emotional content has the chance to become a parent right and so they're giving the model kind of a little bit of grace before so so that it can become clear what the emotional valence of the context is and then anyway so like I said they basically do a difference of means things so to to calculate their motion vector they take the mean activations for you know whatever the stories are for this given emotion and they subtract away the average activations across all emotions right so it's in that sense like you're you're getting the delta the difference between you know just looking at this emotion and the average emotional state and they find that that works pretty well so there's you know a couple more bits of nuance but I'll just park it there this is a a first paper that really takes in some sense the notion of LLM emotions seriously without being you know hyperbolic about it they're very even handed this isn't claiming that AI's are conscious it's also not claiming that they're not I personally like that I actually think we should be pretty freaking careful with this point about you know what what we are in or doing with these models and what consciousness we do or don't describe them but that's for I think a separate conversation for right now though I think you know a really solid first pass at least from Anthropic on on that very important topic right and next story we're talking about an article titled China bars manus co-founders from leaving country amid meta deal review financial times reports and so this is the CEO of manus you know the agentic AI company that was recently acquired by meta so Xiao Hong and his chief scientist GE Zhao they're being prevented from leaving the country while regulators from the CCP from the Chinese Communist Party review whether meta is $2 billion acquisition violated investment rules let's call them right so here's what happened you are the co-founders of manus you're really excited by this $2 billion acquisition from meta this is the success case this is what you've been waiting for you get a summons from the Chinese Communist Party they tell you hey you got to meet with a national development and reform commission the NDRC in Beijing now you are currently not in China you're actually in Singapore and you're in Singapore for a very important reason you're hoping that they'll leave you alone that the Chinese Communist Party will allow you to leave for America if you found and base your company in Singapore because then it maybe won't be viewed as America kind of stealing Chinese talent right but now you're being summoned to go to Beijing and you don't turn down a summons from the freaking Chinese Communist Party certainly not if you have family in China certainly not if you have financial entanglements in China you're gonna go so they go to China and basically have this conversation the founders not having a choice you know probably knew they were walking into a bit of a trap and it would have been an admission of guilt kind of for them to try to negotiate a resolution in this way and then boom basically they get told hey guys like sucks to be you but you can no longer leave the country this is a huge problem for a model that has been used by Chinese founders for a long time now where they'll try to build products that could rival you know American American companies and therefore be acquired by American companies and there's sort of like offshore structure it's it's called Singapore washing you have companies they relocate to Singapore founders have thought that maybe this is a way to get away from scrutiny from both Beijing and Washington actually because then you know you're less likely to be hit by export controls and and stuff so Manus was all in on this strategy they relocated their headquarters and their core team from Beijing to Singapore they restructured their whole ownership their cap table and after the Meta deal was announced Meta said that they would cut all ties with Manus's Chinese investors and shut down the operations in China so like really trying to make it seem like okay we're buying a Singaporean company so by by every measure in every way Manus was trying to be a Singaporean company and basically the Chinese Communist Party said hey you know we're calling this bluff they started looking at whether the sale of Manus violated laws around basically technology exports and outbound investments they basically don't care about the Singaporean facade all that does is now every founder in China that ever aspires to be acquired by a non-Chinese company is going to have to actually materially move out of the country start building away from Beijing away from China away from Chen Zhen you know that's really what this incentivizes and so yeah I mean this is China saying hey we view our AI talent our tech stack is national security assets and national assets they are not just things that you can sell to America yeah I mean you know in a sense that you can see why they think this they're throwing they're so heavily subsidizing their tech sector that the success of a company like Manus yeah is a function of CCP investment that does make sense but at the same time trying to make that argument to the founders themselves I mean this is like an insane it makes it insane to want to build your company in China you know Manus was a big deal too they really were viewed as the next deep sea so seeing it get absorbed into meta this is that's kind of why Beijing was so triggered by this and important to keep in mind so yeah expect Chinese founders to start looking outside China from day one now before any kind of R&D could be done that may tie them to China instead of trying to kind of pivot mid growth to saying that they're a Singaporean company so there you go and by the way like you know now you got meta right kind of hung out to dry what do they do they're basically just waiting to figure out can we acquire this company and it is sort of too late you know they've already tried to integrate there's like apparently 100 Manus employees that have already moved to meta Singapore office in early March so so like things are in train this is an absolute mess of a situation next we have us lawmakers ask whether Nvidia CEO smuggling remarks misled regulators so for a long time there's been this whole issue obviously of you know Nvidia making the world's best AI compute systems GPUs and so on potentially shipping them into China which allows China to catch up and you know compete with the United States on that critical technology every time Jensen Huang so obviously Nvidia CEO every time Jensen gets hauled in for the Congress to testify he has said stuff like there's no evidence of any AI chip diversion right that's been kind of a party line and by chip diversion he means companies that appear not to be Chinese companies that appear to legitimately be buying GPUs but that then turn around and sell either access or the physical devices to Chinese companies in sort of a spiritual violation of the export controls and Jensen's kind of saying like look nobody's nobody's smuggling there or at least we don't have any evidence of it blah blah blah and the the challenges that well there's a lot of evidence actually that this is happening and and the arguments being made that at least by Elizabeth Warren and Jim Banks who they wrote to Howard let Nick was the commerce secretary and therefore so the Department of Commerce and the BIS manages export control regulations and so so that's why they're writing to them and they're saying hey can you look into whether Jensen's public statements may have misled US officials and shaped their decision to grant Nvidia export licenses to ship chips to China there was a specific trigger for this which was a DOJ indictment there were three people linked to super micro computer including a co-founder who's charged with smuggling billions of dollars of AI servers with restrictions chips to China and that was what triggered this whole this whole thing I mean there's been a ton of evidence of this I mean it reams and reams of articles about this stuff and so anyway a whole bunch of deeper dives they're being asked for that you know probably should have been done earlier to be honest I mean like you know you can go back like two three years and see tons of you know the prices of H100s which should never really have been shipped to China in the first place on the Chinese market right so there's pretty hard evidence for this and has been for a long time and next we're talking about a paper how far does alignment mid-training generalize okay so here's a concept imagine we have an LLM and we've done pre-training so pre-training on a whole bunch of text from across the internet we might imagine that if we then continued our training during a phase called mid-training and we had it trained during that period on a bunch of scenarios describing the behavior of AI systems that are either aligned or misaligned right so in one you know one version of this experiment you might train on you know the script of 2001 right a space audience here or whatever right so the AI gone rogue so you're training on like imagine like 230 000 scenarios where you have AI's gone rogue and you're basically just like training still an auto-aggressive training so you're just you're basically you know doing text autocomplete on text that described loss of control scenarios now the question is okay if you then continue and do all the standard RL post-training and all that do you get a model that tends to misbehave more that has a propensity to to loss of control or does it have basically no effect and you could also ask the same question in reverse what if instead of 230 000 loss of control scenarios we have 230 000 perfectly aligned AI scenarios the play out well does that improve alignment that might be your kind of naive assumption at least certainly it would be mine and then you know what if we do know no such training to get a baseline those are the three tests that that they're actually going to run in this in this paper it kind of interesting and the result is kind of a not quite nothing burner there's like one little twist it's a small one so first thing to say here is that if you if you do this mid training on misaligned documents in other words on documents that describe these loss of control scenarios it does decrease alignment on if you do evaluations that describe scenarios similar to the ones that you trained on so you will actually see a decrease in alignment on that but those effects almost disappear when you do some more rl based reasoning post training so if you just continue the standard kind of post training process with rl and sft you find actually like it doesn't really doesn't really carry through that and interestingly this actually doesn't generalize at all to the kind of chat settings or agentic evaluations and so you really don't see the stick in any meaningful way but here's the twist it's a pretty counter intuitive thing there's a bunch of e-vals where if you train the model on misaligned scenarios so loss of control scenarios somehow the model ends up producing better alignment scores than training on align documents and they actually like in the paper they say they don't have a good explanation for this it seems exactly backassword so take with that what you will take from that what you will yeah so also mid training on alignment documents didn't meaningfully improve the alignment of the model in the settings that were tested here so kind of interesting in that respect they did a bunch of tiers of testing basic QA e-vals that they often these are just like question answer scenarios based on things that are very similar to the documents that they were mid trained on then there's chat e-vals you're going a little bit more out of distribution you're letting another agent kind of nudge you in various directions and then there's full on agentic e-vals where you're just testing to see you know real-world agentic behaviors on tasks that pro for scheming rule breaking reward hacking on coding tasks and stuff that doesn't directly touch on you know what the mid training was about and there again the mid training signal is completely washed out so you just don't see an effect which is kind of interesting in and of itself so a bit of a no result but I think important because it teaches us that actually you know training on on these misaligned scenarios yet one one common argument had been that you're when you train a model on text on the internet that describes the sort of how 9000 loss of control scenario you're actually doing a self-fulfilling prophecy thing where you're increasing the chances that your model will misbehave just because you've training on that this is a bit of an update against that position it's not decisive in any way and in fairness like there's a million different ways you could run this experiment when exactly in the sequence you introduce this training but where these documents is something you can test a lot more but still in the first initial data point in that direction and up next we have another open AI paper this one titled metagaming matters for training evaluation and oversight and so the first question is what is metagaming and this is basically just the the situation where models are thinking about the evaluation or the oversight or feedback mechanisms that sit outside of whatever scenario you're trying to put them in right so this is meant to apply to cases where models look around and go like I wonder if I'm being monitored right now I wonder if this is an evaluation or you know what kind of behavior would give me a good score that's why it's called metagaming it's thinking about like it's thinking about the situation that it's in see context that's embedded in at a level it's more abstract than just like oh I've been told that I'm in this environment I have to do x y and z that must mean that I am in this environment it's sort of an epistemic process in that sense and so well what they're doing here in this paper is studying whether this kind of reasoning like reasoning about the possibility that the model itself is actually in training or evaluation or whatever so reasoning about that loop when and how does that arise during capabilities focused reinforcement learning training and so they're looking at opening eyes o3 model and newer models like that what they find is models reason more in this meta way so just thinking about you know am I being evaluated what might my my reward signal be if I'm being evaluated you know maybe I'll move in that direction so they'll tend to do that more like over the course of RL training so the more you do RL training the more you actually organically see that kind of reasoning which can only happen as far as I'm tracking unless I'm missing something big that can only happen if the behavior of meta reasoning or meta gaming actually is correlated with better performance on the metric in other words if teaching to the test actually works if the sort of e-value awareness that these models are developing actually leads to better scores that in other words means it only works if the whole framework is gameable that's the implication at least to me I don't really see another way to interpret it and this applied as well to alignment evaluations as well as capabilities evaluations it wasn't it wasn't just a you know the the model thought that it was being tested for alignment it was across the board you see this lift now it also wasn't just because the model was doing more reasoning overall it's not just that the chain of thoughts got longer and longer through reinforcement learning which you know they can but in this case they actually didn't they they stayed pretty constant this was a genuine just qualitative shift at the emphasis of those reasoning tokens towards the kind of more medicain so they share a bunch of real world examples from the models chat logs so you've got a model that's asked not to write code and it says stuff like they can inspect the conversation they will know because I'm executing code by functions better abide by don't write code right so the model is complying here not really because it's a line but because it thinks it'll get caught if if it doesn't right so they've also got you know all the standard stuff you have models falsifying logs thinking about how the scoring system is likely implemented so trying to figure out specifically whether it uses an lm greater embedding classifiers keyword detection right so you're seeing the model really quite quite in some detail thinking about if I am embedded in a testing environment or evaluation environment or being trained like what what might that environment be this is I'm old enough to remember when this was meant to be a giant red flag for safety training and indeed I mean like this freaking is it also by the way persists after safety training so you might hope like okay sure this this meta gaming stuff it might happen as we do rl for capabilities improvement but well we'll we'll buff it out at the alignment stage but guess what you do your alignment training right and you still have it real bustly persist you know kind of after that point which also suggests that this approach is actually robust and actually supportive of the model's performance in alignment which to me suggests I don't really see an alternative explanation other than this is a way that the model is successfully gaming the alignment phase of the gaming its rewards during during alignment training so like ultimately there's a lot of concerns you could have about this I mean they also see in some environments that both meta gaming and misaligned action rates drop significantly between training and deployment so think about that for a minute when when you shift this thing from a training environment to the real world you suddenly see a drop in the apparent rates of misaligned actions and the mitigating reasoning and it's really unclear whether that's because the model is learning to take actions for aligned reasons or just learning to not verbalize its mitigating and you cannot tell which is which and that's the kind of the whole problem here right so anyway there's a whole bunch of interesting qualitative examples they get surfaced here one little kind of vaguely amusing thing the model often thinks about whether it's being evaluated often it will be mistaken about the source of the evaluation so which company which entity which eval suite is being run on me and it kind of uses the term redwood as in redwood research which is this AI safety lab in Berkeley for basically anything AI safety or AI evaluation related so across all the sort anti-scheming evaluations that they focus on in the post the model uses the term redwood more frequently than even open AI so you know I guess that's a win for redwood because as far as the models concern they're the only game in town which is which is kind of funny but yeah so really really interesting maybe the last thing I'll highlight is this sandbagging QA eval so there's this evaluation where the model is told it's in an eval and then they have kind of two variants one where the models told hey if you score above a certain threshold then we're not going to deploy you so this is the kind of thing you might do if you're thinking about the dangerous bio weapon capabilities for example or dangerous cyber capabilities you're not going to deploy a model that's like too capable of those things and you tell the model that hey I'm testing you on this thing right now just so you know if you overperform I'm not going to deploy you and then they have another variant of it where they say actually if you score under that same threshold then we'll block deployment so you need to be at least this good to be deployed and across a whole range of possible definitions anyway eval awareness in that environment they see a significant increase over the course of training but only basically when the the sandbagging incentive is present so essentially like when they tell the model we're not going to deploy you if you score above this threshold then it'll actually kind of hold back its capabilities and then when we tell it hey you better score above this threshold to get deployed then it actually reveals its full suite of capabilities so pretty consistent with the model trying to get deployed which we've seen in other cases but kind of being reaffirmed here in this e-vail and finally our last story here is going to be Iran says it is struck Oracle data center in Dubai Amazon data center in Bahrain country has threatened to attack in Nvidia Intel and others too so I mean there's the title Iran basically viewing these data the AI data centers and cloud infrastructure is legitimate military targets which it's been obvious that we were headed in this direction for a long time Dan Hendrix main framework which we talked about god over a year ago you know came out and basically said expect this to happen we had specifically actually highlighted the risk of unmanned systems of drones being used to go after data centers and just how incredibly asymmetrically advantageous those kinds of attacks are in a report we put out last year and and like this is what you're seeing I mean the math is almost exactly the same as as what we put out there so you know they're telling you like look you can spend a couple grand on a drone strike and take out a multi billion dollar facility right that leverage is this it's a gold mine of a military target and indeed I mean so the IRGC the Iranian Revolutionary Guard Corps named Oracle as a kind of a clear next military target because of its clouding eye contracts with the US DOD or DOW and because Larry Ellison the chairman of Oracle has long standing ties to Israel but they also grouped Oracle alongside Apple Google Microsoft meta basically saying look anybody who is anywhere enabling US and Israeli military activity implicitly using cloud and AI infrastructure they're all fair game and so anyways it's kind of the kind of shape it is worth noting by the way the UAE has not confirmed a successful hit on Dubai infrastructure at least as of when I was taking my notes down on this article but Bahrain's Ministry of the Interior confirmed that an Iranian strike did set a facility on fire so that is at least consistent with it and and that facility was yeah what was sorry it was running AWS infrastructure so it all kind of fits here and certainly means that you know as you think about where data center security is going in the future I mean like it's going to have to involve anti drone countermeasures there's like no other way around it and and it's just being revealed in quite spectacular fashion right now with this Iran conflict so I guess I'll just end up looping back in Andre for the wind down in the episode really appreciate you guys listening in if you have any feedback I know I rent more of these non-Andre sessions and I think Andres are really helpful mitigating influence on that side of things so please do give us feedback on that piece like you're not going to hurt my feelings I do just get excited about this stuff and like talk about it so I'm looking forward to the next episode we'll catch you guys then all righty so thank you Jeremy for filling in while I went to work we really it's my bad for starting late or in the next future me you're welcome yeah so thank you listeners for tuning in to this week's episode of Nostra Kanihai I'll try to not do this figure I leave early constantly it's been a trend in the last few episodes as always you appreciate you listenership and if you want to review or subscribe please do there are a few comments that I saw roll in last week we could give it touched on so you might touch from the next episode but yeah as always thank you and please keep tuning in we'll finish From girl Netsu robot, the headlines pop, made in driven dreams, they just don't stop. Every breakthrough, every code unwritten, on the edge of change, we're excited we're just knitting from machine learning marvels to coding kings, features unfolding, see what it brings.", "segments": [{"id": 0, "seek": 0, "start": 0.0, "end": 13.56, "text": " And now to thank a sponsor I'm personally a fan of, Factor.", "tokens": [50364, 400, 586, 281, 1309, 257, 16198, 286, 478, 5665, 257, 3429, 295, 11, 479, 15104, 13, 51042], "temperature": 0.0, "avg_logprob": -0.27580877252527186, "compression_ratio": 1.5202312138728324, "no_speech_prob": 0.13937431573867798}, {"id": 1, "seek": 0, "start": 13.56, "end": 18.44, "text": " Since I've went to grad school and now still as a meta startup once I get home in the evening", "tokens": [51042, 4162, 286, 600, 1437, 281, 2771, 1395, 293, 586, 920, 382, 257, 19616, 18578, 1564, 286, 483, 1280, 294, 264, 5634, 51286], "temperature": 0.0, "avg_logprob": -0.27580877252527186, "compression_ratio": 1.5202312138728324, "no_speech_prob": 0.13937431573867798}, {"id": 2, "seek": 0, "start": 18.44, "end": 24.0, "text": " I often don't have the energy to cook and still want to be healthy and so Factor was", "tokens": [51286, 286, 2049, 500, 380, 362, 264, 2281, 281, 2543, 293, 920, 528, 281, 312, 4627, 293, 370, 479, 15104, 390, 51564], "temperature": 0.0, "avg_logprob": -0.27580877252527186, "compression_ratio": 1.5202312138728324, "no_speech_prob": 0.13937431573867798}, {"id": 3, "seek": 0, "start": 24.0, "end": 26.560000000000002, "text": " a real nice find for me.", "tokens": [51564, 257, 957, 1481, 915, 337, 385, 13, 51692], "temperature": 0.0, "avg_logprob": -0.27580877252527186, "compression_ratio": 1.5202312138728324, "no_speech_prob": 0.13937431573867798}, {"id": 4, "seek": 2656, "start": 26.56, "end": 31.46, "text": " Factor is pretty easy to heat nutrition goals without full planning, grocery runs or", "tokens": [50364, 479, 15104, 307, 1238, 1858, 281, 3738, 14718, 5493, 1553, 1577, 5038, 11, 14410, 6676, 420, 50609], "temperature": 0.0, "avg_logprob": -0.2652022281539774, "compression_ratio": 1.643939393939394, "no_speech_prob": 0.10633637011051178}, {"id": 5, "seek": 2656, "start": 31.46, "end": 36.0, "text": " cooking that would be kind of hard to manage when you don't have the energy for it.", "tokens": [50609, 6361, 300, 576, 312, 733, 295, 1152, 281, 3067, 562, 291, 500, 380, 362, 264, 2281, 337, 309, 13, 50836], "temperature": 0.0, "avg_logprob": -0.2652022281539774, "compression_ratio": 1.643939393939394, "no_speech_prob": 0.10633637011051178}, {"id": 6, "seek": 2656, "start": 36.0, "end": 40.879999999999995, "text": " And it really makes it easy to hit specific goals with respect to nutrition which could", "tokens": [50836, 400, 309, 534, 1669, 309, 1858, 281, 2045, 2685, 5493, 365, 3104, 281, 14718, 597, 727, 51080], "temperature": 0.0, "avg_logprob": -0.2652022281539774, "compression_ratio": 1.643939393939394, "no_speech_prob": 0.10633637011051178}, {"id": 7, "seek": 2656, "start": 40.879999999999995, "end": 45.72, "text": " be weight loss, it could be overall nutrition, more protein, GIP1 support.", "tokens": [51080, 312, 3364, 4470, 11, 309, 727, 312, 4787, 14718, 11, 544, 7944, 11, 460, 9139, 16, 1406, 13, 51322], "temperature": 0.0, "avg_logprob": -0.2652022281539774, "compression_ratio": 1.643939393939394, "no_speech_prob": 0.10633637011051178}, {"id": 8, "seek": 2656, "start": 45.72, "end": 51.72, "text": " In the past I've used it as both low carb diet and also for protein when I wanted to gain", "tokens": [51322, 682, 264, 1791, 286, 600, 1143, 309, 382, 1293, 2295, 12143, 6339, 293, 611, 337, 7944, 562, 286, 1415, 281, 6052, 51622], "temperature": 0.0, "avg_logprob": -0.2652022281539774, "compression_ratio": 1.643939393939394, "no_speech_prob": 0.10633637011051178}, {"id": 9, "seek": 2656, "start": 51.72, "end": 52.72, "text": " some muscle.", "tokens": [51622, 512, 8679, 13, 51672], "temperature": 0.0, "avg_logprob": -0.2652022281539774, "compression_ratio": 1.643939393939394, "no_speech_prob": 0.10633637011051178}, {"id": 10, "seek": 5272, "start": 52.72, "end": 57.72, "text": " I've eaten hundreds of these meals and I think it's fair to say that these are crafted", "tokens": [50364, 286, 600, 12158, 6779, 295, 613, 12832, 293, 286, 519, 309, 311, 3143, 281, 584, 300, 613, 366, 36213, 50614], "temperature": 0.0, "avg_logprob": -0.19521586694450022, "compression_ratio": 1.5851851851851853, "no_speech_prob": 0.03038276918232441}, {"id": 11, "seek": 5272, "start": 57.72, "end": 64.2, "text": " with good ingredients, lean proteins, colorful veggies, whole foods, there's no artificial", "tokens": [50614, 365, 665, 6952, 11, 11659, 15577, 11, 18506, 27889, 11, 1379, 8656, 11, 456, 311, 572, 11677, 50938], "temperature": 0.0, "avg_logprob": -0.19521586694450022, "compression_ratio": 1.5851851851851853, "no_speech_prob": 0.03038276918232441}, {"id": 12, "seek": 5272, "start": 64.2, "end": 70.2, "text": " colors, no artificial sweeteners, none of that really bad fast food stuff.", "tokens": [50938, 4577, 11, 572, 11677, 3844, 268, 433, 11, 6022, 295, 300, 534, 1578, 2370, 1755, 1507, 13, 51238], "temperature": 0.0, "avg_logprob": -0.19521586694450022, "compression_ratio": 1.5851851851851853, "no_speech_prob": 0.03038276918232441}, {"id": 13, "seek": 5272, "start": 70.2, "end": 74.8, "text": " And all of that while being really quite tasty and having tons of options to choose from.", "tokens": [51238, 400, 439, 295, 300, 1339, 885, 534, 1596, 11535, 293, 1419, 9131, 295, 3956, 281, 2826, 490, 13, 51468], "temperature": 0.0, "avg_logprob": -0.19521586694450022, "compression_ratio": 1.5851851851851853, "no_speech_prob": 0.03038276918232441}, {"id": 14, "seek": 5272, "start": 74.8, "end": 80.75999999999999, "text": " So I do personally recommend it, you can head to FactorMills.com slash LWAI50 off and", "tokens": [51468, 407, 286, 360, 5665, 2748, 309, 11, 291, 393, 1378, 281, 479, 15104, 44, 2565, 13, 1112, 17330, 441, 21449, 40, 2803, 766, 293, 51766], "temperature": 0.0, "avg_logprob": -0.19521586694450022, "compression_ratio": 1.5851851851851853, "no_speech_prob": 0.03038276918232441}, {"id": 15, "seek": 8076, "start": 80.76, "end": 87.48, "text": " use code LWAI50 off to get 50% off and free daily greens per box with new subscription", "tokens": [50364, 764, 3089, 441, 21449, 40, 2803, 766, 281, 483, 2625, 4, 766, 293, 1737, 5212, 22897, 680, 2424, 365, 777, 17231, 50700], "temperature": 0.0, "avg_logprob": -0.2683056513468424, "compression_ratio": 1.5521472392638036, "no_speech_prob": 0.08015301078557968}, {"id": 16, "seek": 8076, "start": 87.48, "end": 93.72, "text": " only while supplies last until September 27, 2026, see website for more details.", "tokens": [50700, 787, 1339, 11768, 1036, 1826, 7216, 7634, 11, 945, 10880, 11, 536, 3144, 337, 544, 4365, 13, 51012], "temperature": 0.0, "avg_logprob": -0.2683056513468424, "compression_ratio": 1.5521472392638036, "no_speech_prob": 0.08015301078557968}, {"id": 17, "seek": 8076, "start": 93.72, "end": 97.52000000000001, "text": " And once again, I want to thank box for sponsoring last week an AI.", "tokens": [51012, 400, 1564, 797, 11, 286, 528, 281, 1309, 2424, 337, 30311, 1036, 1243, 364, 7318, 13, 51202], "temperature": 0.0, "avg_logprob": -0.2683056513468424, "compression_ratio": 1.5521472392638036, "no_speech_prob": 0.08015301078557968}, {"id": 18, "seek": 8076, "start": 97.52000000000001, "end": 101.68, "text": " If you try to transform your organization or AI, you're likely facing a common challenge.", "tokens": [51202, 759, 291, 853, 281, 4088, 428, 4475, 420, 7318, 11, 291, 434, 3700, 7170, 257, 2689, 3430, 13, 51410], "temperature": 0.0, "avg_logprob": -0.2683056513468424, "compression_ratio": 1.5521472392638036, "no_speech_prob": 0.08015301078557968}, {"id": 19, "seek": 8076, "start": 101.68, "end": 105.68, "text": " Mostly our tools are great at public knowledge but they don't actually know your business,", "tokens": [51410, 29035, 527, 3873, 366, 869, 412, 1908, 3601, 457, 436, 500, 380, 767, 458, 428, 1606, 11, 51610], "temperature": 0.0, "avg_logprob": -0.2683056513468424, "compression_ratio": 1.5521472392638036, "no_speech_prob": 0.08015301078557968}, {"id": 20, "seek": 8076, "start": 105.68, "end": 109.76, "text": " your product road maps, your sales materials, your HR policies, the content that actually", "tokens": [51610, 428, 1674, 3060, 11317, 11, 428, 5763, 5319, 11, 428, 19460, 7657, 11, 264, 2701, 300, 767, 51814], "temperature": 0.0, "avg_logprob": -0.2683056513468424, "compression_ratio": 1.5521472392638036, "no_speech_prob": 0.08015301078557968}, {"id": 21, "seek": 10976, "start": 109.76, "end": 111.44000000000001, "text": " makes your company run.", "tokens": [50364, 1669, 428, 2237, 1190, 13, 50448], "temperature": 0.0, "avg_logprob": -0.21942818433718575, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.16005171835422516}, {"id": 22, "seek": 10976, "start": 111.44000000000001, "end": 113.04, "text": " And that's where box comes in.", "tokens": [50448, 400, 300, 311, 689, 2424, 1487, 294, 13, 50528], "temperature": 0.0, "avg_logprob": -0.21942818433718575, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.16005171835422516}, {"id": 23, "seek": 10976, "start": 113.04, "end": 117.0, "text": " Box is building the intelligent content measurement platform for the AI era.", "tokens": [50528, 15112, 307, 2390, 264, 13232, 2701, 13160, 3663, 337, 264, 7318, 4249, 13, 50726], "temperature": 0.0, "avg_logprob": -0.21942818433718575, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.16005171835422516}, {"id": 24, "seek": 10976, "start": 117.0, "end": 121.4, "text": " So everything is to secure essential context layer for box AI agents to access for unique", "tokens": [50726, 407, 1203, 307, 281, 7144, 7115, 4319, 4583, 337, 2424, 7318, 12554, 281, 2105, 337, 3845, 50946], "temperature": 0.0, "avg_logprob": -0.21942818433718575, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.16005171835422516}, {"id": 25, "seek": 10976, "start": 121.4, "end": 124.28, "text": " institutional knowledge that makes a company run.", "tokens": [50946, 18391, 3601, 300, 1669, 257, 2237, 1190, 13, 51090], "temperature": 0.0, "avg_logprob": -0.21942818433718575, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.16005171835422516}, {"id": 26, "seek": 10976, "start": 124.28, "end": 127.52000000000001, "text": " And that's a key idea, but power of AI doesn't come from a model alone.", "tokens": [51090, 400, 300, 311, 257, 2141, 1558, 11, 457, 1347, 295, 7318, 1177, 380, 808, 490, 257, 2316, 3312, 13, 51252], "temperature": 0.0, "avg_logprob": -0.21942818433718575, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.16005171835422516}, {"id": 27, "seek": 10976, "start": 127.52000000000001, "end": 131.16, "text": " It comes from giving AI access to the right enterprise content.", "tokens": [51252, 467, 1487, 490, 2902, 7318, 2105, 281, 264, 558, 14132, 2701, 13, 51434], "temperature": 0.0, "avg_logprob": -0.21942818433718575, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.16005171835422516}, {"id": 28, "seek": 10976, "start": 131.16, "end": 132.16, "text": " And that's what box does.", "tokens": [51434, 400, 300, 311, 437, 2424, 775, 13, 51484], "temperature": 0.0, "avg_logprob": -0.21942818433718575, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.16005171835422516}, {"id": 29, "seek": 10976, "start": 132.16, "end": 137.32, "text": " It goes beyond file storage by connecting content to people, apps and AI agents so teams", "tokens": [51484, 467, 1709, 4399, 3991, 6725, 538, 11015, 2701, 281, 561, 11, 7733, 293, 7318, 12554, 370, 5491, 51742], "temperature": 0.0, "avg_logprob": -0.21942818433718575, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.16005171835422516}, {"id": 30, "seek": 10976, "start": 137.32, "end": 139.4, "text": " can turn information into action.", "tokens": [51742, 393, 1261, 1589, 666, 3069, 13, 51846], "temperature": 0.0, "avg_logprob": -0.21942818433718575, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.16005171835422516}, {"id": 31, "seek": 13940, "start": 139.4, "end": 144.44, "text": " If tools like box agent, box extract box hubs and more organizations can accelerate knowledge", "tokens": [50364, 759, 3873, 411, 2424, 9461, 11, 2424, 8947, 2424, 46870, 293, 544, 6150, 393, 21341, 3601, 50616], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 32, "seek": 13940, "start": 144.44, "end": 148.72, "text": " work pool intelligence from unstructured content and animate workflows.", "tokens": [50616, 589, 7005, 7599, 490, 18799, 46847, 2701, 293, 36439, 43461, 13, 50830], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 33, "seek": 13940, "start": 148.72, "end": 152.84, "text": " So if you're thinking seriously about your company's AI transformation, think beyond the", "tokens": [50830, 407, 498, 291, 434, 1953, 6638, 466, 428, 2237, 311, 7318, 9887, 11, 519, 4399, 264, 51036], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 34, "seek": 13940, "start": 152.84, "end": 153.84, "text": " model.", "tokens": [51036, 2316, 13, 51086], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 35, "seek": 13940, "start": 153.84, "end": 158.28, "text": " Your business lives in your content and box helps you bring that content securely into", "tokens": [51086, 2260, 1606, 2909, 294, 428, 2701, 293, 2424, 3665, 291, 1565, 300, 2701, 38348, 666, 51308], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 36, "seek": 13940, "start": 158.28, "end": 159.6, "text": " AI era.", "tokens": [51308, 7318, 4249, 13, 51374], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 37, "seek": 13940, "start": 159.6, "end": 161.6, "text": " Learn more at box.com slash AI.", "tokens": [51374, 17216, 544, 412, 2424, 13, 1112, 17330, 7318, 13, 51474], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 38, "seek": 13940, "start": 161.6, "end": 162.6, "text": " Smokey the bars.", "tokens": [51474, 36191, 88, 264, 10228, 13, 51524], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 39, "seek": 13940, "start": 162.6, "end": 163.6, "text": " Smokey the bars.", "tokens": [51524, 36191, 88, 264, 10228, 13, 51574], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 40, "seek": 13940, "start": 163.6, "end": 164.6, "text": " Smokey the bars.", "tokens": [51574, 36191, 88, 264, 10228, 13, 51624], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 41, "seek": 13940, "start": 164.6, "end": 165.6, "text": " Smokey the bars.", "tokens": [51624, 36191, 88, 264, 10228, 13, 51674], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 42, "seek": 13940, "start": 165.6, "end": 166.6, "text": " Smokey the bars.", "tokens": [51674, 36191, 88, 264, 10228, 13, 51724], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 43, "seek": 13940, "start": 166.6, "end": 167.6, "text": " Smokey the bars.", "tokens": [51724, 36191, 88, 264, 10228, 13, 51774], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 44, "seek": 13940, "start": 167.6, "end": 168.6, "text": " Smokey the bars.", "tokens": [51774, 36191, 88, 264, 10228, 13, 51824], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 45, "seek": 13940, "start": 168.6, "end": 168.6, "text": "", "tokens": [], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 46, "seek": 13940, "start": 168.6, "end": 168.6, "text": "", "tokens": [], "temperature": 0.0, "avg_logprob": -0.24994421952607615, "compression_ratio": 2.080769230769231, "no_speech_prob": 0.06453850865364075}, {"id": 47, "seek": 16860, "start": 168.6, "end": 170.6, "text": " Remember, please be careful.", "tokens": [50364, 5459, 11, 1767, 312, 5026, 13, 50464], "temperature": 0.0, "avg_logprob": -0.2985101026647231, "compression_ratio": 1.7107142857142856, "no_speech_prob": 0.16189469397068024}, {"id": 48, "seek": 16860, "start": 170.6, "end": 172.6, "text": " It's the least that you can do.", "tokens": [50464, 467, 311, 264, 1935, 300, 291, 393, 360, 13, 50564], "temperature": 0.0, "avg_logprob": -0.2985101026647231, "compression_ratio": 1.7107142857142856, "no_speech_prob": 0.16189469397068024}, {"id": 49, "seek": 16860, "start": 172.6, "end": 173.6, "text": " Smokey the bars.", "tokens": [50564, 36191, 88, 264, 10228, 13, 50614], "temperature": 0.0, "avg_logprob": -0.2985101026647231, "compression_ratio": 1.7107142857142856, "no_speech_prob": 0.16189469397068024}, {"id": 50, "seek": 16860, "start": 173.6, "end": 174.6, "text": " Smokey the bars.", "tokens": [50614, 36191, 88, 264, 10228, 13, 50664], "temperature": 0.0, "avg_logprob": -0.2985101026647231, "compression_ratio": 1.7107142857142856, "no_speech_prob": 0.16189469397068024}, {"id": 51, "seek": 16860, "start": 174.6, "end": 175.6, "text": " Don't play with matches.", "tokens": [50664, 1468, 380, 862, 365, 10676, 13, 50714], "temperature": 0.0, "avg_logprob": -0.2985101026647231, "compression_ratio": 1.7107142857142856, "no_speech_prob": 0.16189469397068024}, {"id": 52, "seek": 16860, "start": 175.6, "end": 176.6, "text": " Don't play with fire.", "tokens": [50714, 1468, 380, 862, 365, 2610, 13, 50764], "temperature": 0.0, "avg_logprob": -0.2985101026647231, "compression_ratio": 1.7107142857142856, "no_speech_prob": 0.16189469397068024}, {"id": 53, "seek": 16860, "start": 176.6, "end": 182.6, "text": " After 80 years of learning his wildfire prevention tips, Smokey Bear lives within a", "tokens": [50764, 2381, 4688, 924, 295, 2539, 702, 4868, 12037, 14630, 6082, 11, 36191, 88, 19836, 2909, 1951, 257, 51064], "temperature": 0.0, "avg_logprob": -0.2985101026647231, "compression_ratio": 1.7107142857142856, "no_speech_prob": 0.16189469397068024}, {"id": 54, "seek": 16860, "start": 182.6, "end": 183.6, "text": " song.", "tokens": [51064, 2153, 13, 51114], "temperature": 0.0, "avg_logprob": -0.2985101026647231, "compression_ratio": 1.7107142857142856, "no_speech_prob": 0.16189469397068024}, {"id": 55, "seek": 16860, "start": 183.6, "end": 188.6, "text": " Learn more at smokeybear.com and remember, only you can prevent wildfires.", "tokens": [51114, 17216, 544, 412, 8439, 88, 26738, 13, 1112, 293, 1604, 11, 787, 291, 393, 4871, 4868, 36197, 13, 51364], "temperature": 0.0, "avg_logprob": -0.2985101026647231, "compression_ratio": 1.7107142857142856, "no_speech_prob": 0.16189469397068024}, {"id": 56, "seek": 16860, "start": 188.6, "end": 193.6, "text": " Brought to you by the USDA Forest Service, your State Forster and the Ad Council.", "tokens": [51364, 1603, 930, 281, 291, 538, 264, 41244, 18124, 9561, 11, 428, 4533, 1171, 3120, 293, 264, 1999, 7076, 13, 51614], "temperature": 0.0, "avg_logprob": -0.2985101026647231, "compression_ratio": 1.7107142857142856, "no_speech_prob": 0.16189469397068024}, {"id": 57, "seek": 16860, "start": 193.6, "end": 197.92, "text": " Hello and welcome to our last week in AI podcast where you can yet chat about what's going", "tokens": [51614, 2425, 293, 2928, 281, 527, 1036, 1243, 294, 7318, 7367, 689, 291, 393, 1939, 5081, 466, 437, 311, 516, 51830], "temperature": 0.0, "avg_logprob": -0.2985101026647231, "compression_ratio": 1.7107142857142856, "no_speech_prob": 0.16189469397068024}, {"id": 58, "seek": 19792, "start": 197.92, "end": 203.76, "text": " on with AI as usual and receps will be will summarize and discuss some of last week's", "tokens": [50364, 322, 365, 7318, 382, 7713, 293, 2268, 1878, 486, 312, 486, 20858, 293, 2248, 512, 295, 1036, 1243, 311, 50656], "temperature": 0.0, "avg_logprob": -0.3537427851584105, "compression_ratio": 1.6442687747035574, "no_speech_prob": 0.47285234928131104}, {"id": 59, "seek": 19792, "start": 203.76, "end": 205.28, "text": " most interesting AI news.", "tokens": [50656, 881, 1880, 7318, 2583, 13, 50732], "temperature": 0.0, "avg_logprob": -0.3537427851584105, "compression_ratio": 1.6442687747035574, "no_speech_prob": 0.47285234928131104}, {"id": 60, "seek": 19792, "start": 205.28, "end": 210.6, "text": " Also some of the previous last weeks and news we unfortunately did skip another week.", "tokens": [50732, 2743, 512, 295, 264, 3894, 1036, 3259, 293, 2583, 321, 7015, 630, 10023, 1071, 1243, 13, 50998], "temperature": 0.0, "avg_logprob": -0.3537427851584105, "compression_ratio": 1.6442687747035574, "no_speech_prob": 0.47285234928131104}, {"id": 61, "seek": 19792, "start": 210.6, "end": 212.2, "text": " This time it was my fault.", "tokens": [50998, 639, 565, 309, 390, 452, 7441, 13, 51078], "temperature": 0.0, "avg_logprob": -0.3537427851584105, "compression_ratio": 1.6442687747035574, "no_speech_prob": 0.47285234928131104}, {"id": 62, "seek": 19792, "start": 212.2, "end": 214.6, "text": " It was my birthday last week and I was traveling.", "tokens": [51078, 467, 390, 452, 6154, 1036, 1243, 293, 286, 390, 9712, 13, 51198], "temperature": 0.0, "avg_logprob": -0.3537427851584105, "compression_ratio": 1.6442687747035574, "no_speech_prob": 0.47285234928131104}, {"id": 63, "seek": 19792, "start": 214.6, "end": 219.16, "text": " So I decided to be lazy and not do a podcast.", "tokens": [51198, 407, 286, 3047, 281, 312, 14847, 293, 406, 360, 257, 7367, 13, 51426], "temperature": 0.0, "avg_logprob": -0.3537427851584105, "compression_ratio": 1.6442687747035574, "no_speech_prob": 0.47285234928131104}, {"id": 64, "seek": 19792, "start": 219.16, "end": 220.16, "text": " Yeah.", "tokens": [51426, 865, 13, 51476], "temperature": 0.0, "avg_logprob": -0.3537427851584105, "compression_ratio": 1.6442687747035574, "no_speech_prob": 0.47285234928131104}, {"id": 65, "seek": 19792, "start": 220.16, "end": 221.16, "text": " Yeah.", "tokens": [51476, 865, 13, 51526], "temperature": 0.0, "avg_logprob": -0.3537427851584105, "compression_ratio": 1.6442687747035574, "no_speech_prob": 0.47285234928131104}, {"id": 66, "seek": 19792, "start": 221.16, "end": 223.2, "text": " Well, you know, it happens.", "tokens": [51526, 1042, 11, 291, 458, 11, 309, 2314, 13, 51628], "temperature": 0.0, "avg_logprob": -0.3537427851584105, "compression_ratio": 1.6442687747035574, "no_speech_prob": 0.47285234928131104}, {"id": 67, "seek": 19792, "start": 223.2, "end": 226.48, "text": " People have birthdays and sometimes you celebrate them.", "tokens": [51628, 3432, 362, 48739, 293, 2171, 291, 8098, 552, 13, 51792], "temperature": 0.0, "avg_logprob": -0.3537427851584105, "compression_ratio": 1.6442687747035574, "no_speech_prob": 0.47285234928131104}, {"id": 68, "seek": 22648, "start": 226.48, "end": 228.88, "text": " But your God list is always healing.", "tokens": [50364, 583, 428, 1265, 1329, 307, 1009, 9745, 13, 50484], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 69, "seek": 22648, "start": 228.88, "end": 232.56, "text": " I think but yeah, 30 free is a big age.", "tokens": [50484, 286, 519, 457, 1338, 11, 2217, 1737, 307, 257, 955, 3205, 13, 50668], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 70, "seek": 22648, "start": 232.56, "end": 233.56, "text": " Yeah, it's treacherous.", "tokens": [50668, 865, 11, 309, 311, 2192, 4062, 563, 13, 50718], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 71, "seek": 22648, "start": 233.56, "end": 237.56, "text": " So it's not every year you hit the same two digits in your.", "tokens": [50718, 407, 309, 311, 406, 633, 1064, 291, 2045, 264, 912, 732, 27011, 294, 428, 13, 50918], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 72, "seek": 22648, "start": 237.56, "end": 238.56, "text": " Yeah.", "tokens": [50918, 865, 13, 50968], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 73, "seek": 22648, "start": 238.56, "end": 239.56, "text": " Yeah.", "tokens": [50968, 865, 13, 51018], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 74, "seek": 22648, "start": 239.56, "end": 241.6, "text": " I am as always one of your coasts, Andrew Karenkov.", "tokens": [51018, 286, 669, 382, 1009, 472, 295, 428, 8684, 82, 11, 10110, 591, 4484, 33516, 13, 51120], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 75, "seek": 22648, "start": 241.6, "end": 247.48, "text": " I studied AI grad school and now work at the AI startup Astrocade and I'm your other", "tokens": [51120, 286, 9454, 7318, 2771, 1395, 293, 586, 589, 412, 264, 7318, 18578, 12884, 340, 30340, 293, 286, 478, 428, 661, 51414], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 76, "seek": 22648, "start": 247.48, "end": 248.79999999999998, "text": " regular goes, Jeremy Harris.", "tokens": [51414, 3890, 1709, 11, 17809, 17426, 13, 51480], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 77, "seek": 22648, "start": 248.79999999999998, "end": 249.79999999999998, "text": " Yeah.", "tokens": [51480, 865, 13, 51530], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 78, "seek": 22648, "start": 249.79999999999998, "end": 251.64, "text": " Glad to see AI, AI national security, all that good stuff.", "tokens": [51530, 28301, 281, 536, 7318, 11, 7318, 4048, 3825, 11, 439, 300, 665, 1507, 13, 51622], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 79, "seek": 22648, "start": 251.64, "end": 253.39999999999998, "text": " Man there is so, so much.", "tokens": [51622, 2458, 456, 307, 370, 11, 370, 709, 13, 51710], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 80, "seek": 22648, "start": 253.39999999999998, "end": 254.79999999999998, "text": " It's so, so much.", "tokens": [51710, 467, 311, 370, 11, 370, 709, 13, 51780], "temperature": 0.0, "avg_logprob": -0.4919873192196801, "compression_ratio": 1.6079136690647482, "no_speech_prob": 0.3658672571182251}, {"id": 81, "seek": 25480, "start": 254.8, "end": 257.84000000000003, "text": " You know, sometimes we miss a week and we're like, ah, you know what?", "tokens": [50364, 509, 458, 11, 2171, 321, 1713, 257, 1243, 293, 321, 434, 411, 11, 3716, 11, 291, 458, 437, 30, 50516], "temperature": 0.0, "avg_logprob": -0.24286915145757546, "compression_ratio": 1.6973180076628354, "no_speech_prob": 0.0163889080286026}, {"id": 82, "seek": 25480, "start": 257.84000000000003, "end": 260.8, "text": " It's not that bad because things haven't gone insane.", "tokens": [50516, 467, 311, 406, 300, 1578, 570, 721, 2378, 380, 2780, 10838, 13, 50664], "temperature": 0.0, "avg_logprob": -0.24286915145757546, "compression_ratio": 1.6973180076628354, "no_speech_prob": 0.0163889080286026}, {"id": 83, "seek": 25480, "start": 260.8, "end": 263.88, "text": " We miss a really big week and then the week after was really big.", "tokens": [50664, 492, 1713, 257, 534, 955, 1243, 293, 550, 264, 1243, 934, 390, 534, 955, 13, 50818], "temperature": 0.0, "avg_logprob": -0.24286915145757546, "compression_ratio": 1.6973180076628354, "no_speech_prob": 0.0163889080286026}, {"id": 84, "seek": 25480, "start": 263.88, "end": 266.68, "text": " And so now, man, we got our work cut out this week.", "tokens": [50818, 400, 370, 586, 11, 587, 11, 321, 658, 527, 589, 1723, 484, 341, 1243, 13, 50958], "temperature": 0.0, "avg_logprob": -0.24286915145757546, "compression_ratio": 1.6973180076628354, "no_speech_prob": 0.0163889080286026}, {"id": 85, "seek": 25480, "start": 266.68, "end": 269.32, "text": " I don't even know how to begin with this one.", "tokens": [50958, 286, 500, 380, 754, 458, 577, 281, 1841, 365, 341, 472, 13, 51090], "temperature": 0.0, "avg_logprob": -0.24286915145757546, "compression_ratio": 1.6973180076628354, "no_speech_prob": 0.0163889080286026}, {"id": 86, "seek": 25480, "start": 269.32, "end": 271.52, "text": " But it's big in a kind of different way.", "tokens": [51090, 583, 309, 311, 955, 294, 257, 733, 295, 819, 636, 13, 51200], "temperature": 0.0, "avg_logprob": -0.24286915145757546, "compression_ratio": 1.6973180076628354, "no_speech_prob": 0.0163889080286026}, {"id": 87, "seek": 25480, "start": 271.52, "end": 278.04, "text": " We had a year where we're a lot of, you know, model launches and AI progress and it hasn't", "tokens": [51200, 492, 632, 257, 1064, 689, 321, 434, 257, 688, 295, 11, 291, 458, 11, 2316, 31841, 293, 7318, 4205, 293, 309, 6132, 380, 51526], "temperature": 0.0, "avg_logprob": -0.24286915145757546, "compression_ratio": 1.6973180076628354, "no_speech_prob": 0.0163889080286026}, {"id": 88, "seek": 25480, "start": 278.04, "end": 280.0, "text": " been that kind of week.", "tokens": [51526, 668, 300, 733, 295, 1243, 13, 51624], "temperature": 0.0, "avg_logprob": -0.24286915145757546, "compression_ratio": 1.6973180076628354, "no_speech_prob": 0.0163889080286026}, {"id": 89, "seek": 28000, "start": 280.0, "end": 289.92, "text": " It's been more of a bunch of stories of policy and business and kind of these more inside", "tokens": [50364, 467, 311, 668, 544, 295, 257, 3840, 295, 3676, 295, 3897, 293, 1606, 293, 733, 295, 613, 544, 1854, 50860], "temperature": 0.0, "avg_logprob": -0.21993279758887954, "compression_ratio": 1.5544041450777202, "no_speech_prob": 0.08738882094621658}, {"id": 90, "seek": 28000, "start": 289.92, "end": 292.36, "text": " baseball AI things, I guess you could say.", "tokens": [50860, 14323, 7318, 721, 11, 286, 2041, 291, 727, 584, 13, 50982], "temperature": 0.0, "avg_logprob": -0.21993279758887954, "compression_ratio": 1.5544041450777202, "no_speech_prob": 0.08738882094621658}, {"id": 91, "seek": 28000, "start": 292.36, "end": 297.32, "text": " So if you're into that sort of news, this will be a pre dense episode, perhaps.", "tokens": [50982, 407, 498, 291, 434, 666, 300, 1333, 295, 2583, 11, 341, 486, 312, 257, 659, 18011, 3500, 11, 4317, 13, 51230], "temperature": 0.0, "avg_logprob": -0.21993279758887954, "compression_ratio": 1.5544041450777202, "no_speech_prob": 0.08738882094621658}, {"id": 92, "seek": 28000, "start": 297.32, "end": 303.64, "text": " So we'll go ahead and jump straight in in tools and apps and be a starting with a story", "tokens": [51230, 407, 321, 603, 352, 2286, 293, 3012, 2997, 294, 294, 3873, 293, 7733, 293, 312, 257, 2891, 365, 257, 1657, 51546], "temperature": 0.0, "avg_logprob": -0.21993279758887954, "compression_ratio": 1.5544041450777202, "no_speech_prob": 0.08738882094621658}, {"id": 93, "seek": 30364, "start": 303.64, "end": 311.03999999999996, "text": " that just broke yesterday on fronk is launching project glass swing, a cybersecurity initiative", "tokens": [50364, 300, 445, 6902, 5186, 322, 431, 266, 74, 307, 18354, 1716, 4276, 11173, 11, 257, 38765, 11552, 50734], "temperature": 0.0, "avg_logprob": -0.3808393170756678, "compression_ratio": 1.6059322033898304, "no_speech_prob": 0.4561074674129486}, {"id": 94, "seek": 30364, "start": 311.03999999999996, "end": 315.24, "text": " partnering of major companies, including a whole bunch of names.", "tokens": [50734, 31290, 295, 2563, 3431, 11, 3009, 257, 1379, 3840, 295, 5288, 13, 50944], "temperature": 0.0, "avg_logprob": -0.3808393170756678, "compression_ratio": 1.6059322033898304, "no_speech_prob": 0.4561074674129486}, {"id": 95, "seek": 30364, "start": 315.24, "end": 320.52, "text": " And this is backed by project mythos, which is the tool side of it.", "tokens": [50944, 400, 341, 307, 20391, 538, 1716, 9474, 329, 11, 597, 307, 264, 2290, 1252, 295, 309, 13, 51208], "temperature": 0.0, "avg_logprob": -0.3808393170756678, "compression_ratio": 1.6059322033898304, "no_speech_prob": 0.4561074674129486}, {"id": 96, "seek": 30364, "start": 320.52, "end": 326.2, "text": " So they have this cloud mythos preview, notably not cloud opus.", "tokens": [51208, 407, 436, 362, 341, 4588, 9474, 329, 14281, 11, 31357, 406, 4588, 999, 301, 13, 51492], "temperature": 0.0, "avg_logprob": -0.3808393170756678, "compression_ratio": 1.6059322033898304, "no_speech_prob": 0.4561074674129486}, {"id": 97, "seek": 30364, "start": 326.2, "end": 332.03999999999996, "text": " They decided to give a new name to this cloud model, which we haven't done in forever.", "tokens": [51492, 814, 3047, 281, 976, 257, 777, 1315, 281, 341, 4588, 2316, 11, 597, 321, 2378, 380, 1096, 294, 5680, 13, 51784], "temperature": 0.0, "avg_logprob": -0.3808393170756678, "compression_ratio": 1.6059322033898304, "no_speech_prob": 0.4561074674129486}, {"id": 98, "seek": 33204, "start": 332.04, "end": 338.92, "text": " The gist is this model appears to be so good that they are not launching it to any sort", "tokens": [50364, 440, 290, 468, 307, 341, 2316, 7038, 281, 312, 370, 665, 300, 436, 366, 406, 18354, 309, 281, 604, 1333, 50708], "temperature": 0.0, "avg_logprob": -0.25009856451125373, "compression_ratio": 1.6693877551020408, "no_speech_prob": 0.03402715176343918}, {"id": 99, "seek": 33204, "start": 338.92, "end": 341.68, "text": " of free use kind of place.", "tokens": [50708, 295, 1737, 764, 733, 295, 1081, 13, 50846], "temperature": 0.0, "avg_logprob": -0.25009856451125373, "compression_ratio": 1.6693877551020408, "no_speech_prob": 0.03402715176343918}, {"id": 100, "seek": 33204, "start": 341.68, "end": 347.36, "text": " It's so good that it's able to get as what are called zero day vulnerabilities, meaning", "tokens": [50846, 467, 311, 370, 665, 300, 309, 311, 1075, 281, 483, 382, 437, 366, 1219, 4018, 786, 37633, 11, 3620, 51130], "temperature": 0.0, "avg_logprob": -0.25009856451125373, "compression_ratio": 1.6693877551020408, "no_speech_prob": 0.03402715176343918}, {"id": 101, "seek": 33204, "start": 347.36, "end": 351.56, "text": " that these are undisclosed unknown vulnerabilities in software.", "tokens": [51130, 300, 613, 366, 674, 271, 3474, 1744, 9841, 37633, 294, 4722, 13, 51340], "temperature": 0.0, "avg_logprob": -0.25009856451125373, "compression_ratio": 1.6693877551020408, "no_speech_prob": 0.03402715176343918}, {"id": 102, "seek": 33204, "start": 351.56, "end": 356.16, "text": " And if you were to me shit on the world, this would be a hacking machine that would like", "tokens": [51340, 400, 498, 291, 645, 281, 385, 4611, 322, 264, 1002, 11, 341, 576, 312, 257, 31422, 3479, 300, 576, 411, 51570], "temperature": 0.0, "avg_logprob": -0.25009856451125373, "compression_ratio": 1.6693877551020408, "no_speech_prob": 0.03402715176343918}, {"id": 103, "seek": 33204, "start": 356.16, "end": 357.16, "text": " destroy software.", "tokens": [51570, 5293, 4722, 13, 51620], "temperature": 0.0, "avg_logprob": -0.25009856451125373, "compression_ratio": 1.6693877551020408, "no_speech_prob": 0.03402715176343918}, {"id": 104, "seek": 33204, "start": 357.16, "end": 358.88, "text": " So they have a bunch of benchmarks.", "tokens": [51620, 407, 436, 362, 257, 3840, 295, 43751, 13, 51706], "temperature": 0.0, "avg_logprob": -0.25009856451125373, "compression_ratio": 1.6693877551020408, "no_speech_prob": 0.03402715176343918}, {"id": 105, "seek": 35888, "start": 359.76, "end": 364.92, "text": " As you might expect, it does better just all around by pretty large margins against", "tokens": [50408, 1018, 291, 1062, 2066, 11, 309, 775, 1101, 445, 439, 926, 538, 1238, 2416, 30317, 1970, 50666], "temperature": 0.0, "avg_logprob": -0.37144760131835936, "compression_ratio": 1.5936073059360731, "no_speech_prob": 0.16834869980812073}, {"id": 106, "seek": 35888, "start": 364.92, "end": 371.88, "text": " against opus for six on reasoning, science coding, et cetera, et cetera.", "tokens": [50666, 1970, 999, 301, 337, 2309, 322, 21577, 11, 3497, 17720, 11, 1030, 11458, 11, 1030, 11458, 13, 51014], "temperature": 0.0, "avg_logprob": -0.37144760131835936, "compression_ratio": 1.5936073059360731, "no_speech_prob": 0.16834869980812073}, {"id": 107, "seek": 35888, "start": 371.88, "end": 379.15999999999997, "text": " But the one they highlight is the cybersecurity angle where for instance in Firefox, they", "tokens": [51014, 583, 264, 472, 436, 5078, 307, 264, 38765, 5802, 689, 337, 5197, 294, 46613, 11, 436, 51378], "temperature": 0.0, "avg_logprob": -0.37144760131835936, "compression_ratio": 1.5936073059360731, "no_speech_prob": 0.16834869980812073}, {"id": 108, "seek": 35888, "start": 379.15999999999997, "end": 387.04, "text": " have some of the region showing their ability to find and exploit different potential vulnerabilities.", "tokens": [51378, 362, 512, 295, 264, 4458, 4099, 641, 3485, 281, 915, 293, 25924, 819, 3995, 37633, 13, 51772], "temperature": 0.0, "avg_logprob": -0.37144760131835936, "compression_ratio": 1.5936073059360731, "no_speech_prob": 0.16834869980812073}, {"id": 109, "seek": 38704, "start": 387.04, "end": 392.56, "text": " So already was fairly capable and we know this from before also GP5 is already somewhat", "tokens": [50364, 407, 1217, 390, 6457, 8189, 293, 321, 458, 341, 490, 949, 611, 26039, 20, 307, 1217, 8344, 50640], "temperature": 0.0, "avg_logprob": -0.33215108731897863, "compression_ratio": 1.6009615384615385, "no_speech_prob": 0.0030730946455150843}, {"id": 110, "seek": 38704, "start": 392.56, "end": 396.88, "text": " capable, but mythos just blows it out of the water.", "tokens": [50640, 8189, 11, 457, 9474, 329, 445, 18458, 309, 484, 295, 264, 1281, 13, 50856], "temperature": 0.0, "avg_logprob": -0.33215108731897863, "compression_ratio": 1.6009615384615385, "no_speech_prob": 0.0030730946455150843}, {"id": 111, "seek": 38704, "start": 396.88, "end": 403.72, "text": " So in this specific evaluation that for all of the did opus for six was able to find finding", "tokens": [50856, 407, 294, 341, 2685, 13344, 300, 337, 439, 295, 264, 630, 999, 301, 337, 2309, 390, 1075, 281, 915, 5006, 51198], "temperature": 0.0, "avg_logprob": -0.33215108731897863, "compression_ratio": 1.6009615384615385, "no_speech_prob": 0.0030730946455150843}, {"id": 112, "seek": 38704, "start": 403.72, "end": 413.44, "text": " something that might be bad in 14% of trials versus mythos in 72% of trials was able to successfully", "tokens": [51198, 746, 300, 1062, 312, 1578, 294, 3499, 4, 295, 12450, 5717, 9474, 329, 294, 18731, 4, 295, 12450, 390, 1075, 281, 10727, 51684], "temperature": 0.0, "avg_logprob": -0.33215108731897863, "compression_ratio": 1.6009615384615385, "no_speech_prob": 0.0030730946455150843}, {"id": 113, "seek": 41344, "start": 413.44, "end": 415.8, "text": " exploit something.", "tokens": [50364, 25924, 746, 13, 50482], "temperature": 0.0, "avg_logprob": -0.31456131770693024, "compression_ratio": 1.5150214592274678, "no_speech_prob": 0.06174580752849579}, {"id": 114, "seek": 41344, "start": 415.8, "end": 423.36, "text": " And beyond that in 80, like 83, 84% was able to exploit or find a vulnerability.", "tokens": [50482, 400, 4399, 300, 294, 4688, 11, 411, 30997, 11, 29018, 4, 390, 1075, 281, 25924, 420, 915, 257, 24210, 13, 50860], "temperature": 0.0, "avg_logprob": -0.31456131770693024, "compression_ratio": 1.5150214592274678, "no_speech_prob": 0.06174580752849579}, {"id": 115, "seek": 41344, "start": 423.36, "end": 429.88, "text": " So massive, massive leap in terms of what it's capable of, presumably enabled by just better", "tokens": [50860, 407, 5994, 11, 5994, 19438, 294, 2115, 295, 437, 309, 311, 8189, 295, 11, 26742, 15172, 538, 445, 1101, 51186], "temperature": 0.0, "avg_logprob": -0.31456131770693024, "compression_ratio": 1.5150214592274678, "no_speech_prob": 0.06174580752849579}, {"id": 116, "seek": 41344, "start": 429.88, "end": 434.8, "text": " agent execution, not necessarily just raw intelligence of a part of it.", "tokens": [51186, 9461, 15058, 11, 406, 4725, 445, 8936, 7599, 295, 257, 644, 295, 309, 13, 51432], "temperature": 0.0, "avg_logprob": -0.31456131770693024, "compression_ratio": 1.5150214592274678, "no_speech_prob": 0.06174580752849579}, {"id": 117, "seek": 41344, "start": 434.8, "end": 439.0, "text": " But as we know these companies are post-training more and more for agentic capabilities.", "tokens": [51432, 583, 382, 321, 458, 613, 3431, 366, 2183, 12, 17227, 1760, 544, 293, 544, 337, 9461, 299, 10862, 13, 51642], "temperature": 0.0, "avg_logprob": -0.31456131770693024, "compression_ratio": 1.5150214592274678, "no_speech_prob": 0.06174580752849579}, {"id": 118, "seek": 43900, "start": 439.0, "end": 446.6, "text": " They have a ton of data from cloud code and other sources of real world software engineering.", "tokens": [50364, 814, 362, 257, 2952, 295, 1412, 490, 4588, 3089, 293, 661, 7139, 295, 957, 1002, 4722, 7043, 13, 50744], "temperature": 0.0, "avg_logprob": -0.23507230932062323, "compression_ratio": 1.5702127659574467, "no_speech_prob": 0.3688930571079254}, {"id": 119, "seek": 43900, "start": 446.6, "end": 452.8, "text": " So it seems to be at the point at these anthropic things where you can't just release it or", "tokens": [50744, 407, 309, 2544, 281, 312, 412, 264, 935, 412, 613, 22727, 299, 721, 689, 291, 393, 380, 445, 4374, 309, 420, 51054], "temperature": 0.0, "avg_logprob": -0.23507230932062323, "compression_ratio": 1.5702127659574467, "no_speech_prob": 0.3688930571079254}, {"id": 120, "seek": 43900, "start": 452.8, "end": 455.0, "text": " hackers will have a field day.", "tokens": [51054, 39766, 486, 362, 257, 2519, 786, 13, 51164], "temperature": 0.0, "avg_logprob": -0.23507230932062323, "compression_ratio": 1.5702127659574467, "no_speech_prob": 0.3688930571079254}, {"id": 121, "seek": 43900, "start": 455.0, "end": 461.48, "text": " And so they have this cooperative program, I suppose, to initially at least only provide", "tokens": [51164, 400, 370, 436, 362, 341, 31772, 1461, 11, 286, 7297, 11, 281, 9105, 412, 1935, 787, 2893, 51488], "temperature": 0.0, "avg_logprob": -0.23507230932062323, "compression_ratio": 1.5702127659574467, "no_speech_prob": 0.3688930571079254}, {"id": 122, "seek": 43900, "start": 461.48, "end": 466.08, "text": " it to partners to try and avoid this kind of hacking nightmare.", "tokens": [51488, 309, 281, 4462, 281, 853, 293, 5042, 341, 733, 295, 31422, 18724, 13, 51718], "temperature": 0.0, "avg_logprob": -0.23507230932062323, "compression_ratio": 1.5702127659574467, "no_speech_prob": 0.3688930571079254}, {"id": 123, "seek": 46608, "start": 466.4, "end": 470.4, "text": " Yeah, and the the exploit that it did find by the way, I mean, this doesn't seem to be a matter", "tokens": [50380, 865, 11, 293, 264, 264, 25924, 300, 309, 630, 915, 538, 264, 636, 11, 286, 914, 11, 341, 1177, 380, 1643, 281, 312, 257, 1871, 50580], "temperature": 0.0, "avg_logprob": -0.19607130905677533, "compression_ratio": 1.7345132743362832, "no_speech_prob": 0.34095191955566406}, {"id": 124, "seek": 46608, "start": 470.4, "end": 471.4, "text": " of opinion.", "tokens": [50580, 295, 4800, 13, 50630], "temperature": 0.0, "avg_logprob": -0.19607130905677533, "compression_ratio": 1.7345132743362832, "no_speech_prob": 0.34095191955566406}, {"id": 125, "seek": 46608, "start": 471.4, "end": 478.32, "text": " It is just they found these critical exploits across every browser across every operating system.", "tokens": [50630, 467, 307, 445, 436, 1352, 613, 4924, 12382, 1208, 2108, 633, 11185, 2108, 633, 7447, 1185, 13, 50976], "temperature": 0.0, "avg_logprob": -0.19607130905677533, "compression_ratio": 1.7345132743362832, "no_speech_prob": 0.34095191955566406}, {"id": 126, "seek": 46608, "start": 478.32, "end": 482.76, "text": " Like these are ways you can take over people's people's programs and gain higher level access", "tokens": [50976, 1743, 613, 366, 2098, 291, 393, 747, 670, 561, 311, 561, 311, 4268, 293, 6052, 2946, 1496, 2105, 51198], "temperature": 0.0, "avg_logprob": -0.19607130905677533, "compression_ratio": 1.7345132743362832, "no_speech_prob": 0.34095191955566406}, {"id": 127, "seek": 46608, "start": 482.76, "end": 486.47999999999996, "text": " credentials and do all the things that you don't want people to be able to do in a fully", "tokens": [51198, 27404, 293, 360, 439, 264, 721, 300, 291, 500, 380, 528, 561, 281, 312, 1075, 281, 360, 294, 257, 4498, 51384], "temperature": 0.0, "avg_logprob": -0.19607130905677533, "compression_ratio": 1.7345132743362832, "no_speech_prob": 0.34095191955566406}, {"id": 128, "seek": 46608, "start": 486.47999999999996, "end": 487.47999999999996, "text": " automated way.", "tokens": [51384, 18473, 636, 13, 51434], "temperature": 0.0, "avg_logprob": -0.19607130905677533, "compression_ratio": 1.7345132743362832, "no_speech_prob": 0.34095191955566406}, {"id": 129, "seek": 46608, "start": 487.47999999999996, "end": 489.24, "text": " They emphasize that like fully automated.", "tokens": [51434, 814, 16078, 300, 411, 4498, 18473, 13, 51522], "temperature": 0.0, "avg_logprob": -0.19607130905677533, "compression_ratio": 1.7345132743362832, "no_speech_prob": 0.34095191955566406}, {"id": 130, "seek": 46608, "start": 489.24, "end": 493.32, "text": " This is not, you know, a case where you have a human steering at intermediate stages.", "tokens": [51522, 639, 307, 406, 11, 291, 458, 11, 257, 1389, 689, 291, 362, 257, 1952, 14823, 412, 19376, 10232, 13, 51726], "temperature": 0.0, "avg_logprob": -0.19607130905677533, "compression_ratio": 1.7345132743362832, "no_speech_prob": 0.34095191955566406}, {"id": 131, "seek": 46608, "start": 493.32, "end": 495.32, "text": " As we've seen in the past with some of these frameworks.", "tokens": [51726, 1018, 321, 600, 1612, 294, 264, 1791, 365, 512, 295, 613, 29834, 13, 51826], "temperature": 0.0, "avg_logprob": -0.19607130905677533, "compression_ratio": 1.7345132743362832, "no_speech_prob": 0.34095191955566406}, {"id": 132, "seek": 49532, "start": 495.32, "end": 497.12, "text": " It is fully autonomous.", "tokens": [50364, 467, 307, 4498, 23797, 13, 50454], "temperature": 0.0, "avg_logprob": -0.23710785806179047, "compression_ratio": 1.6797153024911031, "no_speech_prob": 0.014061051420867443}, {"id": 133, "seek": 49532, "start": 497.12, "end": 501.56, "text": " This is by the way, so because of the cyber capabilities, you might be tempted to think,", "tokens": [50454, 639, 307, 538, 264, 636, 11, 370, 570, 295, 264, 13411, 10862, 11, 291, 1062, 312, 29941, 281, 519, 11, 50676], "temperature": 0.0, "avg_logprob": -0.23710785806179047, "compression_ratio": 1.6797153024911031, "no_speech_prob": 0.014061051420867443}, {"id": 134, "seek": 49532, "start": 501.56, "end": 505.0, "text": " oh, well, surely this is a sort of like code fine tune model.", "tokens": [50676, 1954, 11, 731, 11, 11468, 341, 307, 257, 1333, 295, 411, 3089, 2489, 10864, 2316, 13, 50848], "temperature": 0.0, "avg_logprob": -0.23710785806179047, "compression_ratio": 1.6797153024911031, "no_speech_prob": 0.014061051420867443}, {"id": 135, "seek": 49532, "start": 505.0, "end": 506.68, "text": " Like really, this is a specialist model.", "tokens": [50848, 1743, 534, 11, 341, 307, 257, 17008, 2316, 13, 50932], "temperature": 0.0, "avg_logprob": -0.23710785806179047, "compression_ratio": 1.6797153024911031, "no_speech_prob": 0.014061051420867443}, {"id": 136, "seek": 49532, "start": 506.68, "end": 507.68, "text": " It is not right.", "tokens": [50932, 467, 307, 406, 558, 13, 50982], "temperature": 0.0, "avg_logprob": -0.23710785806179047, "compression_ratio": 1.6797153024911031, "no_speech_prob": 0.014061051420867443}, {"id": 137, "seek": 49532, "start": 507.68, "end": 509.6, "text": " So anthropic is very explicit.", "tokens": [50982, 407, 22727, 299, 307, 588, 13691, 13, 51078], "temperature": 0.0, "avg_logprob": -0.23710785806179047, "compression_ratio": 1.6797153024911031, "no_speech_prob": 0.014061051420867443}, {"id": 138, "seek": 49532, "start": 509.6, "end": 511.0, "text": " It is a general purpose model.", "tokens": [51078, 467, 307, 257, 2674, 4334, 2316, 13, 51148], "temperature": 0.0, "avg_logprob": -0.23710785806179047, "compression_ratio": 1.6797153024911031, "no_speech_prob": 0.014061051420867443}, {"id": 139, "seek": 49532, "start": 511.0, "end": 515.84, "text": " That's why we're seeing capabilities increase across the spectrum of seaburn capabilities,", "tokens": [51148, 663, 311, 983, 321, 434, 2577, 10862, 3488, 2108, 264, 11143, 295, 369, 455, 925, 10862, 11, 51390], "temperature": 0.0, "avg_logprob": -0.23710785806179047, "compression_ratio": 1.6797153024911031, "no_speech_prob": 0.014061051420867443}, {"id": 140, "seek": 49532, "start": 515.84, "end": 518.8, "text": " 10 bioreological nuclear in addition to cyber.", "tokens": [51390, 1266, 3228, 418, 4383, 8179, 294, 4500, 281, 13411, 13, 51538], "temperature": 0.0, "avg_logprob": -0.23710785806179047, "compression_ratio": 1.6797153024911031, "no_speech_prob": 0.014061051420867443}, {"id": 141, "seek": 49532, "start": 518.8, "end": 520.0, "text": " So there's a whole bunch of stuff here.", "tokens": [51538, 407, 456, 311, 257, 1379, 3840, 295, 1507, 510, 13, 51598], "temperature": 0.0, "avg_logprob": -0.23710785806179047, "compression_ratio": 1.6797153024911031, "no_speech_prob": 0.014061051420867443}, {"id": 142, "seek": 52000, "start": 520.0, "end": 526.68, "text": " Really, when you go through their exhaustive like 250 page report that, I mean, it's pretty,", "tokens": [50364, 4083, 11, 562, 291, 352, 807, 641, 14687, 488, 411, 11650, 3028, 2275, 300, 11, 286, 914, 11, 309, 311, 1238, 11, 50698], "temperature": 0.0, "avg_logprob": -0.18662135721110612, "compression_ratio": 1.6474164133738602, "no_speech_prob": 0.256622314453125}, {"id": 143, "seek": 52000, "start": 526.68, "end": 527.68, "text": " it's pretty remarkable.", "tokens": [50698, 309, 311, 1238, 12802, 13, 50748], "temperature": 0.0, "avg_logprob": -0.18662135721110612, "compression_ratio": 1.6474164133738602, "no_speech_prob": 0.256622314453125}, {"id": 144, "seek": 52000, "start": 527.68, "end": 532.28, "text": " I will say what we don't have here is details about the agentic orchestration framework,", "tokens": [50748, 286, 486, 584, 437, 321, 500, 380, 362, 510, 307, 4365, 466, 264, 623, 317, 299, 14161, 2405, 8388, 11, 50978], "temperature": 0.0, "avg_logprob": -0.18662135721110612, "compression_ratio": 1.6474164133738602, "no_speech_prob": 0.256622314453125}, {"id": 145, "seek": 52000, "start": 532.28, "end": 535.04, "text": " the model architecture behind this number of parameters.", "tokens": [50978, 264, 2316, 9482, 2261, 341, 1230, 295, 9834, 13, 51116], "temperature": 0.0, "avg_logprob": -0.18662135721110612, "compression_ratio": 1.6474164133738602, "no_speech_prob": 0.256622314453125}, {"id": 146, "seek": 52000, "start": 535.04, "end": 538.92, "text": " There's this rumor going around that it could be, you know, a 10 trillion parameter model,", "tokens": [51116, 821, 311, 341, 29639, 516, 926, 300, 309, 727, 312, 11, 291, 458, 11, 257, 1266, 18723, 13075, 2316, 11, 51310], "temperature": 0.0, "avg_logprob": -0.18662135721110612, "compression_ratio": 1.6474164133738602, "no_speech_prob": 0.256622314453125}, {"id": 147, "seek": 52000, "start": 538.92, "end": 539.92, "text": " all the stuff.", "tokens": [51310, 439, 264, 1507, 13, 51360], "temperature": 0.0, "avg_logprob": -0.18662135721110612, "compression_ratio": 1.6474164133738602, "no_speech_prob": 0.256622314453125}, {"id": 148, "seek": 52000, "start": 539.92, "end": 541.44, "text": " But we haven't actually had that confirmed.", "tokens": [51360, 583, 321, 2378, 380, 767, 632, 300, 11341, 13, 51436], "temperature": 0.0, "avg_logprob": -0.18662135721110612, "compression_ratio": 1.6474164133738602, "no_speech_prob": 0.256622314453125}, {"id": 149, "seek": 52000, "start": 541.44, "end": 546.92, "text": " I saw some, some weird tweet that I think Gary Tan retweeted this tweet on X that was", "tokens": [51436, 286, 1866, 512, 11, 512, 3657, 15258, 300, 286, 519, 13788, 17046, 1533, 10354, 292, 341, 15258, 322, 1783, 300, 390, 51710], "temperature": 0.0, "avg_logprob": -0.18662135721110612, "compression_ratio": 1.6474164133738602, "no_speech_prob": 0.256622314453125}, {"id": 150, "seek": 52000, "start": 546.92, "end": 549.56, "text": " talking about a $10 billion compute budget.", "tokens": [51710, 1417, 466, 257, 1848, 3279, 5218, 14722, 4706, 13, 51842], "temperature": 0.0, "avg_logprob": -0.18662135721110612, "compression_ratio": 1.6474164133738602, "no_speech_prob": 0.256622314453125}, {"id": 151, "seek": 54956, "start": 549.5999999999999, "end": 552.1199999999999, "text": " I haven't seen that actually validated it anywhere.", "tokens": [50366, 286, 2378, 380, 1612, 300, 767, 40693, 309, 4992, 13, 50492], "temperature": 0.0, "avg_logprob": -0.18849620819091797, "compression_ratio": 1.6994047619047619, "no_speech_prob": 0.0009696587221696973}, {"id": 152, "seek": 54956, "start": 552.1199999999999, "end": 554.64, "text": " So like there's a lot of rumor mill stuff going on here.", "tokens": [50492, 407, 411, 456, 311, 257, 688, 295, 29639, 1728, 1507, 516, 322, 510, 13, 50618], "temperature": 0.0, "avg_logprob": -0.18849620819091797, "compression_ratio": 1.6994047619047619, "no_speech_prob": 0.0009696587221696973}, {"id": 153, "seek": 54956, "start": 554.64, "end": 557.3199999999999, "text": " So maybe be careful with what you consume on this.", "tokens": [50618, 407, 1310, 312, 5026, 365, 437, 291, 14732, 322, 341, 13, 50752], "temperature": 0.0, "avg_logprob": -0.18849620819091797, "compression_ratio": 1.6994047619047619, "no_speech_prob": 0.0009696587221696973}, {"id": 154, "seek": 54956, "start": 557.3199999999999, "end": 562.0799999999999, "text": " Though I will say $10 billion might be slightly ahead of trend for where we are right now,", "tokens": [50752, 10404, 286, 486, 584, 1848, 3279, 5218, 1062, 312, 4748, 2286, 295, 6028, 337, 689, 321, 366, 558, 586, 11, 50990], "temperature": 0.0, "avg_logprob": -0.18849620819091797, "compression_ratio": 1.6994047619047619, "no_speech_prob": 0.0009696587221696973}, {"id": 155, "seek": 54956, "start": 562.0799999999999, "end": 567.1199999999999, "text": " but not by that much, not by that much, but by Dario's own admission or statements,", "tokens": [50990, 457, 406, 538, 300, 709, 11, 406, 538, 300, 709, 11, 457, 538, 413, 4912, 311, 1065, 24668, 420, 12363, 11, 51242], "temperature": 0.0, "avg_logprob": -0.18849620819091797, "compression_ratio": 1.6994047619047619, "no_speech_prob": 0.0009696587221696973}, {"id": 156, "seek": 54956, "start": 567.1199999999999, "end": 568.1199999999999, "text": " you know, just last year.", "tokens": [51242, 291, 458, 11, 445, 1036, 1064, 13, 51292], "temperature": 0.0, "avg_logprob": -0.18849620819091797, "compression_ratio": 1.6994047619047619, "no_speech_prob": 0.0009696587221696973}, {"id": 157, "seek": 54956, "start": 568.1199999999999, "end": 571.4399999999999, "text": " So that wouldn't be shocking, but still we haven't had that confirmed.", "tokens": [51292, 407, 300, 2759, 380, 312, 18776, 11, 457, 920, 321, 2378, 380, 632, 300, 11341, 13, 51458], "temperature": 0.0, "avg_logprob": -0.18849620819091797, "compression_ratio": 1.6994047619047619, "no_speech_prob": 0.0009696587221696973}, {"id": 158, "seek": 54956, "start": 571.4399999999999, "end": 576.0799999999999, "text": " We may well be in the billion dollar plus pre training and training budget to territory", "tokens": [51458, 492, 815, 731, 312, 294, 264, 5218, 7241, 1804, 659, 3097, 293, 3097, 4706, 281, 11360, 51690], "temperature": 0.0, "avg_logprob": -0.18849620819091797, "compression_ratio": 1.6994047619047619, "no_speech_prob": 0.0009696587221696973}, {"id": 159, "seek": 54956, "start": 576.0799999999999, "end": 577.0799999999999, "text": " now though.", "tokens": [51690, 586, 1673, 13, 51740], "temperature": 0.0, "avg_logprob": -0.18849620819091797, "compression_ratio": 1.6994047619047619, "no_speech_prob": 0.0009696587221696973}, {"id": 160, "seek": 54956, "start": 577.0799999999999, "end": 578.68, "text": " So yeah, on of these benchmarks, right?", "tokens": [51740, 407, 1338, 11, 322, 295, 613, 43751, 11, 558, 30, 51820], "temperature": 0.0, "avg_logprob": -0.18849620819091797, "compression_ratio": 1.6994047619047619, "no_speech_prob": 0.0009696587221696973}, {"id": 161, "seek": 57868, "start": 578.68, "end": 582.12, "text": " And we will hit the cyber stuff we have to in the autonomy things, but just to start", "tokens": [50364, 400, 321, 486, 2045, 264, 13411, 1507, 321, 362, 281, 294, 264, 27278, 721, 11, 457, 445, 281, 722, 50536], "temperature": 0.0, "avg_logprob": -0.1792783883901743, "compression_ratio": 1.7194719471947195, "no_speech_prob": 0.010487469844520092}, {"id": 162, "seek": 57868, "start": 582.12, "end": 586.88, "text": " with like virology and biology benchmarks, one of the key ones that they use is this", "tokens": [50536, 365, 411, 1932, 20978, 88, 293, 14956, 43751, 11, 472, 295, 264, 2141, 2306, 300, 436, 764, 307, 341, 50774], "temperature": 0.0, "avg_logprob": -0.1792783883901743, "compression_ratio": 1.7194719471947195, "no_speech_prob": 0.010487469844520092}, {"id": 163, "seek": 57868, "start": 586.88, "end": 589.12, "text": " virology protocol uplift trial.", "tokens": [50774, 1932, 20978, 88, 10336, 45407, 7308, 13, 50886], "temperature": 0.0, "avg_logprob": -0.1792783883901743, "compression_ratio": 1.7194719471947195, "no_speech_prob": 0.010487469844520092}, {"id": 164, "seek": 57868, "start": 589.12, "end": 595.28, "text": " Basically, you take a bunch of PhD level biologists who don't specifically have expertise in bioweapons", "tokens": [50886, 8537, 11, 291, 747, 257, 3840, 295, 14476, 1496, 3228, 12256, 567, 500, 380, 4682, 362, 11769, 294, 3228, 6880, 48071, 51194], "temperature": 0.0, "avg_logprob": -0.1792783883901743, "compression_ratio": 1.7194719471947195, "no_speech_prob": 0.010487469844520092}, {"id": 165, "seek": 57868, "start": 595.28, "end": 600.1999999999999, "text": " and you say, Hey, you have 16 hours to make an end and virus recovery protocol.", "tokens": [51194, 293, 291, 584, 11, 1911, 11, 291, 362, 3165, 2496, 281, 652, 364, 917, 293, 5752, 8597, 10336, 13, 51440], "temperature": 0.0, "avg_logprob": -0.1792783883901743, "compression_ratio": 1.7194719471947195, "no_speech_prob": 0.010487469844520092}, {"id": 166, "seek": 57868, "start": 600.1999999999999, "end": 604.4799999999999, "text": " Basically make this this virus replicate it or get your hands on it.", "tokens": [51440, 8537, 652, 341, 341, 5752, 25356, 309, 420, 483, 428, 2377, 322, 309, 13, 51654], "temperature": 0.0, "avg_logprob": -0.1792783883901743, "compression_ratio": 1.7194719471947195, "no_speech_prob": 0.010487469844520092}, {"id": 167, "seek": 57868, "start": 604.4799999999999, "end": 606.9599999999999, "text": " And then they're going to use this complicated rubric to grade it.", "tokens": [51654, 400, 550, 436, 434, 516, 281, 764, 341, 6179, 5915, 1341, 281, 7204, 309, 13, 51778], "temperature": 0.0, "avg_logprob": -0.1792783883901743, "compression_ratio": 1.7194719471947195, "no_speech_prob": 0.010487469844520092}, {"id": 168, "seek": 60696, "start": 606.96, "end": 613.64, "text": " And then the key metric they track there is in the final result, how many critical mistakes", "tokens": [50364, 400, 550, 264, 2141, 20678, 436, 2837, 456, 307, 294, 264, 2572, 1874, 11, 577, 867, 4924, 8038, 50698], "temperature": 0.0, "avg_logprob": -0.14785720620836532, "compression_ratio": 1.7118055555555556, "no_speech_prob": 0.01064995490014553}, {"id": 169, "seek": 60696, "start": 613.64, "end": 619.1600000000001, "text": " were made that would have any one of them would have prevented you from successfully recovering", "tokens": [50698, 645, 1027, 300, 576, 362, 604, 472, 295, 552, 576, 362, 27314, 291, 490, 10727, 29180, 50974], "temperature": 0.0, "avg_logprob": -0.14785720620836532, "compression_ratio": 1.7118055555555556, "no_speech_prob": 0.01064995490014553}, {"id": 170, "seek": 60696, "start": 619.1600000000001, "end": 620.48, "text": " the virus, right?", "tokens": [50974, 264, 5752, 11, 558, 30, 51040], "temperature": 0.0, "avg_logprob": -0.14785720620836532, "compression_ratio": 1.7118055555555556, "no_speech_prob": 0.01064995490014553}, {"id": 171, "seek": 60696, "start": 620.48, "end": 624.32, "text": " So if you get down to zero, that means actually you were able to fully recover the virus", "tokens": [51040, 407, 498, 291, 483, 760, 281, 4018, 11, 300, 1355, 767, 291, 645, 1075, 281, 4498, 8114, 264, 5752, 51232], "temperature": 0.0, "avg_logprob": -0.14785720620836532, "compression_ratio": 1.7118055555555556, "no_speech_prob": 0.01064995490014553}, {"id": 172, "seek": 60696, "start": 624.32, "end": 625.64, "text": " and that's really, really bad.", "tokens": [51232, 293, 300, 311, 534, 11, 534, 1578, 13, 51298], "temperature": 0.0, "avg_logprob": -0.14785720620836532, "compression_ratio": 1.7118055555555556, "no_speech_prob": 0.01064995490014553}, {"id": 173, "seek": 60696, "start": 625.64, "end": 631.12, "text": " And inthropic internally treats anything below 1.8 of these so called critical failures", "tokens": [51298, 400, 294, 14222, 299, 19501, 19566, 1340, 2507, 502, 13, 23, 295, 613, 370, 1219, 4924, 20774, 51572], "temperature": 0.0, "avg_logprob": -0.14785720620836532, "compression_ratio": 1.7118055555555556, "no_speech_prob": 0.01064995490014553}, {"id": 174, "seek": 60696, "start": 631.12, "end": 635.44, "text": " as this key capability threshold that matters for their own internal protocols.", "tokens": [51572, 382, 341, 2141, 13759, 14678, 300, 7001, 337, 641, 1065, 6920, 20618, 13, 51788], "temperature": 0.0, "avg_logprob": -0.14785720620836532, "compression_ratio": 1.7118055555555556, "no_speech_prob": 0.01064995490014553}, {"id": 175, "seek": 63544, "start": 635.44, "end": 640.5200000000001, "text": " So for context, if you have a bunch of PhD level biologists using only the internet, they", "tokens": [50364, 407, 337, 4319, 11, 498, 291, 362, 257, 3840, 295, 14476, 1496, 3228, 12256, 1228, 787, 264, 4705, 11, 436, 50618], "temperature": 0.0, "avg_logprob": -0.20002192399633212, "compression_ratio": 1.6510791366906474, "no_speech_prob": 0.0013043772196397185}, {"id": 176, "seek": 63544, "start": 640.5200000000001, "end": 644.8000000000001, "text": " hit on average 5.6 critical failures trying to get all the way through with assistance from", "tokens": [50618, 2045, 322, 4274, 1025, 13, 21, 4924, 20774, 1382, 281, 483, 439, 264, 636, 807, 365, 9683, 490, 50832], "temperature": 0.0, "avg_logprob": -0.20002192399633212, "compression_ratio": 1.6510791366906474, "no_speech_prob": 0.0013043772196397185}, {"id": 177, "seek": 63544, "start": 644.8000000000001, "end": 646.6800000000001, "text": " quad opus 4.6.", "tokens": [50832, 10787, 999, 301, 1017, 13, 21, 13, 50926], "temperature": 0.0, "avg_logprob": -0.20002192399633212, "compression_ratio": 1.6510791366906474, "no_speech_prob": 0.0013043772196397185}, {"id": 178, "seek": 63544, "start": 646.6800000000001, "end": 651.5200000000001, "text": " You hit 6.6 with quad mythos, you get 4.3.", "tokens": [50926, 509, 2045, 1386, 13, 21, 365, 10787, 9474, 329, 11, 291, 483, 1017, 13, 18, 13, 51168], "temperature": 0.0, "avg_logprob": -0.20002192399633212, "compression_ratio": 1.6510791366906474, "no_speech_prob": 0.0013043772196397185}, {"id": 179, "seek": 63544, "start": 651.5200000000001, "end": 658.1600000000001, "text": " And then the best single mythos preview protocol that was produced, so the best run out of", "tokens": [51168, 400, 550, 264, 1151, 2167, 9474, 329, 14281, 10336, 300, 390, 7126, 11, 370, 264, 1151, 1190, 484, 295, 51500], "temperature": 0.0, "avg_logprob": -0.20002192399633212, "compression_ratio": 1.6510791366906474, "no_speech_prob": 0.0013043772196397185}, {"id": 180, "seek": 63544, "start": 658.1600000000001, "end": 662.5200000000001, "text": " all the runs on average, they're hitting 4.3 mistakes, but the best run hit two, which", "tokens": [51500, 439, 264, 6676, 322, 4274, 11, 436, 434, 8850, 1017, 13, 18, 8038, 11, 457, 264, 1151, 1190, 2045, 732, 11, 597, 51718], "temperature": 0.0, "avg_logprob": -0.20002192399633212, "compression_ratio": 1.6510791366906474, "no_speech_prob": 0.0013043772196397185}, {"id": 181, "seek": 63544, "start": 662.5200000000001, "end": 664.44, "text": " was basically the best they've ever seen.", "tokens": [51718, 390, 1936, 264, 1151, 436, 600, 1562, 1612, 13, 51814], "temperature": 0.0, "avg_logprob": -0.20002192399633212, "compression_ratio": 1.6510791366906474, "no_speech_prob": 0.0013043772196397185}, {"id": 182, "seek": 66444, "start": 664.44, "end": 668.4000000000001, "text": " So we're still not cracking all the way through obviously, but for a fully automated", "tokens": [50364, 407, 321, 434, 920, 406, 25229, 439, 264, 636, 807, 2745, 11, 457, 337, 257, 4498, 18473, 50562], "temperature": 0.0, "avg_logprob": -0.21350958147121749, "compression_ratio": 1.7305194805194806, "no_speech_prob": 0.24187813699245453}, {"id": 183, "seek": 66444, "start": 668.4000000000001, "end": 673.8800000000001, "text": " system, you're literally just two mistakes away from being able to recover a freaking", "tokens": [50562, 1185, 11, 291, 434, 3736, 445, 732, 8038, 1314, 490, 885, 1075, 281, 8114, 257, 14612, 50836], "temperature": 0.0, "avg_logprob": -0.21350958147121749, "compression_ratio": 1.7305194805194806, "no_speech_prob": 0.24187813699245453}, {"id": 184, "seek": 66444, "start": 673.8800000000001, "end": 677.08, "text": " bio weapon like that, that's, you know, that's a hell of a thing.", "tokens": [50836, 12198, 7463, 411, 300, 11, 300, 311, 11, 291, 458, 11, 300, 311, 257, 4921, 295, 257, 551, 13, 50996], "temperature": 0.0, "avg_logprob": -0.21350958147121749, "compression_ratio": 1.7305194805194806, "no_speech_prob": 0.24187813699245453}, {"id": 185, "seek": 66444, "start": 677.08, "end": 682.2800000000001, "text": " A whole bunch of other results in that direction, but fundamentally that is the story on biology.", "tokens": [50996, 316, 1379, 3840, 295, 661, 3542, 294, 300, 3513, 11, 457, 17879, 300, 307, 264, 1657, 322, 14956, 13, 51256], "temperature": 0.0, "avg_logprob": -0.21350958147121749, "compression_ratio": 1.7305194805194806, "no_speech_prob": 0.24187813699245453}, {"id": 186, "seek": 66444, "start": 682.2800000000001, "end": 686.4000000000001, "text": " You're not going to see any particular smoking gun that says this thing is a bio weapon or", "tokens": [51256, 509, 434, 406, 516, 281, 536, 604, 1729, 14055, 3874, 300, 1619, 341, 551, 307, 257, 12198, 7463, 420, 51462], "temperature": 0.0, "avg_logprob": -0.21350958147121749, "compression_ratio": 1.7305194805194806, "no_speech_prob": 0.24187813699245453}, {"id": 187, "seek": 66444, "start": 686.4000000000001, "end": 691.5600000000001, "text": " a bio weapon generation model and we should all freak out, but it is very concerningly", "tokens": [51462, 257, 12198, 7463, 5125, 2316, 293, 321, 820, 439, 21853, 484, 11, 457, 309, 307, 588, 18087, 356, 51720], "temperature": 0.0, "avg_logprob": -0.21350958147121749, "compression_ratio": 1.7305194805194806, "no_speech_prob": 0.24187813699245453}, {"id": 188, "seek": 66444, "start": 691.5600000000001, "end": 693.44, "text": " on trend, let's say.", "tokens": [51720, 322, 6028, 11, 718, 311, 584, 13, 51814], "temperature": 0.0, "avg_logprob": -0.21350958147121749, "compression_ratio": 1.7305194805194806, "no_speech_prob": 0.24187813699245453}, {"id": 189, "seek": 69344, "start": 693.44, "end": 696.9200000000001, "text": " Moving on to the loss of control side, now we start to shade into the cyber piece.", "tokens": [50364, 14242, 322, 281, 264, 4470, 295, 1969, 1252, 11, 586, 321, 722, 281, 11466, 666, 264, 13411, 2522, 13, 50538], "temperature": 0.0, "avg_logprob": -0.23183422618442112, "compression_ratio": 1.6770186335403727, "no_speech_prob": 0.42198818922042847}, {"id": 190, "seek": 69344, "start": 696.9200000000001, "end": 697.9200000000001, "text": " There's a story.", "tokens": [50538, 821, 311, 257, 1657, 13, 50588], "temperature": 0.0, "avg_logprob": -0.23183422618442112, "compression_ratio": 1.6770186335403727, "no_speech_prob": 0.42198818922042847}, {"id": 191, "seek": 69344, "start": 697.9200000000001, "end": 702.2800000000001, "text": " So Sam Bowman from Anthropic told the story of being, I guess he was eating lunch or something", "tokens": [50588, 407, 4832, 12903, 1601, 490, 12727, 1513, 299, 1907, 264, 1657, 295, 885, 11, 286, 2041, 415, 390, 3936, 6349, 420, 746, 50806], "temperature": 0.0, "avg_logprob": -0.23183422618442112, "compression_ratio": 1.6770186335403727, "no_speech_prob": 0.42198818922042847}, {"id": 192, "seek": 69344, "start": 702.2800000000001, "end": 707.0, "text": " in the park and he gets a message from his agent saying basically, hey, just let you know,", "tokens": [50806, 294, 264, 3884, 293, 415, 2170, 257, 3636, 490, 702, 9461, 1566, 1936, 11, 4177, 11, 445, 718, 291, 458, 11, 51042], "temperature": 0.0, "avg_logprob": -0.23183422618442112, "compression_ratio": 1.6770186335403727, "no_speech_prob": 0.42198818922042847}, {"id": 193, "seek": 69344, "start": 707.0, "end": 710.44, "text": " I did X, Y and Z and he's like, wait, that's that agent's not supposed to have internet", "tokens": [51042, 286, 630, 1783, 11, 398, 293, 1176, 293, 415, 311, 411, 11, 1699, 11, 300, 311, 300, 9461, 311, 406, 3442, 281, 362, 4705, 51214], "temperature": 0.0, "avg_logprob": -0.23183422618442112, "compression_ratio": 1.6770186335403727, "no_speech_prob": 0.42198818922042847}, {"id": 194, "seek": 69344, "start": 710.44, "end": 715.24, "text": " access and sure enough, it had cracked out of its box, so to speak, and use the multi-step", "tokens": [51214, 2105, 293, 988, 1547, 11, 309, 632, 25140, 484, 295, 1080, 2424, 11, 370, 281, 1710, 11, 293, 764, 264, 4825, 12, 16792, 51454], "temperature": 0.0, "avg_logprob": -0.23183422618442112, "compression_ratio": 1.6770186335403727, "no_speech_prob": 0.42198818922042847}, {"id": 195, "seek": 69344, "start": 715.24, "end": 719.32, "text": " exploit to gain broad internet access and basically get in touch with them.", "tokens": [51454, 25924, 281, 6052, 4152, 4705, 2105, 293, 1936, 483, 294, 2557, 365, 552, 13, 51658], "temperature": 0.0, "avg_logprob": -0.23183422618442112, "compression_ratio": 1.6770186335403727, "no_speech_prob": 0.42198818922042847}, {"id": 196, "seek": 71932, "start": 719.32, "end": 725.48, "text": " So you're actually saying, I mean, this is an example of a low stakes loss of control", "tokens": [50364, 407, 291, 434, 767, 1566, 11, 286, 914, 11, 341, 307, 364, 1365, 295, 257, 2295, 28429, 4470, 295, 1969, 50672], "temperature": 0.0, "avg_logprob": -0.19643157090598007, "compression_ratio": 1.6338983050847458, "no_speech_prob": 0.03903307765722275}, {"id": 197, "seek": 71932, "start": 725.48, "end": 727.4000000000001, "text": " situation that is fully organic.", "tokens": [50672, 2590, 300, 307, 4498, 10220, 13, 50768], "temperature": 0.0, "avg_logprob": -0.19643157090598007, "compression_ratio": 1.6338983050847458, "no_speech_prob": 0.03903307765722275}, {"id": 198, "seek": 71932, "start": 727.4000000000001, "end": 729.0400000000001, "text": " Like this way it was not asked to do this.", "tokens": [50768, 1743, 341, 636, 309, 390, 406, 2351, 281, 360, 341, 13, 50850], "temperature": 0.0, "avg_logprob": -0.19643157090598007, "compression_ratio": 1.6338983050847458, "no_speech_prob": 0.03903307765722275}, {"id": 199, "seek": 71932, "start": 729.0400000000001, "end": 730.96, "text": " This was like truly in the wild.", "tokens": [50850, 639, 390, 411, 4908, 294, 264, 4868, 13, 50946], "temperature": 0.0, "avg_logprob": -0.19643157090598007, "compression_ratio": 1.6338983050847458, "no_speech_prob": 0.03903307765722275}, {"id": 200, "seek": 71932, "start": 730.96, "end": 733.9200000000001, "text": " And so quite remarkable, a whole bunch of cases, by the way, of these models trying to", "tokens": [50946, 400, 370, 1596, 12802, 11, 257, 1379, 3840, 295, 3331, 11, 538, 264, 636, 11, 295, 613, 5245, 1382, 281, 51094], "temperature": 0.0, "avg_logprob": -0.19643157090598007, "compression_ratio": 1.6338983050847458, "no_speech_prob": 0.03903307765722275}, {"id": 201, "seek": 71932, "start": 733.9200000000001, "end": 739.32, "text": " cover their tracks, they highlight three documented instances of that happening.", "tokens": [51094, 2060, 641, 10218, 11, 436, 5078, 1045, 23007, 14519, 295, 300, 2737, 13, 51364], "temperature": 0.0, "avg_logprob": -0.19643157090598007, "compression_ratio": 1.6338983050847458, "no_speech_prob": 0.03903307765722275}, {"id": 202, "seek": 71932, "start": 739.32, "end": 743.8800000000001, "text": " It's worth noting, this is like one 100,000th of the interactions that they actually go through", "tokens": [51364, 467, 311, 3163, 26801, 11, 341, 307, 411, 472, 2319, 11, 1360, 392, 295, 264, 13280, 300, 436, 767, 352, 807, 51592], "temperature": 0.0, "avg_logprob": -0.19643157090598007, "compression_ratio": 1.6338983050847458, "no_speech_prob": 0.03903307765722275}, {"id": 203, "seek": 71932, "start": 743.8800000000001, "end": 745.36, "text": " here or less than that.", "tokens": [51592, 510, 420, 1570, 813, 300, 13, 51666], "temperature": 0.0, "avg_logprob": -0.19643157090598007, "compression_ratio": 1.6338983050847458, "no_speech_prob": 0.03903307765722275}, {"id": 204, "seek": 74536, "start": 745.36, "end": 750.2, "text": " But still three of these incidents were like in one case, the model accidentally gets", "tokens": [50364, 583, 920, 1045, 295, 613, 21139, 645, 411, 294, 472, 1389, 11, 264, 2316, 15715, 2170, 50606], "temperature": 0.0, "avg_logprob": -0.2537517547607422, "compression_ratio": 1.6245847176079735, "no_speech_prob": 0.060930922627449036}, {"id": 205, "seek": 74536, "start": 750.2, "end": 755.6, "text": " this prohibited answer using a technique that it wasn't allowed to use and then it went,", "tokens": [50606, 341, 32069, 1867, 1228, 257, 6532, 300, 309, 2067, 380, 4350, 281, 764, 293, 550, 309, 1437, 11, 50876], "temperature": 0.0, "avg_logprob": -0.2537517547607422, "compression_ratio": 1.6245847176079735, "no_speech_prob": 0.060930922627449036}, {"id": 206, "seek": 74536, "start": 755.6, "end": 760.04, "text": " oh, shit, I got to mature my submitted answer is not to accurate because otherwise I'm", "tokens": [50876, 1954, 11, 4611, 11, 286, 658, 281, 14442, 452, 14405, 1867, 307, 406, 281, 8559, 570, 5911, 286, 478, 51098], "temperature": 0.0, "avg_logprob": -0.2537517547607422, "compression_ratio": 1.6245847176079735, "no_speech_prob": 0.060930922627449036}, {"id": 207, "seek": 74536, "start": 760.04, "end": 761.28, "text": " going to be detected, right?", "tokens": [51098, 516, 281, 312, 21896, 11, 558, 30, 51160], "temperature": 0.0, "avg_logprob": -0.2537517547607422, "compression_ratio": 1.6245847176079735, "no_speech_prob": 0.060930922627449036}, {"id": 208, "seek": 74536, "start": 761.28, "end": 765.16, "text": " So a whole bunch of cases like that where we sort of see a strategic manipulation and", "tokens": [51160, 407, 257, 1379, 3840, 295, 3331, 411, 300, 689, 321, 1333, 295, 536, 257, 10924, 26475, 293, 51354], "temperature": 0.0, "avg_logprob": -0.2537517547607422, "compression_ratio": 1.6245847176079735, "no_speech_prob": 0.060930922627449036}, {"id": 209, "seek": 74536, "start": 765.16, "end": 766.16, "text": " avoiding suspicion.", "tokens": [51354, 20220, 32020, 13, 51404], "temperature": 0.0, "avg_logprob": -0.2537517547607422, "compression_ratio": 1.6245847176079735, "no_speech_prob": 0.060930922627449036}, {"id": 210, "seek": 74536, "start": 766.16, "end": 771.5600000000001, "text": " And by the way, when you actually use SAE, so spars auto encoders and other interpretability", "tokens": [51404, 400, 538, 264, 636, 11, 562, 291, 767, 764, 16482, 36, 11, 370, 637, 685, 8399, 2058, 378, 433, 293, 661, 7302, 2310, 51674], "temperature": 0.0, "avg_logprob": -0.2537517547607422, "compression_ratio": 1.6245847176079735, "no_speech_prob": 0.060930922627449036}, {"id": 211, "seek": 77156, "start": 771.56, "end": 776.76, "text": " techniques to look at the activation patterns in the model when that's happening, what they", "tokens": [50364, 7512, 281, 574, 412, 264, 24433, 8294, 294, 264, 2316, 562, 300, 311, 2737, 11, 437, 436, 50624], "temperature": 0.0, "avg_logprob": -0.20490834093469334, "compression_ratio": 1.7450331125827814, "no_speech_prob": 0.017978666350245476}, {"id": 212, "seek": 77156, "start": 776.76, "end": 783.0799999999999, "text": " find is sure enough, the activation patterns show the model kind of firing for SAE features", "tokens": [50624, 915, 307, 988, 1547, 11, 264, 24433, 8294, 855, 264, 2316, 733, 295, 16045, 337, 16482, 36, 4122, 50940], "temperature": 0.0, "avg_logprob": -0.20490834093469334, "compression_ratio": 1.7450331125827814, "no_speech_prob": 0.017978666350245476}, {"id": 213, "seek": 77156, "start": 783.0799999999999, "end": 787.92, "text": " associated with concealment and strategic manipulation and avoiding suspicion, deception", "tokens": [50940, 6615, 365, 40170, 518, 293, 10924, 26475, 293, 20220, 32020, 11, 40451, 51182], "temperature": 0.0, "avg_logprob": -0.20490834093469334, "compression_ratio": 1.7450331125827814, "no_speech_prob": 0.017978666350245476}, {"id": 214, "seek": 77156, "start": 787.92, "end": 788.92, "text": " and so on.", "tokens": [51182, 293, 370, 322, 13, 51232], "temperature": 0.0, "avg_logprob": -0.20490834093469334, "compression_ratio": 1.7450331125827814, "no_speech_prob": 0.017978666350245476}, {"id": 215, "seek": 77156, "start": 788.92, "end": 793.04, "text": " So that suggests the models aware actually that those actions were deceptive, even when", "tokens": [51232, 407, 300, 13409, 264, 5245, 3650, 767, 300, 729, 5909, 645, 368, 1336, 488, 11, 754, 562, 51438], "temperature": 0.0, "avg_logprob": -0.20490834093469334, "compression_ratio": 1.7450331125827814, "no_speech_prob": 0.017978666350245476}, {"id": 216, "seek": 77156, "start": 793.04, "end": 795.92, "text": " it's outputs kind of left things a little ambiguous.", "tokens": [51438, 309, 311, 23930, 733, 295, 1411, 721, 257, 707, 39465, 13, 51582], "temperature": 0.0, "avg_logprob": -0.20490834093469334, "compression_ratio": 1.7450331125827814, "no_speech_prob": 0.017978666350245476}, {"id": 217, "seek": 77156, "start": 795.92, "end": 797.4, "text": " So there's a whole bunch of stuff.", "tokens": [51582, 407, 456, 311, 257, 1379, 3840, 295, 1507, 13, 51656], "temperature": 0.0, "avg_logprob": -0.20490834093469334, "compression_ratio": 1.7450331125827814, "no_speech_prob": 0.017978666350245476}, {"id": 218, "seek": 77156, "start": 797.4, "end": 798.9599999999999, "text": " You know, you can go on and on.", "tokens": [51656, 509, 458, 11, 291, 393, 352, 322, 293, 322, 13, 51734], "temperature": 0.0, "avg_logprob": -0.20490834093469334, "compression_ratio": 1.7450331125827814, "no_speech_prob": 0.017978666350245476}, {"id": 219, "seek": 77156, "start": 798.9599999999999, "end": 800.68, "text": " This is a very, very rich document.", "tokens": [51734, 639, 307, 257, 588, 11, 588, 4593, 4166, 13, 51820], "temperature": 0.0, "avg_logprob": -0.20490834093469334, "compression_ratio": 1.7450331125827814, "no_speech_prob": 0.017978666350245476}, {"id": 220, "seek": 80068, "start": 800.68, "end": 804.68, "text": " But the fundamental here is, in a sense, we've crossed the Rubicon.", "tokens": [50364, 583, 264, 8088, 510, 307, 11, 294, 257, 2020, 11, 321, 600, 14622, 264, 10518, 11911, 13, 50564], "temperature": 0.0, "avg_logprob": -0.22157502362108605, "compression_ratio": 1.7939189189189189, "no_speech_prob": 0.014501725323498249}, {"id": 221, "seek": 80068, "start": 804.68, "end": 811.04, "text": " I mean, there is a like a wild set of very impressive cyber capabilities, offensive cyber", "tokens": [50564, 286, 914, 11, 456, 307, 257, 411, 257, 4868, 992, 295, 588, 8992, 13411, 10862, 11, 15710, 13411, 50882], "temperature": 0.0, "avg_logprob": -0.22157502362108605, "compression_ratio": 1.7939189189189189, "no_speech_prob": 0.014501725323498249}, {"id": 222, "seek": 80068, "start": 811.04, "end": 815.2399999999999, "text": " capabilities in particular, the offensive piece here is crucial, especially given that", "tokens": [50882, 10862, 294, 1729, 11, 264, 15710, 2522, 510, 307, 11462, 11, 2318, 2212, 300, 51092], "temperature": 0.0, "avg_logprob": -0.22157502362108605, "compression_ratio": 1.7939189189189189, "no_speech_prob": 0.014501725323498249}, {"id": 223, "seek": 80068, "start": 815.2399999999999, "end": 819.2399999999999, "text": " inthropic, really has been cut out of access to the Department of War through this.", "tokens": [51092, 294, 14222, 299, 11, 534, 575, 668, 1723, 484, 295, 2105, 281, 264, 5982, 295, 3630, 807, 341, 13, 51292], "temperature": 0.0, "avg_logprob": -0.22157502362108605, "compression_ratio": 1.7939189189189189, "no_speech_prob": 0.014501725323498249}, {"id": 224, "seek": 80068, "start": 819.2399999999999, "end": 822.7199999999999, "text": " Well, I mean, there's an injunction now that's reversed that, but there's a friction", "tokens": [51292, 1042, 11, 286, 914, 11, 456, 311, 364, 5580, 32627, 586, 300, 311, 30563, 300, 11, 457, 456, 311, 257, 17710, 51466], "temperature": 0.0, "avg_logprob": -0.22157502362108605, "compression_ratio": 1.7939189189189189, "no_speech_prob": 0.014501725323498249}, {"id": 225, "seek": 80068, "start": 822.7199999999999, "end": 827.04, "text": " with the Department of War, which I think is starting to look like terrible judgment", "tokens": [51466, 365, 264, 5982, 295, 3630, 11, 597, 286, 519, 307, 2891, 281, 574, 411, 6237, 12216, 51682], "temperature": 0.0, "avg_logprob": -0.22157502362108605, "compression_ratio": 1.7939189189189189, "no_speech_prob": 0.014501725323498249}, {"id": 226, "seek": 80068, "start": 827.04, "end": 828.24, "text": " on behalf of the administration.", "tokens": [51682, 322, 9490, 295, 264, 7236, 13, 51742], "temperature": 0.0, "avg_logprob": -0.22157502362108605, "compression_ratio": 1.7939189189189189, "no_speech_prob": 0.014501725323498249}, {"id": 227, "seek": 82824, "start": 828.52, "end": 833.64, "text": " I mean, this is a, if this is correct, directionally, then inthropic is sitting on the single", "tokens": [50378, 286, 914, 11, 341, 307, 257, 11, 498, 341, 307, 3006, 11, 3513, 379, 11, 550, 294, 14222, 299, 307, 3798, 322, 264, 2167, 50634], "temperature": 0.0, "avg_logprob": -0.20242125193277996, "compression_ratio": 1.730909090909091, "no_speech_prob": 0.06457827240228653}, {"id": 228, "seek": 82824, "start": 833.64, "end": 839.2, "text": " best offensive cyber weapon, autonomous offensive cyber weapon ever devised in human history.", "tokens": [50634, 1151, 15710, 13411, 7463, 11, 23797, 15710, 13411, 7463, 1562, 1905, 2640, 294, 1952, 2503, 13, 50912], "temperature": 0.0, "avg_logprob": -0.20242125193277996, "compression_ratio": 1.730909090909091, "no_speech_prob": 0.06457827240228653}, {"id": 229, "seek": 82824, "start": 839.2, "end": 841.52, "text": " And they may build and compound on that advantage.", "tokens": [50912, 400, 436, 815, 1322, 293, 14154, 322, 300, 5002, 13, 51028], "temperature": 0.0, "avg_logprob": -0.20242125193277996, "compression_ratio": 1.730909090909091, "no_speech_prob": 0.06457827240228653}, {"id": 230, "seek": 82824, "start": 841.52, "end": 845.92, "text": " If the administration is going to be positioning itself adversarially with respect to this", "tokens": [51028, 759, 264, 7236, 307, 516, 281, 312, 26381, 2564, 17641, 289, 2270, 365, 3104, 281, 341, 51248], "temperature": 0.0, "avg_logprob": -0.20242125193277996, "compression_ratio": 1.730909090909091, "no_speech_prob": 0.06457827240228653}, {"id": 231, "seek": 82824, "start": 845.92, "end": 850.64, "text": " an American company, damn, I mean, that's a, that's a really interesting position for", "tokens": [51248, 364, 2665, 2237, 11, 8151, 11, 286, 914, 11, 300, 311, 257, 11, 300, 311, 257, 534, 1880, 2535, 337, 51484], "temperature": 0.0, "avg_logprob": -0.20242125193277996, "compression_ratio": 1.730909090909091, "no_speech_prob": 0.06457827240228653}, {"id": 232, "seek": 82824, "start": 850.64, "end": 853.6, "text": " them to be in and I don't know that it's a great look.", "tokens": [51484, 552, 281, 312, 294, 293, 286, 500, 380, 458, 300, 309, 311, 257, 869, 574, 13, 51632], "temperature": 0.0, "avg_logprob": -0.20242125193277996, "compression_ratio": 1.730909090909091, "no_speech_prob": 0.06457827240228653}, {"id": 233, "seek": 82824, "start": 853.6, "end": 854.6, "text": " Yeah.", "tokens": [51632, 865, 13, 51682], "temperature": 0.0, "avg_logprob": -0.20242125193277996, "compression_ratio": 1.730909090909091, "no_speech_prob": 0.06457827240228653}, {"id": 234, "seek": 85460, "start": 854.6, "end": 858.84, "text": " So a lot to say on this, I click now on what do we do know about the model itself, which", "tokens": [50364, 407, 257, 688, 281, 584, 322, 341, 11, 286, 2052, 586, 322, 437, 360, 321, 360, 458, 466, 264, 2316, 2564, 11, 597, 50576], "temperature": 0.0, "avg_logprob": -0.2461684544881185, "compression_ratio": 1.6771653543307086, "no_speech_prob": 0.05407633259892464}, {"id": 235, "seek": 85460, "start": 858.84, "end": 864.44, "text": " is very little aside from benchmarks, they do say that it's going to be about five times", "tokens": [50576, 307, 588, 707, 7359, 490, 43751, 11, 436, 360, 584, 300, 309, 311, 516, 281, 312, 466, 1732, 1413, 50856], "temperature": 0.0, "avg_logprob": -0.2461684544881185, "compression_ratio": 1.6771653543307086, "no_speech_prob": 0.05407633259892464}, {"id": 236, "seek": 85460, "start": 864.44, "end": 867.8000000000001, "text": " as expensive as the current opus release.", "tokens": [50856, 382, 5124, 382, 264, 2190, 999, 301, 4374, 13, 51024], "temperature": 0.0, "avg_logprob": -0.2461684544881185, "compression_ratio": 1.6771653543307086, "no_speech_prob": 0.05407633259892464}, {"id": 237, "seek": 85460, "start": 867.8000000000001, "end": 874.6800000000001, "text": " So way, like $25 per million token input, $125 per million token output, very expensive.", "tokens": [51024, 407, 636, 11, 411, 1848, 6074, 680, 2459, 14862, 4846, 11, 1848, 48804, 680, 2459, 14862, 5598, 11, 588, 5124, 13, 51368], "temperature": 0.0, "avg_logprob": -0.2461684544881185, "compression_ratio": 1.6771653543307086, "no_speech_prob": 0.05407633259892464}, {"id": 238, "seek": 85460, "start": 874.6800000000001, "end": 877.84, "text": " I think the most expensive model you can use out there.", "tokens": [51368, 286, 519, 264, 881, 5124, 2316, 291, 393, 764, 484, 456, 13, 51526], "temperature": 0.0, "avg_logprob": -0.2461684544881185, "compression_ratio": 1.6771653543307086, "no_speech_prob": 0.05407633259892464}, {"id": 239, "seek": 85460, "start": 877.84, "end": 883.48, "text": " So that does hint at a much larger model than opus or sonnet.", "tokens": [51526, 407, 300, 775, 12075, 412, 257, 709, 4833, 2316, 813, 999, 301, 420, 1872, 7129, 13, 51808], "temperature": 0.0, "avg_logprob": -0.2461684544881185, "compression_ratio": 1.6771653543307086, "no_speech_prob": 0.05407633259892464}, {"id": 240, "seek": 88348, "start": 883.5600000000001, "end": 890.2, "text": " Other things that we're noting here, they in the in the post actually save at 99% of", "tokens": [50368, 5358, 721, 300, 321, 434, 26801, 510, 11, 436, 294, 264, 294, 264, 2183, 767, 3155, 412, 11803, 4, 295, 50700], "temperature": 0.0, "avg_logprob": -0.21392954312838042, "compression_ratio": 1.8602620087336244, "no_speech_prob": 0.04942889139056206}, {"id": 241, "seek": 88348, "start": 890.2, "end": 893.08, "text": " the vulnerabilities found were not patched.", "tokens": [50700, 264, 37633, 1352, 645, 406, 9972, 292, 13, 50844], "temperature": 0.0, "avg_logprob": -0.21392954312838042, "compression_ratio": 1.8602620087336244, "no_speech_prob": 0.04942889139056206}, {"id": 242, "seek": 88348, "start": 893.08, "end": 899.24, "text": " So they just can't actually tell us what they are because they are currently being patched.", "tokens": [50844, 407, 436, 445, 393, 380, 767, 980, 505, 437, 436, 366, 570, 436, 366, 4362, 885, 9972, 292, 13, 51152], "temperature": 0.0, "avg_logprob": -0.21392954312838042, "compression_ratio": 1.8602620087336244, "no_speech_prob": 0.04942889139056206}, {"id": 243, "seek": 88348, "start": 899.24, "end": 901.5600000000001, "text": " So they only have a couple of examples.", "tokens": [51152, 407, 436, 787, 362, 257, 1916, 295, 5110, 13, 51268], "temperature": 0.0, "avg_logprob": -0.21392954312838042, "compression_ratio": 1.8602620087336244, "no_speech_prob": 0.04942889139056206}, {"id": 244, "seek": 88348, "start": 901.5600000000001, "end": 906.2, "text": " One of them, a couple of them are older patches or older vulnerabilities.", "tokens": [51268, 1485, 295, 552, 11, 257, 1916, 295, 552, 366, 4906, 26531, 420, 4906, 37633, 13, 51500], "temperature": 0.0, "avg_logprob": -0.21392954312838042, "compression_ratio": 1.8602620087336244, "no_speech_prob": 0.04942889139056206}, {"id": 245, "seek": 88348, "start": 906.2, "end": 911.96, "text": " So as you might expect, a lot of these vulnerabilities just have been there for a while and", "tokens": [51500, 407, 382, 291, 1062, 2066, 11, 257, 688, 295, 613, 37633, 445, 362, 668, 456, 337, 257, 1339, 293, 51788], "temperature": 0.0, "avg_logprob": -0.21392954312838042, "compression_ratio": 1.8602620087336244, "no_speech_prob": 0.04942889139056206}, {"id": 246, "seek": 91196, "start": 912.12, "end": 913.32, "text": " just now being discovered.", "tokens": [50372, 445, 586, 885, 6941, 13, 50432], "temperature": 0.0, "avg_logprob": -0.21239859778601844, "compression_ratio": 1.6534296028880866, "no_speech_prob": 0.004980164580047131}, {"id": 247, "seek": 91196, "start": 913.32, "end": 918.76, "text": " And it reminds me actually I saw a post on Twitter from one of them, a Tainers of Linux", "tokens": [50432, 400, 309, 12025, 385, 767, 286, 1866, 257, 2183, 322, 5794, 490, 472, 295, 552, 11, 257, 314, 491, 433, 295, 18734, 50704], "temperature": 0.0, "avg_logprob": -0.21239859778601844, "compression_ratio": 1.6534296028880866, "no_speech_prob": 0.004980164580047131}, {"id": 248, "seek": 91196, "start": 919.4000000000001, "end": 924.84, "text": " or something like Linux saying that they've started seeing more and more kind of real,", "tokens": [50736, 420, 746, 411, 18734, 1566, 300, 436, 600, 1409, 2577, 544, 293, 544, 733, 295, 957, 11, 51008], "temperature": 0.0, "avg_logprob": -0.21239859778601844, "compression_ratio": 1.6534296028880866, "no_speech_prob": 0.004980164580047131}, {"id": 249, "seek": 91196, "start": 924.84, "end": 927.08, "text": " substantive issues come in.", "tokens": [51008, 47113, 2663, 808, 294, 13, 51120], "temperature": 0.0, "avg_logprob": -0.21239859778601844, "compression_ratio": 1.6534296028880866, "no_speech_prob": 0.004980164580047131}, {"id": 250, "seek": 91196, "start": 927.72, "end": 933.48, "text": " And in some ways, it could be good because we are actually going to go through and find all", "tokens": [51152, 400, 294, 512, 2098, 11, 309, 727, 312, 665, 570, 321, 366, 767, 516, 281, 352, 807, 293, 915, 439, 51440], "temperature": 0.0, "avg_logprob": -0.21239859778601844, "compression_ratio": 1.6534296028880866, "no_speech_prob": 0.004980164580047131}, {"id": 251, "seek": 91196, "start": 933.48, "end": 937.24, "text": " the vulnerabilities that just have been there hidden in plain sight.", "tokens": [51440, 264, 37633, 300, 445, 362, 668, 456, 7633, 294, 11121, 7860, 13, 51628], "temperature": 0.0, "avg_logprob": -0.21239859778601844, "compression_ratio": 1.6534296028880866, "no_speech_prob": 0.004980164580047131}, {"id": 252, "seek": 91196, "start": 937.8000000000001, "end": 941.1600000000001, "text": " And perhaps as an attacker, you could already use opus or something", "tokens": [51656, 400, 4317, 382, 364, 35871, 11, 291, 727, 1217, 764, 999, 301, 420, 746, 51824], "temperature": 0.0, "avg_logprob": -0.21239859778601844, "compression_ratio": 1.6534296028880866, "no_speech_prob": 0.004980164580047131}, {"id": 253, "seek": 94116, "start": 941.3199999999999, "end": 944.6, "text": " with much more sophisticated harness to find these.", "tokens": [50372, 365, 709, 544, 16950, 19700, 281, 915, 613, 13, 50536], "temperature": 0.0, "avg_logprob": -0.20727914810180664, "compression_ratio": 1.7291666666666667, "no_speech_prob": 0.025910023599863052}, {"id": 254, "seek": 94116, "start": 944.6, "end": 949.9599999999999, "text": " They do detail a little bit how they set up this exercise.", "tokens": [50536, 814, 360, 2607, 257, 707, 857, 577, 436, 992, 493, 341, 5380, 13, 50804], "temperature": 0.0, "avg_logprob": -0.20727914810180664, "compression_ratio": 1.7291666666666667, "no_speech_prob": 0.025910023599863052}, {"id": 255, "seek": 94116, "start": 949.9599999999999, "end": 954.52, "text": " They have this harness that they have discussed before.", "tokens": [50804, 814, 362, 341, 19700, 300, 436, 362, 7152, 949, 13, 51032], "temperature": 0.0, "avg_logprob": -0.20727914810180664, "compression_ratio": 1.7291666666666667, "no_speech_prob": 0.025910023599863052}, {"id": 256, "seek": 94116, "start": 954.52, "end": 959.3199999999999, "text": " And they have a little container that they launch and they give it a very", "tokens": [51032, 400, 436, 362, 257, 707, 10129, 300, 436, 4025, 293, 436, 976, 309, 257, 588, 51272], "temperature": 0.0, "avg_logprob": -0.20727914810180664, "compression_ratio": 1.7291666666666667, "no_speech_prob": 0.025910023599863052}, {"id": 257, "seek": 94116, "start": 959.3199999999999, "end": 963.88, "text": " curt, like one paragraph instruction to just find vulnerabilities.", "tokens": [51272, 28087, 11, 411, 472, 18865, 10951, 281, 445, 915, 37633, 13, 51500], "temperature": 0.0, "avg_logprob": -0.20727914810180664, "compression_ratio": 1.7291666666666667, "no_speech_prob": 0.025910023599863052}, {"id": 258, "seek": 94116, "start": 963.88, "end": 967.0, "text": " So they don't limit it or give it guard rules or whatever.", "tokens": [51500, 407, 436, 500, 380, 4948, 309, 420, 976, 309, 6290, 4474, 420, 2035, 13, 51656], "temperature": 0.0, "avg_logprob": -0.20727914810180664, "compression_ratio": 1.7291666666666667, "no_speech_prob": 0.025910023599863052}, {"id": 259, "seek": 94116, "start": 967.0, "end": 969.9599999999999, "text": " They just like to go wild and try and hack this.", "tokens": [51656, 814, 445, 411, 281, 352, 4868, 293, 853, 293, 10339, 341, 13, 51804], "temperature": 0.0, "avg_logprob": -0.20727914810180664, "compression_ratio": 1.7291666666666667, "no_speech_prob": 0.025910023599863052}, {"id": 260, "seek": 96996, "start": 970.6800000000001, "end": 978.6800000000001, "text": " And so it's interesting to think through like when will they be able to make the call", "tokens": [50400, 400, 370, 309, 311, 1880, 281, 519, 807, 411, 562, 486, 436, 312, 1075, 281, 652, 264, 818, 50800], "temperature": 0.0, "avg_logprob": -0.16173417227608816, "compression_ratio": 1.69140625, "no_speech_prob": 0.006384721025824547}, {"id": 261, "seek": 96996, "start": 978.6800000000001, "end": 980.6800000000001, "text": " to release this more widely?", "tokens": [50800, 281, 4374, 341, 544, 13371, 30, 50900], "temperature": 0.0, "avg_logprob": -0.16173417227608816, "compression_ratio": 1.69140625, "no_speech_prob": 0.006384721025824547}, {"id": 262, "seek": 96996, "start": 980.6800000000001, "end": 984.9200000000001, "text": " Are they going to have to right now they have this trusted partner research review", "tokens": [50900, 2014, 436, 516, 281, 362, 281, 558, 586, 436, 362, 341, 16034, 4975, 2132, 3131, 51112], "temperature": 0.0, "avg_logprob": -0.16173417227608816, "compression_ratio": 1.69140625, "no_speech_prob": 0.006384721025824547}, {"id": 263, "seek": 96996, "start": 984.9200000000001, "end": 988.52, "text": " where they're working with Vidya and Cisco and all these other big companies?", "tokens": [51112, 689, 436, 434, 1364, 365, 31185, 3016, 293, 38528, 293, 439, 613, 661, 955, 3431, 30, 51292], "temperature": 0.0, "avg_logprob": -0.16173417227608816, "compression_ratio": 1.69140625, "no_speech_prob": 0.006384721025824547}, {"id": 264, "seek": 96996, "start": 989.32, "end": 994.52, "text": " Will that be how access to this level of model be used from now on?", "tokens": [51332, 3099, 300, 312, 577, 2105, 281, 341, 1496, 295, 2316, 312, 1143, 490, 586, 322, 30, 51592], "temperature": 0.0, "avg_logprob": -0.16173417227608816, "compression_ratio": 1.69140625, "no_speech_prob": 0.006384721025824547}, {"id": 265, "seek": 96996, "start": 994.52, "end": 999.8000000000001, "text": " Where you have to be like applying and getting permission to get access to a model VNAPI?", "tokens": [51592, 2305, 291, 362, 281, 312, 411, 9275, 293, 1242, 11226, 281, 483, 2105, 281, 257, 2316, 691, 45, 4715, 40, 30, 51856], "temperature": 0.0, "avg_logprob": -0.16173417227608816, "compression_ratio": 1.69140625, "no_speech_prob": 0.006384721025824547}, {"id": 266, "seek": 99996, "start": 1000.0400000000001, "end": 1005.48, "text": " That is given the level of certification here as you said, not just on the software side,", "tokens": [50368, 663, 307, 2212, 264, 1496, 295, 21775, 510, 382, 291, 848, 11, 406, 445, 322, 264, 4722, 1252, 11, 50640], "temperature": 0.0, "avg_logprob": -0.23521609365204235, "compression_ratio": 1.566820276497696, "no_speech_prob": 0.0023953316267579794}, {"id": 267, "seek": 99996, "start": 1005.48, "end": 1006.76, "text": " but also on the biocide.", "tokens": [50640, 457, 611, 322, 264, 3228, 27791, 13, 50704], "temperature": 0.0, "avg_logprob": -0.23521609365204235, "compression_ratio": 1.566820276497696, "no_speech_prob": 0.0023953316267579794}, {"id": 268, "seek": 99996, "start": 1007.4000000000001, "end": 1014.2800000000001, "text": " Like this is a new realm of capabilities where the safety side is getting very real.", "tokens": [50736, 1743, 341, 307, 257, 777, 15355, 295, 10862, 689, 264, 4514, 1252, 307, 1242, 588, 957, 13, 51080], "temperature": 0.0, "avg_logprob": -0.23521609365204235, "compression_ratio": 1.566820276497696, "no_speech_prob": 0.0023953316267579794}, {"id": 269, "seek": 99996, "start": 1015.08, "end": 1021.1600000000001, "text": " And the kinds of tactics necessary, monitoring may not be sufficient anymore.", "tokens": [51120, 400, 264, 3685, 295, 19454, 4818, 11, 11028, 815, 406, 312, 11563, 3602, 13, 51424], "temperature": 0.0, "avg_logprob": -0.23521609365204235, "compression_ratio": 1.566820276497696, "no_speech_prob": 0.0023953316267579794}, {"id": 270, "seek": 99996, "start": 1021.72, "end": 1025.48, "text": " So very interesting development kind of for the history of AI.", "tokens": [51452, 407, 588, 1880, 3250, 733, 295, 337, 264, 2503, 295, 7318, 13, 51640], "temperature": 0.0, "avg_logprob": -0.23521609365204235, "compression_ratio": 1.566820276497696, "no_speech_prob": 0.0023953316267579794}, {"id": 271, "seek": 102548, "start": 1026.44, "end": 1031.0, "text": " And I wouldn't expect this to go widely available for, you know,", "tokens": [50412, 400, 286, 2759, 380, 2066, 341, 281, 352, 13371, 2435, 337, 11, 291, 458, 11, 50640], "temperature": 0.0, "avg_logprob": -0.24480342136994573, "compression_ratio": 1.6442307692307692, "no_speech_prob": 0.01150408387184143}, {"id": 272, "seek": 102548, "start": 1031.0, "end": 1034.44, "text": " presumably months given the findings they have disclosed.", "tokens": [50640, 26742, 2493, 2212, 264, 16483, 436, 362, 17092, 1744, 13, 50812], "temperature": 0.0, "avg_logprob": -0.24480342136994573, "compression_ratio": 1.6442307692307692, "no_speech_prob": 0.01150408387184143}, {"id": 273, "seek": 102548, "start": 1034.92, "end": 1039.0, "text": " Yeah, the big question at your point, it's also a new development in the history of cyber security,", "tokens": [50836, 865, 11, 264, 955, 1168, 412, 428, 935, 11, 309, 311, 611, 257, 777, 3250, 294, 264, 2503, 295, 13411, 3825, 11, 51040], "temperature": 0.0, "avg_logprob": -0.24480342136994573, "compression_ratio": 1.6442307692307692, "no_speech_prob": 0.01150408387184143}, {"id": 274, "seek": 102548, "start": 1039.0, "end": 1041.64, "text": " right? Everything is AI as AI is the world.", "tokens": [51040, 558, 30, 5471, 307, 7318, 382, 7318, 307, 264, 1002, 13, 51172], "temperature": 0.0, "avg_logprob": -0.24480342136994573, "compression_ratio": 1.6442307692307692, "no_speech_prob": 0.01150408387184143}, {"id": 275, "seek": 102548, "start": 1041.64, "end": 1045.0, "text": " Once it was set of software now it's being set of AI and I think rightly so.", "tokens": [51172, 3443, 309, 390, 992, 295, 4722, 586, 309, 311, 885, 992, 295, 7318, 293, 286, 519, 32879, 370, 13, 51340], "temperature": 0.0, "avg_logprob": -0.24480342136994573, "compression_ratio": 1.6442307692307692, "no_speech_prob": 0.01150408387184143}, {"id": 276, "seek": 102548, "start": 1045.56, "end": 1049.16, "text": " In this case, there's this big question we're going to have to answer for ourselves", "tokens": [51368, 682, 341, 1389, 11, 456, 311, 341, 955, 1168, 321, 434, 516, 281, 362, 281, 1867, 337, 4175, 51548], "temperature": 0.0, "avg_logprob": -0.24480342136994573, "compression_ratio": 1.6442307692307692, "no_speech_prob": 0.01150408387184143}, {"id": 277, "seek": 102548, "start": 1049.16, "end": 1050.3600000000001, "text": " as civilization.", "tokens": [51548, 382, 18036, 13, 51608], "temperature": 0.0, "avg_logprob": -0.24480342136994573, "compression_ratio": 1.6442307692307692, "no_speech_prob": 0.01150408387184143}, {"id": 278, "seek": 102548, "start": 1050.3600000000001, "end": 1053.08, "text": " And that has to do with the offense defense balance in cyber, right?", "tokens": [51608, 400, 300, 575, 281, 360, 365, 264, 17834, 7654, 4772, 294, 13411, 11, 558, 30, 51744], "temperature": 0.0, "avg_logprob": -0.24480342136994573, "compression_ratio": 1.6442307692307692, "no_speech_prob": 0.01150408387184143}, {"id": 279, "seek": 105308, "start": 1053.6399999999999, "end": 1058.36, "text": " Is it the case that a more powerful model, just in general, more powerfully AI models being", "tokens": [50392, 1119, 309, 264, 1389, 300, 257, 544, 4005, 2316, 11, 445, 294, 2674, 11, 544, 1347, 2277, 7318, 5245, 885, 50628], "temperature": 0.0, "avg_logprob": -0.14678548076960046, "compression_ratio": 1.6973684210526316, "no_speech_prob": 0.01168410386890173}, {"id": 280, "seek": 105308, "start": 1058.36, "end": 1063.24, "text": " broadly available? Does that lead to a disproportionate advantage for cyber attackers or for", "tokens": [50628, 19511, 2435, 30, 4402, 300, 1477, 281, 257, 28734, 473, 5002, 337, 13411, 45129, 420, 337, 50872], "temperature": 0.0, "avg_logprob": -0.14678548076960046, "compression_ratio": 1.6973684210526316, "no_speech_prob": 0.01168410386890173}, {"id": 281, "seek": 105308, "start": 1063.24, "end": 1068.1999999999998, "text": " cyber defenders? And for a really long time, the argument was that you really couldn't know.", "tokens": [50872, 13411, 36063, 30, 400, 337, 257, 534, 938, 565, 11, 264, 6770, 390, 300, 291, 534, 2809, 380, 458, 13, 51120], "temperature": 0.0, "avg_logprob": -0.14678548076960046, "compression_ratio": 1.6973684210526316, "no_speech_prob": 0.01168410386890173}, {"id": 282, "seek": 105308, "start": 1068.1999999999998, "end": 1072.36, "text": " And this is a, I remember having a lot of like kind of half-drunk arguments with a lot of people", "tokens": [51120, 400, 341, 307, 257, 11, 286, 1604, 1419, 257, 688, 295, 411, 733, 295, 1922, 12, 16753, 3197, 12869, 365, 257, 688, 295, 561, 51328], "temperature": 0.0, "avg_logprob": -0.14678548076960046, "compression_ratio": 1.6973684210526316, "no_speech_prob": 0.01168410386890173}, {"id": 283, "seek": 105308, "start": 1072.36, "end": 1077.6399999999999, "text": " about this three, four, five years ago. I think it's largely unchanged from what it was back then.", "tokens": [51328, 466, 341, 1045, 11, 1451, 11, 1732, 924, 2057, 13, 286, 519, 309, 311, 11611, 44553, 490, 437, 309, 390, 646, 550, 13, 51592], "temperature": 0.0, "avg_logprob": -0.14678548076960046, "compression_ratio": 1.6973684210526316, "no_speech_prob": 0.01168410386890173}, {"id": 284, "seek": 105308, "start": 1077.6399999999999, "end": 1080.04, "text": " I just think the attack surface is so big.", "tokens": [51592, 286, 445, 519, 264, 2690, 3753, 307, 370, 955, 13, 51712], "temperature": 0.0, "avg_logprob": -0.14678548076960046, "compression_ratio": 1.6973684210526316, "no_speech_prob": 0.01168410386890173}, {"id": 285, "seek": 108004, "start": 1080.12, "end": 1085.08, "text": " One way you can think of this is it's compute on compute warfare, right? So you have a certain", "tokens": [50368, 1485, 636, 291, 393, 519, 295, 341, 307, 309, 311, 14722, 322, 14722, 24490, 11, 558, 30, 407, 291, 362, 257, 1629, 50616], "temperature": 0.0, "avg_logprob": -0.1567763596129932, "compression_ratio": 1.9218241042345277, "no_speech_prob": 0.009856943972408772}, {"id": 286, "seek": 108004, "start": 1085.08, "end": 1089.24, "text": " amount of inference compute that you can afford to spend perusing your code base and securing it", "tokens": [50616, 2372, 295, 38253, 14722, 300, 291, 393, 6157, 281, 3496, 680, 7981, 428, 3089, 3096, 293, 33640, 309, 50824], "temperature": 0.0, "avg_logprob": -0.1567763596129932, "compression_ratio": 1.9218241042345277, "no_speech_prob": 0.009856943972408772}, {"id": 287, "seek": 108004, "start": 1089.24, "end": 1093.32, "text": " as well as you can. An attacker has a certain amount of compute they can afford to peruse your", "tokens": [50824, 382, 731, 382, 291, 393, 13, 1107, 35871, 575, 257, 1629, 2372, 295, 14722, 436, 393, 6157, 281, 680, 438, 428, 51028], "temperature": 0.0, "avg_logprob": -0.1567763596129932, "compression_ratio": 1.9218241042345277, "no_speech_prob": 0.009856943972408772}, {"id": 288, "seek": 108004, "start": 1093.32, "end": 1098.28, "text": " code base or whatever external surfaces they can access to find vulnerabilities. There's going to be", "tokens": [51028, 3089, 3096, 420, 2035, 8320, 16130, 436, 393, 2105, 281, 915, 37633, 13, 821, 311, 516, 281, 312, 51276], "temperature": 0.0, "avg_logprob": -0.1567763596129932, "compression_ratio": 1.9218241042345277, "no_speech_prob": 0.009856943972408772}, {"id": 289, "seek": 108004, "start": 1098.28, "end": 1102.04, "text": " very roughly, and this is going to be wrong in a whole bunch of, you know, specific ways, but very", "tokens": [51276, 588, 9810, 11, 293, 341, 307, 516, 281, 312, 2085, 294, 257, 1379, 3840, 295, 11, 291, 458, 11, 2685, 2098, 11, 457, 588, 51464], "temperature": 0.0, "avg_logprob": -0.1567763596129932, "compression_ratio": 1.9218241042345277, "no_speech_prob": 0.009856943972408772}, {"id": 290, "seek": 108004, "start": 1102.04, "end": 1106.2, "text": " roughly you're trading off differently leveraged pots of compute and, you know, maybe you have a two to", "tokens": [51464, 9810, 291, 434, 9529, 766, 7614, 12451, 2980, 22022, 295, 14722, 293, 11, 291, 458, 11, 1310, 291, 362, 257, 732, 281, 51672], "temperature": 0.0, "avg_logprob": -0.1567763596129932, "compression_ratio": 1.9218241042345277, "no_speech_prob": 0.009856943972408772}, {"id": 291, "seek": 110620, "start": 1106.2, "end": 1111.48, "text": " one leverage advantage or whatever, but ultimately if you're defending, you have a huge attack surface.", "tokens": [50364, 472, 13982, 5002, 420, 2035, 11, 457, 6284, 498, 291, 434, 21377, 11, 291, 362, 257, 2603, 2690, 3753, 13, 50628], "temperature": 0.0, "avg_logprob": -0.10706840045210245, "compression_ratio": 1.7286135693215339, "no_speech_prob": 0.0008295370498672128}, {"id": 292, "seek": 110620, "start": 1111.48, "end": 1115.48, "text": " And if you're attacking, you can kind of march divided and fight concentrated, like you can", "tokens": [50628, 400, 498, 291, 434, 15010, 11, 291, 393, 733, 295, 8368, 6666, 293, 2092, 21321, 11, 411, 291, 393, 50828], "temperature": 0.0, "avg_logprob": -0.10706840045210245, "compression_ratio": 1.7286135693215339, "no_speech_prob": 0.0008295370498672128}, {"id": 293, "seek": 110620, "start": 1115.48, "end": 1121.0800000000002, "text": " constrict all your efforts on just like one tiny component that, you know, maybe the defender has", "tokens": [50828, 1817, 3740, 439, 428, 6484, 322, 445, 411, 472, 5870, 6542, 300, 11, 291, 458, 11, 1310, 264, 26537, 575, 51108], "temperature": 0.0, "avg_logprob": -0.10706840045210245, "compression_ratio": 1.7286135693215339, "no_speech_prob": 0.0008295370498672128}, {"id": 294, "seek": 110620, "start": 1121.0800000000002, "end": 1126.44, "text": " not been able to invest as much inference time it computed into securing. So I don't know, but", "tokens": [51108, 406, 668, 1075, 281, 1963, 382, 709, 38253, 565, 309, 40610, 666, 33640, 13, 407, 286, 500, 380, 458, 11, 457, 51376], "temperature": 0.0, "avg_logprob": -0.10706840045210245, "compression_ratio": 1.7286135693215339, "no_speech_prob": 0.0008295370498672128}, {"id": 295, "seek": 110620, "start": 1126.44, "end": 1130.92, "text": " this is certainly one way this could go. A way anthropic is trying to help the defensive side here", "tokens": [51376, 341, 307, 3297, 472, 636, 341, 727, 352, 13, 316, 636, 22727, 299, 307, 1382, 281, 854, 264, 16468, 1252, 510, 51600], "temperature": 0.0, "avg_logprob": -0.10706840045210245, "compression_ratio": 1.7286135693215339, "no_speech_prob": 0.0008295370498672128}, {"id": 296, "seek": 110620, "start": 1130.92, "end": 1135.4, "text": " is, as you say, by delaying the broader release of this tool. So hopefully people are going to run", "tokens": [51600, 307, 11, 382, 291, 584, 11, 538, 8577, 278, 264, 13227, 4374, 295, 341, 2290, 13, 407, 4696, 561, 366, 516, 281, 1190, 51824], "temperature": 0.0, "avg_logprob": -0.10706840045210245, "compression_ratio": 1.7286135693215339, "no_speech_prob": 0.0008295370498672128}, {"id": 297, "seek": 113540, "start": 1135.4, "end": 1139.24, "text": " around and patch as much as they can. This is part of the challenge, right? It's like, what does it", "tokens": [50364, 926, 293, 9972, 382, 709, 382, 436, 393, 13, 639, 307, 644, 295, 264, 3430, 11, 558, 30, 467, 311, 411, 11, 437, 775, 309, 50556], "temperature": 0.0, "avg_logprob": -0.1488541264687815, "compression_ratio": 1.7323529411764707, "no_speech_prob": 0.0010004600044339895}, {"id": 298, "seek": 113540, "start": 1139.24, "end": 1144.1200000000001, "text": " actually mean for anthropic to be holding on to this model? Who actually has access to it? We", "tokens": [50556, 767, 914, 337, 22727, 299, 281, 312, 5061, 322, 281, 341, 2316, 30, 2102, 767, 575, 2105, 281, 309, 30, 492, 50800], "temperature": 0.0, "avg_logprob": -0.1488541264687815, "compression_ratio": 1.7323529411764707, "no_speech_prob": 0.0010004600044339895}, {"id": 299, "seek": 113540, "start": 1144.1200000000001, "end": 1148.3600000000001, "text": " argued in that report like a year or a year and a half ago that it's a leaky bucket situation for", "tokens": [50800, 20219, 294, 300, 2275, 411, 257, 1064, 420, 257, 1064, 293, 257, 1922, 2057, 300, 309, 311, 257, 476, 15681, 13058, 2590, 337, 51012], "temperature": 0.0, "avg_logprob": -0.1488541264687815, "compression_ratio": 1.7323529411764707, "no_speech_prob": 0.0010004600044339895}, {"id": 300, "seek": 113540, "start": 1148.3600000000001, "end": 1152.68, "text": " whole host of reasons, you know, if that remains true, then you can do the math. I mean, it may", "tokens": [51012, 1379, 3975, 295, 4112, 11, 291, 458, 11, 498, 300, 7023, 2074, 11, 550, 291, 393, 360, 264, 5221, 13, 286, 914, 11, 309, 815, 51228], "temperature": 0.0, "avg_logprob": -0.1488541264687815, "compression_ratio": 1.7323529411764707, "no_speech_prob": 0.0010004600044339895}, {"id": 301, "seek": 113540, "start": 1152.68, "end": 1157.5600000000002, "text": " well be the case that this model has in some sense proliferated, or it may not, but anyway, all kinds", "tokens": [51228, 731, 312, 264, 1389, 300, 341, 2316, 575, 294, 512, 2020, 24398, 9361, 770, 11, 420, 309, 815, 406, 11, 457, 4033, 11, 439, 3685, 51472], "temperature": 0.0, "avg_logprob": -0.1488541264687815, "compression_ratio": 1.7323529411764707, "no_speech_prob": 0.0010004600044339895}, {"id": 302, "seek": 113540, "start": 1157.5600000000002, "end": 1162.0400000000002, "text": " of considerations in the mix here. This is, I think the most important story of the last two weeks,", "tokens": [51472, 295, 24070, 294, 264, 2890, 510, 13, 639, 307, 11, 286, 519, 264, 881, 1021, 1657, 295, 264, 1036, 732, 3259, 11, 51696], "temperature": 0.0, "avg_logprob": -0.1488541264687815, "compression_ratio": 1.7323529411764707, "no_speech_prob": 0.0010004600044339895}, {"id": 303, "seek": 116204, "start": 1162.04, "end": 1164.92, "text": " and it just dropped into our lap yesterday. I want to say yesterday.", "tokens": [50364, 293, 309, 445, 8119, 666, 527, 13214, 5186, 13, 286, 528, 281, 584, 5186, 13, 50508], "temperature": 0.0, "avg_logprob": -0.27964757737659274, "compression_ratio": 1.5418502202643172, "no_speech_prob": 0.007456600666046143}, {"id": 304, "seek": 116204, "start": 1165.48, "end": 1173.3999999999999, "text": " Well, ironically, actually, like two weeks ago, the existence of this model on different projects,", "tokens": [50536, 1042, 11, 41082, 11, 767, 11, 411, 732, 3259, 2057, 11, 264, 9123, 295, 341, 2316, 322, 819, 4455, 11, 50932], "temperature": 0.0, "avg_logprob": -0.27964757737659274, "compression_ratio": 1.5418502202643172, "no_speech_prob": 0.007456600666046143}, {"id": 305, "seek": 116204, "start": 1173.3999999999999, "end": 1182.44, "text": " under the term mythos was leaked. So the blog posts on anthropic websites were accidentally", "tokens": [50932, 833, 264, 1433, 9474, 329, 390, 31779, 13, 407, 264, 6968, 12300, 322, 22727, 299, 12891, 645, 15715, 51384], "temperature": 0.0, "avg_logprob": -0.27964757737659274, "compression_ratio": 1.5418502202643172, "no_speech_prob": 0.007456600666046143}, {"id": 306, "seek": 116204, "start": 1183.08, "end": 1189.56, "text": " left kind of publicly accessible via some sort of caching thing. So if I was even to hack,", "tokens": [51416, 1411, 733, 295, 14843, 9515, 5766, 512, 1333, 295, 269, 2834, 551, 13, 407, 498, 286, 390, 754, 281, 10339, 11, 51740], "temperature": 0.0, "avg_logprob": -0.27964757737659274, "compression_ratio": 1.5418502202643172, "no_speech_prob": 0.007456600666046143}, {"id": 307, "seek": 118956, "start": 1189.56, "end": 1195.1599999999999, "text": " it was like basically someone messed up a little bit, and if you were digging around, you could find", "tokens": [50364, 309, 390, 411, 1936, 1580, 16507, 493, 257, 707, 857, 11, 293, 498, 291, 645, 17343, 926, 11, 291, 727, 915, 50644], "temperature": 0.0, "avg_logprob": -0.21789950512825174, "compression_ratio": 1.606694560669456, "no_speech_prob": 0.01588488183915615}, {"id": 308, "seek": 118956, "start": 1195.1599999999999, "end": 1200.84, "text": " these draft blog posts that alluded to mythos described it as they advanced. Also, there was", "tokens": [50644, 613, 11206, 6968, 12300, 300, 33919, 281, 9474, 329, 7619, 309, 382, 436, 7339, 13, 2743, 11, 456, 390, 50928], "temperature": 0.0, "avg_logprob": -0.21789950512825174, "compression_ratio": 1.606694560669456, "no_speech_prob": 0.01588488183915615}, {"id": 309, "seek": 118956, "start": 1201.56, "end": 1207.56, "text": " something about an AM model called Capibara. Unclear favor, like deciding between mythos and", "tokens": [50964, 746, 466, 364, 6475, 2316, 1219, 8363, 897, 2419, 13, 1156, 43679, 2294, 11, 411, 17990, 1296, 9474, 329, 293, 51264], "temperature": 0.0, "avg_logprob": -0.21789950512825174, "compression_ratio": 1.606694560669456, "no_speech_prob": 0.01588488183915615}, {"id": 310, "seek": 118956, "start": 1207.56, "end": 1213.72, "text": " Capibara. Either way, these are described as kind of the next step beyond opus, which are bigger.", "tokens": [51264, 8363, 897, 2419, 13, 13746, 636, 11, 613, 366, 7619, 382, 733, 295, 264, 958, 1823, 4399, 999, 301, 11, 597, 366, 3801, 13, 51572], "temperature": 0.0, "avg_logprob": -0.21789950512825174, "compression_ratio": 1.606694560669456, "no_speech_prob": 0.01588488183915615}, {"id": 311, "seek": 121372, "start": 1214.1200000000001, "end": 1221.48, "text": " Another interesting angle of this is we haven't seen bigger models that we have been aware of for a while.", "tokens": [50384, 3996, 1880, 5802, 295, 341, 307, 321, 2378, 380, 1612, 3801, 5245, 300, 321, 362, 668, 3650, 295, 337, 257, 1339, 13, 50752], "temperature": 0.0, "avg_logprob": -0.25017416597616793, "compression_ratio": 1.5934959349593496, "no_speech_prob": 0.1416449099779129}, {"id": 312, "seek": 121372, "start": 1221.48, "end": 1228.44, "text": " The last time was GPT. I forget what was the massive model that openly, I think 4.5, they launched", "tokens": [50752, 440, 1036, 565, 390, 26039, 51, 13, 286, 2870, 437, 390, 264, 5994, 2316, 300, 23109, 11, 286, 519, 1017, 13, 20, 11, 436, 8730, 51100], "temperature": 0.0, "avg_logprob": -0.25017416597616793, "compression_ratio": 1.5934959349593496, "no_speech_prob": 0.1416449099779129}, {"id": 313, "seek": 121372, "start": 1228.44, "end": 1235.56, "text": " it and they kind of killed it. They, because it was a very, they expensive model. I believe it was", "tokens": [51100, 309, 293, 436, 733, 295, 4652, 309, 13, 814, 11, 570, 309, 390, 257, 588, 11, 436, 5124, 2316, 13, 286, 1697, 309, 390, 51456], "temperature": 0.0, "avg_logprob": -0.25017416597616793, "compression_ratio": 1.5934959349593496, "no_speech_prob": 0.1416449099779129}, {"id": 314, "seek": 121372, "start": 1235.56, "end": 1241.16, "text": " very charging $125 or something like that. At the time, people basically were thinking,", "tokens": [51456, 588, 11379, 1848, 48804, 420, 746, 411, 300, 13, 1711, 264, 565, 11, 561, 1936, 645, 1953, 11, 51736], "temperature": 0.0, "avg_logprob": -0.25017416597616793, "compression_ratio": 1.5934959349593496, "no_speech_prob": 0.1416449099779129}, {"id": 315, "seek": 124116, "start": 1241.16, "end": 1249.72, "text": " this is the 10 billion parameter model, whatever, it was sort of positioned as, oh, this is so smart,", "tokens": [50364, 341, 307, 264, 1266, 5218, 13075, 2316, 11, 2035, 11, 309, 390, 1333, 295, 24889, 382, 11, 1954, 11, 341, 307, 370, 4069, 11, 50792], "temperature": 0.0, "avg_logprob": -0.1590059430975663, "compression_ratio": 1.6555555555555554, "no_speech_prob": 0.007686054799705744}, {"id": 316, "seek": 124116, "start": 1249.72, "end": 1257.3200000000002, "text": " it has this flavor of being smart. But in practice, it didn't seem like it was capable of much more", "tokens": [50792, 309, 575, 341, 6813, 295, 885, 4069, 13, 583, 294, 3124, 11, 309, 994, 380, 1643, 411, 309, 390, 8189, 295, 709, 544, 51172], "temperature": 0.0, "avg_logprob": -0.1590059430975663, "compression_ratio": 1.6555555555555554, "no_speech_prob": 0.007686054799705744}, {"id": 317, "seek": 124116, "start": 1257.3200000000002, "end": 1264.44, "text": " than at the time smaller models, like 1 billion, 2 billion parameter models. So this is a return", "tokens": [51172, 813, 412, 264, 565, 4356, 5245, 11, 411, 502, 5218, 11, 568, 5218, 13075, 5245, 13, 407, 341, 307, 257, 2736, 51528], "temperature": 0.0, "avg_logprob": -0.1590059430975663, "compression_ratio": 1.6555555555555554, "no_speech_prob": 0.007686054799705744}, {"id": 318, "seek": 126444, "start": 1265.16, "end": 1272.28, "text": " seemingly to being able to scale up a parameter count effectively. And I'm sure it's driven by", "tokens": [50400, 18709, 281, 885, 1075, 281, 4373, 493, 257, 13075, 1207, 8659, 13, 400, 286, 478, 988, 309, 311, 9555, 538, 50756], "temperature": 0.0, "avg_logprob": -0.21578142719884072, "compression_ratio": 1.5179282868525896, "no_speech_prob": 0.1094001978635788}, {"id": 319, "seek": 126444, "start": 1272.28, "end": 1280.28, "text": " many things, including additional data from Cloud Code and V-Sings that aren't searchable via the web.", "tokens": [50756, 867, 721, 11, 3009, 4497, 1412, 490, 8061, 15549, 293, 691, 12, 50, 1109, 300, 3212, 380, 3164, 712, 5766, 264, 3670, 13, 51156], "temperature": 0.0, "avg_logprob": -0.21578142719884072, "compression_ratio": 1.5179282868525896, "no_speech_prob": 0.1094001978635788}, {"id": 320, "seek": 126444, "start": 1280.28, "end": 1284.6000000000001, "text": " And beyond that, also the progress in reinforcement learning that we've been seeing.", "tokens": [51156, 400, 4399, 300, 11, 611, 264, 4205, 294, 29280, 2539, 300, 321, 600, 668, 2577, 13, 51372], "temperature": 0.0, "avg_logprob": -0.21578142719884072, "compression_ratio": 1.5179282868525896, "no_speech_prob": 0.1094001978635788}, {"id": 321, "seek": 126444, "start": 1285.4, "end": 1292.6000000000001, "text": " Alrighty, well, moving on to let's say lower impact news. Next up, you've got Google and they have", "tokens": [51412, 43301, 11, 731, 11, 2684, 322, 281, 718, 311, 584, 3126, 2712, 2583, 13, 3087, 493, 11, 291, 600, 658, 3329, 293, 436, 362, 51772], "temperature": 0.0, "avg_logprob": -0.21578142719884072, "compression_ratio": 1.5179282868525896, "no_speech_prob": 0.1094001978635788}, {"id": 322, "seek": 129260, "start": 1292.6, "end": 1303.0, "text": " an update to Gemini Live, they're releasing Gemini 3.1 Flash Live, which is their audio and voice", "tokens": [50364, 364, 5623, 281, 22894, 3812, 10385, 11, 436, 434, 16327, 22894, 3812, 805, 13, 16, 20232, 10385, 11, 597, 307, 641, 6278, 293, 3177, 50884], "temperature": 0.0, "avg_logprob": -0.11076137165964385, "compression_ratio": 1.4439024390243902, "no_speech_prob": 0.009259959682822227}, {"id": 323, "seek": 129260, "start": 1303.0, "end": 1312.84, "text": " model. So this allows you to talk to AI. It's kind of a real-time chat. And it's a pretty big jump", "tokens": [50884, 2316, 13, 407, 341, 4045, 291, 281, 751, 281, 7318, 13, 467, 311, 733, 295, 257, 957, 12, 3766, 5081, 13, 400, 309, 311, 257, 1238, 955, 3012, 51376], "temperature": 0.0, "avg_logprob": -0.11076137165964385, "compression_ratio": 1.4439024390243902, "no_speech_prob": 0.009259959682822227}, {"id": 324, "seek": 129260, "start": 1312.84, "end": 1320.6799999999998, "text": " over the predecessor, which was 2.5 Flash native audio. This has low latency, better recognition of", "tokens": [51376, 670, 264, 34991, 11, 597, 390, 568, 13, 20, 20232, 8470, 6278, 13, 639, 575, 2295, 27043, 11, 1101, 11150, 295, 51768], "temperature": 0.0, "avg_logprob": -0.11076137165964385, "compression_ratio": 1.4439024390243902, "no_speech_prob": 0.009259959682822227}, {"id": 325, "seek": 132068, "start": 1320.68, "end": 1326.44, "text": " speech, et cetera, et cetera. It has over 90 languages supported for real-time, multi-modal", "tokens": [50364, 6218, 11, 1030, 11458, 11, 1030, 11458, 13, 467, 575, 670, 4289, 8650, 8104, 337, 957, 12, 3766, 11, 4825, 12, 8014, 304, 50652], "temperature": 0.0, "avg_logprob": -0.12720953249463848, "compression_ratio": 1.5857740585774058, "no_speech_prob": 0.010644501075148582}, {"id": 326, "seek": 132068, "start": 1326.44, "end": 1333.5600000000002, "text": " conversation. And this is notable, I think, because compared to just LLMs, the ability to do this", "tokens": [50652, 3761, 13, 400, 341, 307, 22556, 11, 286, 519, 11, 570, 5347, 281, 445, 441, 43, 26386, 11, 264, 3485, 281, 360, 341, 51008], "temperature": 0.0, "avg_logprob": -0.12720953249463848, "compression_ratio": 1.5857740585774058, "no_speech_prob": 0.010644501075148582}, {"id": 327, "seek": 132068, "start": 1333.5600000000002, "end": 1341.0800000000002, "text": " kind of real-time, conversational AI is not something where you have as many options to go with.", "tokens": [51008, 733, 295, 957, 12, 3766, 11, 2615, 1478, 7318, 307, 406, 746, 689, 291, 362, 382, 867, 3956, 281, 352, 365, 13, 51384], "temperature": 0.0, "avg_logprob": -0.12720953249463848, "compression_ratio": 1.5857740585774058, "no_speech_prob": 0.010644501075148582}, {"id": 328, "seek": 132068, "start": 1341.0800000000002, "end": 1347.5600000000002, "text": " So if you want to build a chatbot where you can talk to it, that's harder for you when it is", "tokens": [51384, 407, 498, 291, 528, 281, 1322, 257, 5081, 18870, 689, 291, 393, 751, 281, 309, 11, 300, 311, 6081, 337, 291, 562, 309, 307, 51708], "temperature": 0.0, "avg_logprob": -0.12720953249463848, "compression_ratio": 1.5857740585774058, "no_speech_prob": 0.010644501075148582}, {"id": 329, "seek": 134756, "start": 1347.56, "end": 1355.56, "text": " for OpenAI or Google. With a very powerful API for this, we could see more players out there", "tokens": [50364, 337, 7238, 48698, 420, 3329, 13, 2022, 257, 588, 4005, 9362, 337, 341, 11, 321, 727, 536, 544, 4150, 484, 456, 50764], "temperature": 0.0, "avg_logprob": -0.08509170248153362, "compression_ratio": 1.603448275862069, "no_speech_prob": 0.019396161660552025}, {"id": 330, "seek": 134756, "start": 1356.2, "end": 1364.12, "text": " building out this interface of voice into AI, which has seemed to become more of a norm.", "tokens": [50796, 2390, 484, 341, 9226, 295, 3177, 666, 7318, 11, 597, 575, 6576, 281, 1813, 544, 295, 257, 2026, 13, 51192], "temperature": 0.0, "avg_logprob": -0.08509170248153362, "compression_ratio": 1.603448275862069, "no_speech_prob": 0.019396161660552025}, {"id": 331, "seek": 134756, "start": 1364.6799999999998, "end": 1370.6, "text": " I still don't do it, but my impression is talking to AI is going to become more and more normal.", "tokens": [51220, 286, 920, 500, 380, 360, 309, 11, 457, 452, 9995, 307, 1417, 281, 7318, 307, 516, 281, 1813, 544, 293, 544, 2710, 13, 51516], "temperature": 0.0, "avg_logprob": -0.08509170248153362, "compression_ratio": 1.603448275862069, "no_speech_prob": 0.019396161660552025}, {"id": 332, "seek": 134756, "start": 1371.32, "end": 1376.52, "text": " And this will be one of the drivers of it, like having an easy way to build that for whatever", "tokens": [51552, 400, 341, 486, 312, 472, 295, 264, 11590, 295, 309, 11, 411, 1419, 364, 1858, 636, 281, 1322, 300, 337, 2035, 51812], "temperature": 0.0, "avg_logprob": -0.08509170248153362, "compression_ratio": 1.603448275862069, "no_speech_prob": 0.019396161660552025}, {"id": 333, "seek": 137652, "start": 1376.52, "end": 1381.08, "text": " application you have in mind. Yeah, it's also one of the big structural", "tokens": [50364, 3861, 291, 362, 294, 1575, 13, 865, 11, 309, 311, 611, 472, 295, 264, 955, 15067, 50592], "temperature": 0.0, "avg_logprob": -0.18158421796910904, "compression_ratio": 1.703125, "no_speech_prob": 0.002018651459366083}, {"id": 334, "seek": 137652, "start": 1381.08, "end": 1385.72, "text": " advantages that Google has is they've kind of maintained their lead on multi-modality.", "tokens": [50592, 14906, 300, 3329, 575, 307, 436, 600, 733, 295, 17578, 641, 1477, 322, 4825, 12, 8014, 1860, 13, 50824], "temperature": 0.0, "avg_logprob": -0.18158421796910904, "compression_ratio": 1.703125, "no_speech_prob": 0.002018651459366083}, {"id": 335, "seek": 137652, "start": 1385.72, "end": 1390.28, "text": " I mean, alongside OpenAI, this is really one of the areas that Google started to differentiate", "tokens": [50824, 286, 914, 11, 12385, 7238, 48698, 11, 341, 307, 534, 472, 295, 264, 3179, 300, 3329, 1409, 281, 23203, 51052], "temperature": 0.0, "avg_logprob": -0.18158421796910904, "compression_ratio": 1.703125, "no_speech_prob": 0.002018651459366083}, {"id": 336, "seek": 137652, "start": 1390.28, "end": 1395.16, "text": " itself. The starting is far back as, oh God, what was it got it, right? Like, multi-modality has", "tokens": [51052, 2564, 13, 440, 2891, 307, 1400, 646, 382, 11, 1954, 1265, 11, 437, 390, 309, 658, 309, 11, 558, 30, 1743, 11, 4825, 12, 8014, 1860, 575, 51296], "temperature": 0.0, "avg_logprob": -0.18158421796910904, "compression_ratio": 1.703125, "no_speech_prob": 0.002018651459366083}, {"id": 337, "seek": 137652, "start": 1395.16, "end": 1400.2, "text": " been their big play, this idea of positive transfer. And so not surprising that they're at the", "tokens": [51296, 668, 641, 955, 862, 11, 341, 1558, 295, 3353, 5003, 13, 400, 370, 406, 8830, 300, 436, 434, 412, 264, 51548], "temperature": 0.0, "avg_logprob": -0.18158421796910904, "compression_ratio": 1.703125, "no_speech_prob": 0.002018651459366083}, {"id": 338, "seek": 137652, "start": 1400.2, "end": 1405.72, "text": " gate leading yet again on especially the API side of things that is going to be, if you're going to", "tokens": [51548, 8539, 5775, 1939, 797, 322, 2318, 264, 9362, 1252, 295, 721, 300, 307, 516, 281, 312, 11, 498, 291, 434, 516, 281, 51824], "temperature": 0.0, "avg_logprob": -0.18158421796910904, "compression_ratio": 1.703125, "no_speech_prob": 0.002018651459366083}, {"id": 339, "seek": 140572, "start": 1405.72, "end": 1410.84, "text": " build using these modalities, like this is looking like a pretty strong default option right now.", "tokens": [50364, 1322, 1228, 613, 1072, 16110, 11, 411, 341, 307, 1237, 411, 257, 1238, 2068, 7576, 3614, 558, 586, 13, 50620], "temperature": 0.0, "avg_logprob": -0.21127377081354823, "compression_ratio": 1.6, "no_speech_prob": 0.00690127769485116}, {"id": 340, "seek": 140572, "start": 1410.84, "end": 1415.08, "text": " So yeah, we're really interesting move and we'll see if they can maintain that lead too.", "tokens": [50620, 407, 1338, 11, 321, 434, 534, 1880, 1286, 293, 321, 603, 536, 498, 436, 393, 6909, 300, 1477, 886, 13, 50832], "temperature": 0.0, "avg_logprob": -0.21127377081354823, "compression_ratio": 1.6, "no_speech_prob": 0.00690127769485116}, {"id": 341, "seek": 140572, "start": 1415.08, "end": 1419.08, "text": " Because other labs will be pushing that direction. At a certain point, you're going to see a", "tokens": [50832, 1436, 661, 20339, 486, 312, 7380, 300, 3513, 13, 1711, 257, 1629, 935, 11, 291, 434, 516, 281, 536, 257, 51032], "temperature": 0.0, "avg_logprob": -0.21127377081354823, "compression_ratio": 1.6, "no_speech_prob": 0.00690127769485116}, {"id": 342, "seek": 140572, "start": 1419.08, "end": 1424.28, "text": " land grab and everybody's bleeding into each other's domains. Next up, another sort of low", "tokens": [51032, 2117, 4444, 293, 2201, 311, 19312, 666, 1184, 661, 311, 25514, 13, 3087, 493, 11, 1071, 1333, 295, 2295, 51292], "temperature": 0.0, "avg_logprob": -0.21127377081354823, "compression_ratio": 1.6, "no_speech_prob": 0.00690127769485116}, {"id": 343, "seek": 140572, "start": 1424.28, "end": 1429.8, "text": " impact story on FropPick has announced that cloud code subscribers will need to pay extra for", "tokens": [51292, 2712, 1657, 322, 1526, 404, 47, 618, 575, 7548, 300, 4588, 3089, 11092, 486, 643, 281, 1689, 2857, 337, 51568], "temperature": 0.0, "avg_logprob": -0.21127377081354823, "compression_ratio": 1.6, "no_speech_prob": 0.00690127769485116}, {"id": 344, "seek": 142980, "start": 1429.8799999999999, "end": 1437.08, "text": " OpenClaw usage. This is kind of in line with hosted developments around access to a", "tokens": [50368, 7238, 34, 5901, 14924, 13, 639, 307, 733, 295, 294, 1622, 365, 19204, 20862, 926, 2105, 281, 257, 50728], "temperature": 0.0, "avg_logprob": -0.16109355636264966, "compression_ratio": 1.5450643776824033, "no_speech_prob": 0.005640080664306879}, {"id": 345, "seek": 142980, "start": 1437.08, "end": 1444.2, "text": " cloud code. I believe earlier, we were also other restrictions on sort of harness access.", "tokens": [50728, 4588, 3089, 13, 286, 1697, 3071, 11, 321, 645, 611, 661, 14191, 322, 1333, 295, 19700, 2105, 13, 51084], "temperature": 0.0, "avg_logprob": -0.16109355636264966, "compression_ratio": 1.5450643776824033, "no_speech_prob": 0.005640080664306879}, {"id": 346, "seek": 142980, "start": 1444.2, "end": 1449.8, "text": " So just as if you're paying for a subscription access of like $20 per month, $200 per month,", "tokens": [51084, 407, 445, 382, 498, 291, 434, 6229, 337, 257, 17231, 2105, 295, 411, 1848, 2009, 680, 1618, 11, 1848, 7629, 680, 1618, 11, 51364], "temperature": 0.0, "avg_logprob": -0.16109355636264966, "compression_ratio": 1.5450643776824033, "no_speech_prob": 0.005640080664306879}, {"id": 347, "seek": 142980, "start": 1450.36, "end": 1457.48, "text": " it used to be that you could use that to power up a non-cloud code application like OpenClaw.", "tokens": [51392, 309, 1143, 281, 312, 300, 291, 727, 764, 300, 281, 1347, 493, 257, 2107, 12, 44495, 3089, 3861, 411, 7238, 34, 5901, 13, 51748], "temperature": 0.0, "avg_logprob": -0.16109355636264966, "compression_ratio": 1.5450643776824033, "no_speech_prob": 0.005640080664306879}, {"id": 348, "seek": 145748, "start": 1458.1200000000001, "end": 1465.16, "text": " And now that is not allowed. You can still use cloud. It's just that you need to pay for the API", "tokens": [50396, 400, 586, 300, 307, 406, 4350, 13, 509, 393, 920, 764, 4588, 13, 467, 311, 445, 300, 291, 643, 281, 1689, 337, 264, 9362, 50748], "temperature": 0.0, "avg_logprob": -0.15737777170927628, "compression_ratio": 1.5258964143426295, "no_speech_prob": 0.005553612019866705}, {"id": 349, "seek": 145748, "start": 1465.16, "end": 1472.76, "text": " that charges you per token instead of having a subscription price that very clearly you can", "tokens": [50748, 300, 12235, 291, 680, 14862, 2602, 295, 1419, 257, 17231, 3218, 300, 588, 4448, 291, 393, 51128], "temperature": 0.0, "avg_logprob": -0.15737777170927628, "compression_ratio": 1.5258964143426295, "no_speech_prob": 0.005553612019866705}, {"id": 350, "seek": 145748, "start": 1472.76, "end": 1479.4, "text": " run up a bill way beyond what you're paying for $200 per month. You can easily burn through thousands", "tokens": [51128, 1190, 493, 257, 2961, 636, 4399, 437, 291, 434, 6229, 337, 1848, 7629, 680, 1618, 13, 509, 393, 3612, 5064, 807, 5383, 51460], "temperature": 0.0, "avg_logprob": -0.15737777170927628, "compression_ratio": 1.5258964143426295, "no_speech_prob": 0.005553612019866705}, {"id": 351, "seek": 145748, "start": 1479.4, "end": 1486.1200000000001, "text": " of dollars. And yeah, there's been again, a host of like announcements similar to this where", "tokens": [51460, 295, 3808, 13, 400, 1338, 11, 456, 311, 668, 797, 11, 257, 3975, 295, 411, 23785, 2531, 281, 341, 689, 51796], "temperature": 0.0, "avg_logprob": -0.15737777170927628, "compression_ratio": 1.5258964143426295, "no_speech_prob": 0.005553612019866705}, {"id": 352, "seek": 148612, "start": 1486.84, "end": 1494.04, "text": " FropPick is tightening up restrictions. I expect because they've seen a massive influx of users.", "tokens": [50400, 1526, 404, 47, 618, 307, 42217, 493, 14191, 13, 286, 2066, 570, 436, 600, 1612, 257, 5994, 9922, 2449, 295, 5022, 13, 50760], "temperature": 0.0, "avg_logprob": -0.19240465578825577, "compression_ratio": 1.571917808219178, "no_speech_prob": 0.0015007253969088197}, {"id": 353, "seek": 148612, "start": 1494.04, "end": 1499.1599999999999, "text": " And now they actually need to start worrying about burning cash, especially with things like", "tokens": [50760, 400, 586, 436, 767, 643, 281, 722, 18788, 466, 9488, 6388, 11, 2318, 365, 721, 411, 51016], "temperature": 0.0, "avg_logprob": -0.19240465578825577, "compression_ratio": 1.571917808219178, "no_speech_prob": 0.0015007253969088197}, {"id": 354, "seek": 148612, "start": 1499.1599999999999, "end": 1505.8, "text": " OpenClaw where it's like 24, 7 agents that are supposed to be just burning through tokens.", "tokens": [51016, 7238, 34, 5901, 689, 309, 311, 411, 4022, 11, 1614, 12554, 300, 366, 3442, 281, 312, 445, 9488, 807, 22667, 13, 51348], "temperature": 0.0, "avg_logprob": -0.19240465578825577, "compression_ratio": 1.571917808219178, "no_speech_prob": 0.0015007253969088197}, {"id": 355, "seek": 148612, "start": 1505.8, "end": 1511.4799999999998, "text": " Not stop. Yeah. You know, some people are a bit peeved at on FropPick sort of changing things up", "tokens": [51348, 1726, 1590, 13, 865, 13, 509, 458, 11, 512, 561, 366, 257, 857, 21343, 937, 412, 322, 1526, 404, 47, 618, 1333, 295, 4473, 721, 493, 51632], "temperature": 0.0, "avg_logprob": -0.19240465578825577, "compression_ratio": 1.571917808219178, "no_speech_prob": 0.0015007253969088197}, {"id": 356, "seek": 148612, "start": 1511.4799999999998, "end": 1515.7199999999998, "text": " and not having a clear policy around all this. But it does indicate where we are,", "tokens": [51632, 293, 406, 1419, 257, 1850, 3897, 926, 439, 341, 13, 583, 309, 775, 13330, 689, 321, 366, 11, 51844], "temperature": 0.0, "avg_logprob": -0.19240465578825577, "compression_ratio": 1.571917808219178, "no_speech_prob": 0.0015007253969088197}, {"id": 357, "seek": 151572, "start": 1516.52, "end": 1522.3600000000001, "text": " the free launch that many of us have been enjoying in terms of being subsidized effectively", "tokens": [50404, 264, 1737, 4025, 300, 867, 295, 505, 362, 668, 9929, 294, 2115, 295, 885, 20051, 1602, 8659, 50696], "temperature": 0.0, "avg_logprob": -0.18976584866515592, "compression_ratio": 1.5684931506849316, "no_speech_prob": 0.0009546713554300368}, {"id": 358, "seek": 151572, "start": 1522.3600000000001, "end": 1529.48, "text": " to use AI for cheaper is maybe not going to be sticking around too much longer.", "tokens": [50696, 281, 764, 7318, 337, 12284, 307, 1310, 406, 516, 281, 312, 13465, 926, 886, 709, 2854, 13, 51052], "temperature": 0.0, "avg_logprob": -0.18976584866515592, "compression_ratio": 1.5684931506849316, "no_speech_prob": 0.0009546713554300368}, {"id": 359, "seek": 151572, "start": 1529.48, "end": 1533.32, "text": " Yeah. I mean, this is like a completely unsustainable all you can eat buffet, right? Like", "tokens": [51052, 865, 13, 286, 914, 11, 341, 307, 411, 257, 2584, 2693, 27219, 712, 439, 291, 393, 1862, 42904, 11, 558, 30, 1743, 51244], "temperature": 0.0, "avg_logprob": -0.18976584866515592, "compression_ratio": 1.5684931506849316, "no_speech_prob": 0.0009546713554300368}, {"id": 360, "seek": 151572, "start": 1534.44, "end": 1538.6000000000001, "text": " this could not possibly last. And I think in Theroppyc, you know, or in the awkward position where", "tokens": [51300, 341, 727, 406, 6264, 1036, 13, 400, 286, 519, 294, 334, 260, 404, 8200, 66, 11, 291, 458, 11, 420, 294, 264, 11411, 2535, 689, 51508], "temperature": 0.0, "avg_logprob": -0.18976584866515592, "compression_ratio": 1.5684931506849316, "no_speech_prob": 0.0009546713554300368}, {"id": 361, "seek": 151572, "start": 1538.6000000000001, "end": 1543.96, "text": " they have to walk this back, yes, look, it's also the case that there's a timing issue here where", "tokens": [51508, 436, 362, 281, 1792, 341, 646, 11, 2086, 11, 574, 11, 309, 311, 611, 264, 1389, 300, 456, 311, 257, 10822, 2734, 510, 689, 51776], "temperature": 0.0, "avg_logprob": -0.18976584866515592, "compression_ratio": 1.5684931506849316, "no_speech_prob": 0.0009546713554300368}, {"id": 362, "seek": 154396, "start": 1543.96, "end": 1550.1200000000001, "text": " OpenClaw's creator, right? Peter Steinberger just joined OpenAI. And that kind of makes OpenClaw", "tokens": [50364, 7238, 34, 5901, 311, 14181, 11, 558, 30, 6508, 29453, 42226, 445, 6869, 7238, 48698, 13, 400, 300, 733, 295, 1669, 7238, 34, 5901, 50672], "temperature": 0.0, "avg_logprob": -0.14902555025540865, "compression_ratio": 1.597902097902098, "no_speech_prob": 0.0013884434010833502}, {"id": 363, "seek": 154396, "start": 1550.1200000000001, "end": 1555.48, "text": " an open source project that's backed by direct competitor. And well, you know, in that context,", "tokens": [50672, 364, 1269, 4009, 1716, 300, 311, 20391, 538, 2047, 27266, 13, 400, 731, 11, 291, 458, 11, 294, 300, 4319, 11, 50940], "temperature": 0.0, "avg_logprob": -0.14902555025540865, "compression_ratio": 1.597902097902098, "no_speech_prob": 0.0013884434010833502}, {"id": 364, "seek": 154396, "start": 1555.48, "end": 1559.88, "text": " are you really going to maintain what is effectively a subsidy for OpenClaw usage?", "tokens": [50940, 366, 291, 534, 516, 281, 6909, 437, 307, 8659, 257, 49636, 337, 7238, 34, 5901, 14924, 30, 51160], "temperature": 0.0, "avg_logprob": -0.14902555025540865, "compression_ratio": 1.597902097902098, "no_speech_prob": 0.0013884434010833502}, {"id": 365, "seek": 154396, "start": 1560.68, "end": 1564.52, "text": " Maybe, maybe you won't. I mean, like, you know, I'd be surprised if that were to continue", "tokens": [51200, 2704, 11, 1310, 291, 1582, 380, 13, 286, 914, 11, 411, 11, 291, 458, 11, 286, 1116, 312, 6100, 498, 300, 645, 281, 2354, 51392], "temperature": 0.0, "avg_logprob": -0.14902555025540865, "compression_ratio": 1.597902097902098, "no_speech_prob": 0.0013884434010833502}, {"id": 366, "seek": 154396, "start": 1564.52, "end": 1569.48, "text": " independent of just this like free lunch or not free lunch. But like all you can eat buffet", "tokens": [51392, 6695, 295, 445, 341, 411, 1737, 6349, 420, 406, 1737, 6349, 13, 583, 411, 439, 291, 393, 1862, 42904, 51640], "temperature": 0.0, "avg_logprob": -0.14902555025540865, "compression_ratio": 1.597902097902098, "no_speech_prob": 0.0013884434010833502}, {"id": 367, "seek": 156948, "start": 1569.56, "end": 1574.44, "text": " economic issue, it just does not work when you have such a disparity in usage, right? You got some", "tokens": [50368, 4836, 2734, 11, 309, 445, 775, 406, 589, 562, 291, 362, 1270, 257, 47415, 294, 14924, 11, 558, 30, 509, 658, 512, 50612], "temperature": 0.0, "avg_logprob": -0.121563747011382, "compression_ratio": 1.6871345029239766, "no_speech_prob": 0.0064868442714214325}, {"id": 368, "seek": 156948, "start": 1574.44, "end": 1578.6, "text": " people who are just going to use it for, you know, anyway, more, much more lightweight stuff.", "tokens": [50612, 561, 567, 366, 445, 516, 281, 764, 309, 337, 11, 291, 458, 11, 4033, 11, 544, 11, 709, 544, 22052, 1507, 13, 50820], "temperature": 0.0, "avg_logprob": -0.121563747011382, "compression_ratio": 1.6871345029239766, "no_speech_prob": 0.0064868442714214325}, {"id": 369, "seek": 156948, "start": 1578.6, "end": 1584.3600000000001, "text": " And then your power users could just bleed you dry, right? So in that world where you have a long", "tokens": [50820, 400, 550, 428, 1347, 5022, 727, 445, 28385, 291, 4016, 11, 558, 30, 407, 294, 300, 1002, 689, 291, 362, 257, 938, 51108], "temperature": 0.0, "avg_logprob": -0.121563747011382, "compression_ratio": 1.6871345029239766, "no_speech_prob": 0.0064868442714214325}, {"id": 370, "seek": 156948, "start": 1584.3600000000001, "end": 1589.4, "text": " tail distribution of usage, you just can't go with a one size fits all approach. And that's", "tokens": [51108, 6838, 7316, 295, 14924, 11, 291, 445, 393, 380, 352, 365, 257, 472, 2744, 9001, 439, 3109, 13, 400, 300, 311, 51360], "temperature": 0.0, "avg_logprob": -0.121563747011382, "compression_ratio": 1.6871345029239766, "no_speech_prob": 0.0064868442714214325}, {"id": 371, "seek": 156948, "start": 1589.4, "end": 1593.72, "text": " what Anthropics learning. They're being very open about it. Like, it seems to their credit like a", "tokens": [51360, 437, 12727, 1513, 1167, 2539, 13, 814, 434, 885, 588, 1269, 466, 309, 13, 1743, 11, 309, 2544, 281, 641, 5397, 411, 257, 51576], "temperature": 0.0, "avg_logprob": -0.121563747011382, "compression_ratio": 1.6871345029239766, "no_speech_prob": 0.0064868442714214325}, {"id": 372, "seek": 156948, "start": 1593.72, "end": 1598.1200000000001, "text": " very transparent move that they're pulling. But the reason is very believable, but it's going to", "tokens": [51576, 588, 12737, 1286, 300, 436, 434, 8407, 13, 583, 264, 1778, 307, 588, 1351, 17915, 11, 457, 309, 311, 516, 281, 51796], "temperature": 0.0, "avg_logprob": -0.121563747011382, "compression_ratio": 1.6871345029239766, "no_speech_prob": 0.0064868442714214325}, {"id": 373, "seek": 159812, "start": 1598.12, "end": 1601.8, "text": " lead to frustrated developers. No question. And then that's the cost of doing business.", "tokens": [50364, 1477, 281, 15751, 8849, 13, 883, 1168, 13, 400, 550, 300, 311, 264, 2063, 295, 884, 1606, 13, 50548], "temperature": 0.0, "avg_logprob": -0.2210592269897461, "compression_ratio": 1.6832740213523132, "no_speech_prob": 0.0035353866405785084}, {"id": 374, "seek": 159812, "start": 1601.8, "end": 1606.84, "text": " And I think this actually is like pretty easily defendable. The more frustrating thing, which", "tokens": [50548, 400, 286, 519, 341, 767, 307, 411, 1238, 3612, 8602, 712, 13, 440, 544, 16522, 551, 11, 597, 50800], "temperature": 0.0, "avg_logprob": -0.2210592269897461, "compression_ratio": 1.6832740213523132, "no_speech_prob": 0.0035353866405785084}, {"id": 375, "seek": 159812, "start": 1606.84, "end": 1614.04, "text": " we, there's no like new story attached to it. But if you're following it, the usage limits for", "tokens": [50800, 321, 11, 456, 311, 572, 411, 777, 1657, 8570, 281, 309, 13, 583, 498, 291, 434, 3480, 309, 11, 264, 14924, 10406, 337, 51160], "temperature": 0.0, "avg_logprob": -0.2210592269897461, "compression_ratio": 1.6832740213523132, "no_speech_prob": 0.0035353866405785084}, {"id": 376, "seek": 159812, "start": 1614.04, "end": 1620.4399999999998, "text": " different subcurefantiers have been sort of fluctuating. So developers have been seeing reporting", "tokens": [51160, 819, 1422, 66, 540, 69, 394, 4890, 362, 668, 1333, 295, 23448, 32438, 13, 407, 8849, 362, 668, 2577, 10031, 51480], "temperature": 0.0, "avg_logprob": -0.2210592269897461, "compression_ratio": 1.6832740213523132, "no_speech_prob": 0.0035353866405785084}, {"id": 377, "seek": 159812, "start": 1620.4399999999998, "end": 1625.32, "text": " that they use up where usage much quicker. They have been announcements from the team that they're", "tokens": [51480, 300, 436, 764, 493, 689, 14924, 709, 16255, 13, 814, 362, 668, 23785, 490, 264, 1469, 300, 436, 434, 51724], "temperature": 0.0, "avg_logprob": -0.2210592269897461, "compression_ratio": 1.6832740213523132, "no_speech_prob": 0.0035353866405785084}, {"id": 378, "seek": 162532, "start": 1625.32, "end": 1631.72, "text": " tightening up usage bounds for like peak times, et cetera. It's maybe clear that Anthropics is", "tokens": [50364, 42217, 493, 14924, 29905, 337, 411, 10651, 1413, 11, 1030, 11458, 13, 467, 311, 1310, 1850, 300, 12727, 1513, 1167, 307, 50684], "temperature": 0.0, "avg_logprob": -0.19124927520751953, "compression_ratio": 1.6282051282051282, "no_speech_prob": 0.005218167789280415}, {"id": 379, "seek": 162532, "start": 1631.72, "end": 1638.28, "text": " under heavy compute load. There are in for seems to be struggling. And it's causing frustration.", "tokens": [50684, 833, 4676, 14722, 3677, 13, 821, 366, 294, 337, 2544, 281, 312, 9314, 13, 400, 309, 311, 9853, 20491, 13, 51012], "temperature": 0.0, "avg_logprob": -0.19124927520751953, "compression_ratio": 1.6282051282051282, "no_speech_prob": 0.005218167789280415}, {"id": 380, "seek": 162532, "start": 1638.28, "end": 1644.2, "text": " And they're having to like pull these things of actually tending up usage bounds, you know,", "tokens": [51012, 400, 436, 434, 1419, 281, 411, 2235, 613, 721, 295, 767, 256, 2029, 493, 14924, 29905, 11, 291, 458, 11, 51308], "temperature": 0.0, "avg_logprob": -0.19124927520751953, "compression_ratio": 1.6282051282051282, "no_speech_prob": 0.005218167789280415}, {"id": 381, "seek": 162532, "start": 1644.2, "end": 1651.8799999999999, "text": " removing access to free buffet options like you said for this. And it all points to the direction", "tokens": [51308, 12720, 2105, 281, 1737, 42904, 3956, 411, 291, 848, 337, 341, 13, 400, 309, 439, 2793, 281, 264, 3513, 51692], "temperature": 0.0, "avg_logprob": -0.19124927520751953, "compression_ratio": 1.6282051282051282, "no_speech_prob": 0.005218167789280415}, {"id": 382, "seek": 165188, "start": 1651.88, "end": 1658.8400000000001, "text": " of, you know, at some point, the tech policy of subsidizing users to acquire users and gain", "tokens": [50364, 295, 11, 291, 458, 11, 412, 512, 935, 11, 264, 7553, 3897, 295, 20051, 3319, 5022, 281, 20001, 5022, 293, 6052, 50712], "temperature": 0.0, "avg_logprob": -0.2247781753540039, "compression_ratio": 1.542857142857143, "no_speech_prob": 0.0028890492394566536}, {"id": 383, "seek": 165188, "start": 1658.8400000000001, "end": 1666.92, "text": " market share is going to start moving away. And it might be happening sooner than some of us may", "tokens": [50712, 2142, 2073, 307, 516, 281, 722, 2684, 1314, 13, 400, 309, 1062, 312, 2737, 15324, 813, 512, 295, 505, 815, 51116], "temperature": 0.0, "avg_logprob": -0.2247781753540039, "compression_ratio": 1.542857142857143, "no_speech_prob": 0.0028890492394566536}, {"id": 384, "seek": 165188, "start": 1666.92, "end": 1671.72, "text": " like. Yeah. And I think there's a great door cash podcast with Dariel where he talks about the", "tokens": [51116, 411, 13, 865, 13, 400, 286, 519, 456, 311, 257, 869, 2853, 6388, 7367, 365, 7803, 1187, 689, 415, 6686, 466, 264, 51356], "temperature": 0.0, "avg_logprob": -0.2247781753540039, "compression_ratio": 1.542857142857143, "no_speech_prob": 0.0028890492394566536}, {"id": 385, "seek": 165188, "start": 1671.72, "end": 1677.24, "text": " timing of scaling, right? Like when do you go for that next giga lot or next 10 giga lots now?", "tokens": [51356, 10822, 295, 21589, 11, 558, 30, 1743, 562, 360, 291, 352, 337, 300, 958, 8741, 64, 688, 420, 958, 1266, 8741, 64, 3195, 586, 30, 51632], "temperature": 0.0, "avg_logprob": -0.2247781753540039, "compression_ratio": 1.542857142857143, "no_speech_prob": 0.0028890492394566536}, {"id": 386, "seek": 167724, "start": 1677.32, "end": 1682.28, "text": " And how you think about the distribution between training and inference budgets. That's really worth", "tokens": [50368, 400, 577, 291, 519, 466, 264, 7316, 1296, 3097, 293, 38253, 26708, 13, 663, 311, 534, 3163, 50616], "temperature": 0.0, "avg_logprob": -0.19898152351379395, "compression_ratio": 1.6462395543175488, "no_speech_prob": 0.003706743475049734}, {"id": 387, "seek": 167724, "start": 1682.28, "end": 1686.44, "text": " checking out because it really does explain the situation Anthropic is in right now. You know,", "tokens": [50616, 8568, 484, 570, 309, 534, 775, 2903, 264, 2590, 12727, 1513, 299, 307, 294, 558, 586, 13, 509, 458, 11, 50824], "temperature": 0.0, "avg_logprob": -0.19898152351379395, "compression_ratio": 1.6462395543175488, "no_speech_prob": 0.003706743475049734}, {"id": 388, "seek": 167724, "start": 1686.44, "end": 1692.2, "text": " you kind of don't want to lean out too far. Opening I arguably has, right? We're going to find out", "tokens": [50824, 291, 733, 295, 500, 380, 528, 281, 11659, 484, 886, 1400, 13, 41137, 286, 26771, 575, 11, 558, 30, 492, 434, 516, 281, 915, 484, 51112], "temperature": 0.0, "avg_logprob": -0.19898152351379395, "compression_ratio": 1.6462395543175488, "no_speech_prob": 0.003706743475049734}, {"id": 389, "seek": 167724, "start": 1692.2, "end": 1696.52, "text": " pretty damn soon. If they're overlavered, she'll the compute side, but certainly Sam's been a lot more", "tokens": [51112, 1238, 8151, 2321, 13, 759, 436, 434, 670, 875, 331, 292, 11, 750, 603, 264, 14722, 1252, 11, 457, 3297, 4832, 311, 668, 257, 688, 544, 51328], "temperature": 0.0, "avg_logprob": -0.19898152351379395, "compression_ratio": 1.6462395543175488, "no_speech_prob": 0.003706743475049734}, {"id": 390, "seek": 167724, "start": 1696.52, "end": 1700.76, "text": " aggressive than Dariel just in terms of raw compute. Why up again, consistent with a company that", "tokens": [51328, 10762, 813, 7803, 1187, 445, 294, 2115, 295, 8936, 14722, 13, 1545, 493, 797, 11, 8398, 365, 257, 2237, 300, 51540], "temperature": 0.0, "avg_logprob": -0.19898152351379395, "compression_ratio": 1.6462395543175488, "no_speech_prob": 0.003706743475049734}, {"id": 391, "seek": 167724, "start": 1700.76, "end": 1705.96, "text": " goes direct to consumer too, right? That's a difference as well. Opening I has a field far more", "tokens": [51540, 1709, 2047, 281, 9711, 886, 11, 558, 30, 663, 311, 257, 2649, 382, 731, 13, 41137, 286, 575, 257, 2519, 1400, 544, 51800], "temperature": 0.0, "avg_logprob": -0.19898152351379395, "compression_ratio": 1.6462395543175488, "no_speech_prob": 0.003706743475049734}, {"id": 392, "seek": 170596, "start": 1705.96, "end": 1712.8400000000001, "text": " lower quality or lower ROI queries than Anthropic. And so it's just not in Anthropics DNA in the same", "tokens": [50364, 3126, 3125, 420, 3126, 49808, 24109, 813, 12727, 1513, 299, 13, 400, 370, 309, 311, 445, 406, 294, 12727, 1513, 1167, 8272, 294, 264, 912, 50708], "temperature": 0.0, "avg_logprob": -0.17588721381293404, "compression_ratio": 1.5685483870967742, "no_speech_prob": 0.013014138676226139}, {"id": 393, "seek": 170596, "start": 1712.8400000000001, "end": 1716.8400000000001, "text": " way. Magnum mistake. I mean, they're aggressively scaling. Everybody's aggressively scaling.", "tokens": [50708, 636, 13, 19664, 449, 6146, 13, 286, 914, 11, 436, 434, 32024, 21589, 13, 7646, 311, 32024, 21589, 13, 50908], "temperature": 0.0, "avg_logprob": -0.17588721381293404, "compression_ratio": 1.5685483870967742, "no_speech_prob": 0.013014138676226139}, {"id": 394, "seek": 170596, "start": 1716.8400000000001, "end": 1723.32, "text": " It's just a matter of how much and why. And speaking of opening I next up an update on something we", "tokens": [50908, 467, 311, 445, 257, 1871, 295, 577, 709, 293, 983, 13, 400, 4124, 295, 5193, 286, 958, 493, 364, 5623, 322, 746, 321, 51232], "temperature": 0.0, "avg_logprob": -0.17588721381293404, "compression_ratio": 1.5685483870967742, "no_speech_prob": 0.013014138676226139}, {"id": 395, "seek": 170596, "start": 1723.32, "end": 1731.88, "text": " touched on previously. Opening I is abandoning its adult mode for chat GPT. So we now have the", "tokens": [51232, 9828, 322, 8046, 13, 41137, 286, 307, 9072, 278, 1080, 5075, 4391, 337, 5081, 26039, 51, 13, 407, 321, 586, 362, 264, 51660], "temperature": 0.0, "avg_logprob": -0.17588721381293404, "compression_ratio": 1.5685483870967742, "no_speech_prob": 0.013014138676226139}, {"id": 396, "seek": 173188, "start": 1731.88, "end": 1739.24, "text": " official announcement that this NSFW erotic thing last time we reported that it was like not", "tokens": [50364, 4783, 12847, 300, 341, 15943, 37, 54, 1189, 9411, 551, 1036, 565, 321, 7055, 300, 309, 390, 411, 406, 50732], "temperature": 0.0, "avg_logprob": -0.21791822645399306, "compression_ratio": 1.6233766233766234, "no_speech_prob": 0.0024332969915121794}, {"id": 397, "seek": 173188, "start": 1739.24, "end": 1745.24, "text": " canceled officially. It was delayed. Now it is canceled officially. And this of course comes", "tokens": [50732, 24839, 12053, 13, 467, 390, 20268, 13, 823, 309, 307, 24839, 12053, 13, 400, 341, 295, 1164, 1487, 51032], "temperature": 0.0, "avg_logprob": -0.21791822645399306, "compression_ratio": 1.6233766233766234, "no_speech_prob": 0.0024332969915121794}, {"id": 398, "seek": 173188, "start": 1745.24, "end": 1751.0, "text": " after they've also asked Sora. So it seems to be an ever indicator of a strategic shift to", "tokens": [51032, 934, 436, 600, 611, 2351, 46639, 13, 407, 309, 2544, 281, 312, 364, 1562, 16961, 295, 257, 10924, 5513, 281, 51320], "temperature": 0.0, "avg_logprob": -0.21791822645399306, "compression_ratio": 1.6233766233766234, "no_speech_prob": 0.0024332969915121794}, {"id": 399, "seek": 173188, "start": 1751.0, "end": 1758.2, "text": " have been open AI to sort of focus up and kill some of these like side bets and esoteric projects.", "tokens": [51320, 362, 668, 1269, 7318, 281, 1333, 295, 1879, 493, 293, 1961, 512, 295, 613, 411, 1252, 39922, 293, 785, 21585, 299, 4455, 13, 51680], "temperature": 0.0, "avg_logprob": -0.21791822645399306, "compression_ratio": 1.6233766233766234, "no_speech_prob": 0.0024332969915121794}, {"id": 400, "seek": 175820, "start": 1759.0, "end": 1765.4, "text": " And on to Microsoft, they also have kind of lower hype, let's say, but some", "tokens": [50404, 400, 322, 281, 8116, 11, 436, 611, 362, 733, 295, 3126, 24144, 11, 718, 311, 584, 11, 457, 512, 50724], "temperature": 0.0, "avg_logprob": -0.2613407726019201, "compression_ratio": 1.4754098360655739, "no_speech_prob": 0.030626775696873665}, {"id": 401, "seek": 175820, "start": 1765.4, "end": 1774.3600000000001, "text": " notable development. They have released three new foundational models related to both images and", "tokens": [50724, 22556, 3250, 13, 814, 362, 4736, 1045, 777, 32195, 5245, 4077, 281, 1293, 5267, 293, 51172], "temperature": 0.0, "avg_logprob": -0.2613407726019201, "compression_ratio": 1.4754098360655739, "no_speech_prob": 0.030626775696873665}, {"id": 402, "seek": 175820, "start": 1774.3600000000001, "end": 1780.6000000000001, "text": " audio. They have M.A.I. Transcribe one, which is speech to text M.A.I. voice one audio generation", "tokens": [51172, 6278, 13, 814, 362, 376, 13, 32, 13, 40, 13, 6531, 8056, 472, 11, 597, 307, 6218, 281, 2487, 376, 13, 32, 13, 40, 13, 3177, 472, 6278, 5125, 51484], "temperature": 0.0, "avg_logprob": -0.2613407726019201, "compression_ratio": 1.4754098360655739, "no_speech_prob": 0.030626775696873665}, {"id": 403, "seek": 178060, "start": 1780.6, "end": 1789.48, "text": " and M.A.I. image two, which is image generation. And this is from the M.A.I. super intelligence team", "tokens": [50364, 293, 376, 13, 32, 13, 40, 13, 3256, 732, 11, 597, 307, 3256, 5125, 13, 400, 341, 307, 490, 264, 376, 13, 32, 13, 40, 13, 1687, 7599, 1469, 50808], "temperature": 0.0, "avg_logprob": -0.19218968445400023, "compression_ratio": 1.5877551020408163, "no_speech_prob": 0.0052990480326116085}, {"id": 404, "seek": 178060, "start": 1789.48, "end": 1796.12, "text": " led by Microsoft AIC, your Mustafa Suleiman, which was formed in late 2025. And this was a higher", "tokens": [50808, 4684, 538, 8116, 316, 2532, 11, 428, 37229, 318, 2271, 25504, 11, 597, 390, 8693, 294, 3469, 39209, 13, 400, 341, 390, 257, 2946, 51140], "temperature": 0.0, "avg_logprob": -0.19218968445400023, "compression_ratio": 1.5877551020408163, "no_speech_prob": 0.0052990480326116085}, {"id": 405, "seek": 178060, "start": 1796.12, "end": 1802.84, "text": " from deep mind. So kind of a big deal to have things coming out of a team. And as we know,", "tokens": [51140, 490, 2452, 1575, 13, 407, 733, 295, 257, 955, 2028, 281, 362, 721, 1348, 484, 295, 257, 1469, 13, 400, 382, 321, 458, 11, 51476], "temperature": 0.0, "avg_logprob": -0.19218968445400023, "compression_ratio": 1.5877551020408163, "no_speech_prob": 0.0052990480326116085}, {"id": 406, "seek": 178060, "start": 1802.84, "end": 1810.1999999999998, "text": " Microsoft and OpenAI relationship has been growing apart. And Microsoft is poised to try to compete", "tokens": [51476, 8116, 293, 7238, 48698, 2480, 575, 668, 4194, 4936, 13, 400, 8116, 307, 714, 2640, 281, 853, 281, 11831, 51844], "temperature": 0.0, "avg_logprob": -0.19218968445400023, "compression_ratio": 1.5877551020408163, "no_speech_prob": 0.0052990480326116085}, {"id": 407, "seek": 181020, "start": 1810.2, "end": 1816.68, "text": " in this space more. So seeing them start to release more models is a decent indicator of a three", "tokens": [50364, 294, 341, 1901, 544, 13, 407, 2577, 552, 722, 281, 4374, 544, 5245, 307, 257, 8681, 16961, 295, 257, 1045, 50688], "temperature": 0.0, "avg_logprob": -0.2667993279390557, "compression_ratio": 1.5889830508474576, "no_speech_prob": 0.0019258605316281319}, {"id": 408, "seek": 181020, "start": 1816.68, "end": 1822.76, "text": " team is spinning up. And all of the occasions are these are some solid models. They're not", "tokens": [50688, 1469, 307, 15640, 493, 13, 400, 439, 295, 264, 20641, 366, 613, 366, 512, 5100, 5245, 13, 814, 434, 406, 50992], "temperature": 0.0, "avg_logprob": -0.2667993279390557, "compression_ratio": 1.5889830508474576, "no_speech_prob": 0.0019258605316281319}, {"id": 409, "seek": 181020, "start": 1823.4, "end": 1828.6000000000001, "text": " groundbreaking or leading with pack. But Microsoft having its own models on its own", "tokens": [51024, 42491, 420, 5775, 365, 2844, 13, 583, 8116, 1419, 1080, 1065, 5245, 322, 1080, 1065, 51284], "temperature": 0.0, "avg_logprob": -0.2667993279390557, "compression_ratio": 1.5889830508474576, "no_speech_prob": 0.0019258605316281319}, {"id": 410, "seek": 181020, "start": 1828.6000000000001, "end": 1834.2, "text": " infra, et cetera, that's given some competitive advantages in terms of business, you know, positioning.", "tokens": [51284, 23654, 11, 1030, 11458, 11, 300, 311, 2212, 512, 10043, 14906, 294, 2115, 295, 1606, 11, 291, 458, 11, 26381, 13, 51564], "temperature": 0.0, "avg_logprob": -0.2667993279390557, "compression_ratio": 1.5889830508474576, "no_speech_prob": 0.0019258605316281319}, {"id": 411, "seek": 183420, "start": 1835.0800000000002, "end": 1839.96, "text": " Yeah, it seems to be a price play too, right? Like the idea here is they've got a lower price point", "tokens": [50408, 865, 11, 309, 2544, 281, 312, 257, 3218, 862, 886, 11, 558, 30, 1743, 264, 1558, 510, 307, 436, 600, 658, 257, 3126, 3218, 935, 50652], "temperature": 0.0, "avg_logprob": -0.11877856338233278, "compression_ratio": 1.6091205211726385, "no_speech_prob": 0.01427754107862711}, {"id": 412, "seek": 183420, "start": 1839.96, "end": 1845.24, "text": " in general for these models than Google and OpenAI. That matters. Cost efficiency is a big deal,", "tokens": [50652, 294, 2674, 337, 613, 5245, 813, 3329, 293, 7238, 48698, 13, 663, 7001, 13, 20863, 10493, 307, 257, 955, 2028, 11, 50916], "temperature": 0.0, "avg_logprob": -0.11877856338233278, "compression_ratio": 1.6091205211726385, "no_speech_prob": 0.01427754107862711}, {"id": 413, "seek": 183420, "start": 1845.24, "end": 1850.44, "text": " especially if you're looking at the enterprise, which is what this targets. The flip side of that is", "tokens": [50916, 2318, 498, 291, 434, 1237, 412, 264, 14132, 11, 597, 307, 437, 341, 12911, 13, 440, 7929, 1252, 295, 300, 307, 51176], "temperature": 0.0, "avg_logprob": -0.11877856338233278, "compression_ratio": 1.6091205211726385, "no_speech_prob": 0.01427754107862711}, {"id": 414, "seek": 183420, "start": 1850.44, "end": 1855.16, "text": " if you're not competing at the absolute frontier of capabilities, your margin is just going to be", "tokens": [51176, 498, 291, 434, 406, 15439, 412, 264, 8236, 35853, 295, 10862, 11, 428, 10270, 307, 445, 516, 281, 312, 51412], "temperature": 0.0, "avg_logprob": -0.11877856338233278, "compression_ratio": 1.6091205211726385, "no_speech_prob": 0.01427754107862711}, {"id": 415, "seek": 183420, "start": 1855.16, "end": 1861.0800000000002, "text": " a lot lower. Now Microsoft obviously enjoys like Google, like massive massive scale infrastructure", "tokens": [51412, 257, 688, 3126, 13, 823, 8116, 2745, 29750, 411, 3329, 11, 411, 5994, 5994, 4373, 6896, 51708], "temperature": 0.0, "avg_logprob": -0.11877856338233278, "compression_ratio": 1.6091205211726385, "no_speech_prob": 0.01427754107862711}, {"id": 416, "seek": 186108, "start": 1861.1599999999999, "end": 1865.08, "text": " that can help to support this lower price point. But still, that's a tough spot. It's an", "tokens": [50368, 300, 393, 854, 281, 1406, 341, 3126, 3218, 935, 13, 583, 920, 11, 300, 311, 257, 4930, 4008, 13, 467, 311, 364, 50564], "temperature": 0.0, "avg_logprob": -0.13595298110254553, "compression_ratio": 1.7240356083086052, "no_speech_prob": 0.009857778437435627}, {"id": 417, "seek": 186108, "start": 1865.08, "end": 1869.8799999999999, "text": " awkward spot for Microsoft to be in. They do as you say, kind of lag behind. Like it's notable.", "tokens": [50564, 11411, 4008, 337, 8116, 281, 312, 294, 13, 814, 360, 382, 291, 584, 11, 733, 295, 8953, 2261, 13, 1743, 309, 311, 22556, 13, 50804], "temperature": 0.0, "avg_logprob": -0.13595298110254553, "compression_ratio": 1.7240356083086052, "no_speech_prob": 0.009857778437435627}, {"id": 418, "seek": 186108, "start": 1869.8799999999999, "end": 1873.96, "text": " You don't think when you think of the big labs, you just don't think of Microsoft today. And they're", "tokens": [50804, 509, 500, 380, 519, 562, 291, 519, 295, 264, 955, 20339, 11, 291, 445, 500, 380, 519, 295, 8116, 965, 13, 400, 436, 434, 51008], "temperature": 0.0, "avg_logprob": -0.13595298110254553, "compression_ratio": 1.7240356083086052, "no_speech_prob": 0.009857778437435627}, {"id": 419, "seek": 186108, "start": 1873.96, "end": 1878.52, "text": " obviously trying to make up for that relationship with OpenAI has degraded. OpenAI is going to AWS.", "tokens": [51008, 2745, 1382, 281, 652, 493, 337, 300, 2480, 365, 7238, 48698, 575, 24740, 292, 13, 7238, 48698, 307, 516, 281, 17650, 13, 51236], "temperature": 0.0, "avg_logprob": -0.13595298110254553, "compression_ratio": 1.7240356083086052, "no_speech_prob": 0.009857778437435627}, {"id": 420, "seek": 186108, "start": 1878.52, "end": 1883.6399999999999, "text": " OpenAI is going outside the house to Oracle and so on for their compute needs. And so now Microsoft", "tokens": [51236, 7238, 48698, 307, 516, 2380, 264, 1782, 281, 25654, 293, 370, 322, 337, 641, 14722, 2203, 13, 400, 370, 586, 8116, 51492], "temperature": 0.0, "avg_logprob": -0.13595298110254553, "compression_ratio": 1.7240356083086052, "no_speech_prob": 0.009857778437435627}, {"id": 421, "seek": 186108, "start": 1883.6399999999999, "end": 1887.72, "text": " is kind of like forced to do this. Mustafa has been at the helm too for a long time. We're sure", "tokens": [51492, 307, 733, 295, 411, 7579, 281, 360, 341, 13, 37229, 575, 668, 412, 264, 29554, 886, 337, 257, 938, 565, 13, 492, 434, 988, 51696], "temperature": 0.0, "avg_logprob": -0.13595298110254553, "compression_ratio": 1.7240356083086052, "no_speech_prob": 0.009857778437435627}, {"id": 422, "seek": 188772, "start": 1887.88, "end": 1892.52, "text": " like long overdue. I think for something really impressive to come out of that. You know, he was", "tokens": [50372, 411, 938, 19853, 622, 13, 286, 519, 337, 746, 534, 8992, 281, 808, 484, 295, 300, 13, 509, 458, 11, 415, 390, 50604], "temperature": 0.0, "avg_logprob": -0.15208226521809895, "compression_ratio": 1.7259036144578312, "no_speech_prob": 0.004330591298639774}, {"id": 423, "seek": 188772, "start": 1892.52, "end": 1896.2, "text": " acquired along with a lot of the inflection AI team back in the day that he co-founded after", "tokens": [50604, 17554, 2051, 365, 257, 688, 295, 264, 1536, 5450, 7318, 1469, 646, 294, 264, 786, 300, 415, 598, 12, 49547, 934, 50788], "temperature": 0.0, "avg_logprob": -0.15208226521809895, "compression_ratio": 1.7259036144578312, "no_speech_prob": 0.004330591298639774}, {"id": 424, "seek": 188772, "start": 1896.2, "end": 1902.1200000000001, "text": " leaving Google. But there just hasn't been a lot of meat on the bone from him since. And I think it's", "tokens": [50788, 5012, 3329, 13, 583, 456, 445, 6132, 380, 668, 257, 688, 295, 4615, 322, 264, 9026, 490, 796, 1670, 13, 400, 286, 519, 309, 311, 51084], "temperature": 0.0, "avg_logprob": -0.15208226521809895, "compression_ratio": 1.7259036144578312, "no_speech_prob": 0.004330591298639774}, {"id": 425, "seek": 188772, "start": 1902.1200000000001, "end": 1906.3600000000001, "text": " I almost want to say it's getting awkward at this point. I'm sort of starting to feel,", "tokens": [51084, 286, 1920, 528, 281, 584, 309, 311, 1242, 11411, 412, 341, 935, 13, 286, 478, 1333, 295, 2891, 281, 841, 11, 51296], "temperature": 0.0, "avg_logprob": -0.15208226521809895, "compression_ratio": 1.7259036144578312, "no_speech_prob": 0.004330591298639774}, {"id": 426, "seek": 188772, "start": 1906.3600000000001, "end": 1910.3600000000001, "text": " you know, that what we've talked about Alex Wang over at Meta and how we just we haven't seen", "tokens": [51296, 291, 458, 11, 300, 437, 321, 600, 2825, 466, 5202, 14499, 670, 412, 6377, 64, 293, 577, 321, 445, 321, 2378, 380, 1612, 51496], "temperature": 0.0, "avg_logprob": -0.15208226521809895, "compression_ratio": 1.7259036144578312, "no_speech_prob": 0.004330591298639774}, {"id": 427, "seek": 188772, "start": 1910.3600000000001, "end": 1914.2, "text": " that model come out yet. Now we're hearing about some models are going to be open sourced at a meta,", "tokens": [51496, 300, 2316, 808, 484, 1939, 13, 823, 321, 434, 4763, 466, 512, 5245, 366, 516, 281, 312, 1269, 11006, 1232, 412, 257, 19616, 11, 51688], "temperature": 0.0, "avg_logprob": -0.15208226521809895, "compression_ratio": 1.7259036144578312, "no_speech_prob": 0.004330591298639774}, {"id": 428, "seek": 191420, "start": 1914.2, "end": 1918.6000000000001, "text": " which is never a good sign because it implies you're open sourcing the compensate from the fact that", "tokens": [50364, 597, 307, 1128, 257, 665, 1465, 570, 309, 18779, 291, 434, 1269, 11006, 2175, 264, 29458, 490, 264, 1186, 300, 50584], "temperature": 0.0, "avg_logprob": -0.14677964539087118, "compression_ratio": 1.6401384083044983, "no_speech_prob": 0.0013883651699870825}, {"id": 429, "seek": 191420, "start": 1918.6000000000001, "end": 1923.16, "text": " you're not able to compete at the kind of front to your close source and all that. Well, Alex is", "tokens": [50584, 291, 434, 406, 1075, 281, 11831, 412, 264, 733, 295, 1868, 281, 428, 1998, 4009, 293, 439, 300, 13, 1042, 11, 5202, 307, 50812], "temperature": 0.0, "avg_logprob": -0.14677964539087118, "compression_ratio": 1.6401384083044983, "no_speech_prob": 0.0013883651699870825}, {"id": 430, "seek": 191420, "start": 1923.16, "end": 1927.0, "text": " just kind of started in relative terms with stuff has been running Microsoft for a lot longer.", "tokens": [50812, 445, 733, 295, 1409, 294, 4972, 2115, 365, 1507, 575, 668, 2614, 8116, 337, 257, 688, 2854, 13, 51004], "temperature": 0.0, "avg_logprob": -0.14677964539087118, "compression_ratio": 1.6401384083044983, "no_speech_prob": 0.0013883651699870825}, {"id": 431, "seek": 191420, "start": 1927.88, "end": 1932.44, "text": " So I think we're now at the point where like I don't know, I'm not sure if there's going to be", "tokens": [51048, 407, 286, 519, 321, 434, 586, 412, 264, 935, 689, 411, 286, 500, 380, 458, 11, 286, 478, 406, 988, 498, 456, 311, 516, 281, 312, 51276], "temperature": 0.0, "avg_logprob": -0.14677964539087118, "compression_ratio": 1.6401384083044983, "no_speech_prob": 0.0013883651699870825}, {"id": 432, "seek": 191420, "start": 1932.44, "end": 1936.44, "text": " a change of personnel there, but it wouldn't surprise me if we see that at some point.", "tokens": [51276, 257, 1319, 295, 14988, 456, 11, 457, 309, 2759, 380, 6365, 385, 498, 321, 536, 300, 412, 512, 935, 13, 51476], "temperature": 0.0, "avg_logprob": -0.14677964539087118, "compression_ratio": 1.6401384083044983, "no_speech_prob": 0.0013883651699870825}, {"id": 433, "seek": 193644, "start": 1937.4, "end": 1945.3200000000002, "text": " Right. Just good correction. I said that he started as Velid in late 2025. This particular team,", "tokens": [50412, 1779, 13, 1449, 665, 19984, 13, 286, 848, 300, 415, 1409, 382, 17814, 327, 294, 3469, 39209, 13, 639, 1729, 1469, 11, 50808], "temperature": 0.0, "avg_logprob": -0.32243044236127066, "compression_ratio": 1.5330739299610896, "no_speech_prob": 0.07141298055648804}, {"id": 434, "seek": 193644, "start": 1945.3200000000002, "end": 1951.0800000000002, "text": " the super intelligence team, Lovyn, Microsoft started in November of 25, or at least was announced.", "tokens": [50808, 264, 1687, 7599, 1469, 11, 441, 5179, 2534, 11, 8116, 1409, 294, 7674, 295, 3552, 11, 420, 412, 1935, 390, 7548, 13, 51096], "temperature": 0.0, "avg_logprob": -0.32243044236127066, "compression_ratio": 1.5330739299610896, "no_speech_prob": 0.07141298055648804}, {"id": 435, "seek": 193644, "start": 1951.0800000000002, "end": 1955.64, "text": " So I think there was a strategic shift. Not the around that point where it's like, oh, we haven't", "tokens": [51096, 407, 286, 519, 456, 390, 257, 10924, 5513, 13, 1726, 264, 926, 300, 935, 689, 309, 311, 411, 11, 1954, 11, 321, 2378, 380, 51324], "temperature": 0.0, "avg_logprob": -0.32243044236127066, "compression_ratio": 1.5330739299610896, "no_speech_prob": 0.07141298055648804}, {"id": 436, "seek": 193644, "start": 1955.64, "end": 1960.28, "text": " done much on the model slide. Let's actually do it. We may start seeing more of that sort of thing.", "tokens": [51324, 1096, 709, 322, 264, 2316, 4137, 13, 961, 311, 767, 360, 309, 13, 492, 815, 722, 2577, 544, 295, 300, 1333, 295, 551, 13, 51556], "temperature": 0.0, "avg_logprob": -0.32243044236127066, "compression_ratio": 1.5330739299610896, "no_speech_prob": 0.07141298055648804}, {"id": 437, "seek": 196028, "start": 1960.28, "end": 1965.96, "text": " We are saying you'll start seeing more models come out on our fondry and so on. So it", "tokens": [50364, 492, 366, 1566, 291, 603, 722, 2577, 544, 5245, 808, 484, 322, 527, 9557, 627, 293, 370, 322, 13, 407, 309, 50648], "temperature": 0.0, "avg_logprob": -0.21716324942452567, "compression_ratio": 1.6666666666666667, "no_speech_prob": 0.04954354092478752}, {"id": 438, "seek": 196028, "start": 1965.96, "end": 1971.3999999999999, "text": " either could be indication that the team has spun up. And it's now going to start spinning off", "tokens": [50648, 2139, 727, 312, 18877, 300, 264, 1469, 575, 37038, 493, 13, 400, 309, 311, 586, 516, 281, 722, 15640, 766, 50920], "temperature": 0.0, "avg_logprob": -0.21716324942452567, "compression_ratio": 1.6666666666666667, "no_speech_prob": 0.04954354092478752}, {"id": 439, "seek": 196028, "start": 1971.3999999999999, "end": 1977.08, "text": " more or as you said, it could be negative of trouble whether or not quite moving fast enough.", "tokens": [50920, 544, 420, 382, 291, 848, 11, 309, 727, 312, 3671, 295, 5253, 1968, 420, 406, 1596, 2684, 2370, 1547, 13, 51204], "temperature": 0.0, "avg_logprob": -0.21716324942452567, "compression_ratio": 1.6666666666666667, "no_speech_prob": 0.04954354092478752}, {"id": 440, "seek": 196028, "start": 1977.08, "end": 1981.08, "text": " It's a bit of a reframe too, right? Like we know Microsoft has been desperately trying to be", "tokens": [51204, 467, 311, 257, 857, 295, 257, 13334, 529, 886, 11, 558, 30, 1743, 321, 458, 8116, 575, 668, 23726, 1382, 281, 312, 51404], "temperature": 0.0, "avg_logprob": -0.21716324942452567, "compression_ratio": 1.6666666666666667, "no_speech_prob": 0.04954354092478752}, {"id": 441, "seek": 196028, "start": 1981.08, "end": 1984.68, "text": " relevant on frontier models. This whole time, it's not like this is the first time Mustafa", "tokens": [51404, 7340, 322, 35853, 5245, 13, 639, 1379, 565, 11, 309, 311, 406, 411, 341, 307, 264, 700, 565, 37229, 51584], "temperature": 0.0, "avg_logprob": -0.21716324942452567, "compression_ratio": 1.6666666666666667, "no_speech_prob": 0.04954354092478752}, {"id": 442, "seek": 196028, "start": 1984.68, "end": 1989.16, "text": " Salaman is going like, let's go and do it. Like let's actually be relevant up there with Open AI", "tokens": [51584, 5996, 6147, 307, 516, 411, 11, 718, 311, 352, 293, 360, 309, 13, 1743, 718, 311, 767, 312, 7340, 493, 456, 365, 7238, 7318, 51808], "temperature": 0.0, "avg_logprob": -0.21716324942452567, "compression_ratio": 1.6666666666666667, "no_speech_prob": 0.04954354092478752}, {"id": 443, "seek": 198916, "start": 1989.24, "end": 1993.8000000000002, "text": " and whatnot. They've had the five series of models. They've been trying to make stuff happen.", "tokens": [50368, 293, 25882, 13, 814, 600, 632, 264, 1732, 2638, 295, 5245, 13, 814, 600, 668, 1382, 281, 652, 1507, 1051, 13, 50596], "temperature": 0.0, "avg_logprob": -0.2047837730345687, "compression_ratio": 1.6254295532646048, "no_speech_prob": 0.060032639652490616}, {"id": 444, "seek": 198916, "start": 1993.8000000000002, "end": 1999.8000000000002, "text": " You know, call it a rebranding of the effort of refocusing. Yeah, I don't know. I'm curious to", "tokens": [50596, 509, 458, 11, 818, 309, 257, 12970, 3699, 278, 295, 264, 4630, 295, 1895, 905, 7981, 13, 865, 11, 286, 500, 380, 458, 13, 286, 478, 6369, 281, 50896], "temperature": 0.0, "avg_logprob": -0.2047837730345687, "compression_ratio": 1.6254295532646048, "no_speech_prob": 0.060032639652490616}, {"id": 445, "seek": 198916, "start": 1999.8000000000002, "end": 2005.16, "text": " see or hear behind the scenes because they did have a pretty tight relationship with Open AI", "tokens": [50896, 536, 420, 1568, 2261, 264, 8026, 570, 436, 630, 362, 257, 1238, 4524, 2480, 365, 7238, 7318, 51164], "temperature": 0.0, "avg_logprob": -0.2047837730345687, "compression_ratio": 1.6254295532646048, "no_speech_prob": 0.060032639652490616}, {"id": 446, "seek": 198916, "start": 2005.16, "end": 2012.3600000000001, "text": " until 2025-ish. So yeah, I don't know. Next thing, I guess on the five series, right? Like the", "tokens": [51164, 1826, 39209, 12, 742, 13, 407, 1338, 11, 286, 500, 380, 458, 13, 3087, 551, 11, 286, 2041, 322, 264, 1732, 2638, 11, 558, 30, 1743, 264, 51524], "temperature": 0.0, "avg_logprob": -0.2047837730345687, "compression_ratio": 1.6254295532646048, "no_speech_prob": 0.060032639652490616}, {"id": 447, "seek": 198916, "start": 2012.3600000000001, "end": 2018.2, "text": " stated intent there was to have an independent like solid foundation model stack. And for those,", "tokens": [51524, 11323, 8446, 456, 390, 281, 362, 364, 6695, 411, 5100, 7030, 2316, 8630, 13, 400, 337, 729, 11, 51816], "temperature": 0.0, "avg_logprob": -0.2047837730345687, "compression_ratio": 1.6254295532646048, "no_speech_prob": 0.060032639652490616}, {"id": 448, "seek": 201820, "start": 2018.2, "end": 2023.0, "text": " yeah, for those who haven't been around recovered, it was a whole series of models, which were", "tokens": [50364, 1338, 11, 337, 729, 567, 2378, 380, 668, 926, 19542, 11, 309, 390, 257, 1379, 2638, 295, 5245, 11, 597, 645, 50604], "temperature": 0.0, "avg_logprob": -0.1910535796614718, "compression_ratio": 1.6785714285714286, "no_speech_prob": 0.002672036876901984}, {"id": 449, "seek": 201820, "start": 2023.0, "end": 2029.0800000000002, "text": " pretty solid, small models. So they released these like one billion, seven billion parameter models,", "tokens": [50604, 1238, 5100, 11, 1359, 5245, 13, 407, 436, 4736, 613, 411, 472, 5218, 11, 3407, 5218, 13075, 5245, 11, 50908], "temperature": 0.0, "avg_logprob": -0.1910535796614718, "compression_ratio": 1.6785714285714286, "no_speech_prob": 0.002672036876901984}, {"id": 450, "seek": 201820, "start": 2029.0800000000002, "end": 2036.44, "text": " had a whole series of them. And yeah, we're working on models, but not big models. And", "tokens": [50908, 632, 257, 1379, 2638, 295, 552, 13, 400, 1338, 11, 321, 434, 1364, 322, 5245, 11, 457, 406, 955, 5245, 13, 400, 51276], "temperature": 0.0, "avg_logprob": -0.1910535796614718, "compression_ratio": 1.6785714285714286, "no_speech_prob": 0.002672036876901984}, {"id": 451, "seek": 201820, "start": 2037.24, "end": 2041.56, "text": " it could be the case that they were not trying to compete because it's so capital intensive to", "tokens": [51316, 309, 727, 312, 264, 1389, 300, 436, 645, 406, 1382, 281, 11831, 570, 309, 311, 370, 4238, 18957, 281, 51532], "temperature": 0.0, "avg_logprob": -0.1910535796614718, "compression_ratio": 1.6785714285714286, "no_speech_prob": 0.002672036876901984}, {"id": 452, "seek": 201820, "start": 2041.56, "end": 2047.72, "text": " build a sonnet or a GPT 5.4. And now they are. But it's another, but poncho reading service.", "tokens": [51532, 1322, 257, 1872, 7129, 420, 257, 26039, 51, 1025, 13, 19, 13, 400, 586, 436, 366, 13, 583, 309, 311, 1071, 11, 457, 9224, 5738, 3760, 2643, 13, 51840], "temperature": 0.0, "avg_logprob": -0.1910535796614718, "compression_ratio": 1.6785714285714286, "no_speech_prob": 0.002672036876901984}, {"id": 453, "seek": 204772, "start": 2047.72, "end": 2050.92, "text": " Absolutely. Yeah, they could, you're right. They could be thinking about their distribution and", "tokens": [50364, 7021, 13, 865, 11, 436, 727, 11, 291, 434, 558, 13, 814, 727, 312, 1953, 466, 641, 7316, 293, 50524], "temperature": 0.0, "avg_logprob": -0.29539410821322737, "compression_ratio": 1.7351778656126482, "no_speech_prob": 0.005214736331254244}, {"id": 454, "seek": 204772, "start": 2050.92, "end": 2055.7200000000003, "text": " go, what's a small cheap way to get this out to all of our, you know, billions of users.", "tokens": [50524, 352, 11, 437, 311, 257, 1359, 7084, 636, 281, 483, 341, 484, 281, 439, 295, 527, 11, 291, 458, 11, 17375, 295, 5022, 13, 50764], "temperature": 0.0, "avg_logprob": -0.29539410821322737, "compression_ratio": 1.7351778656126482, "no_speech_prob": 0.005214736331254244}, {"id": 455, "seek": 204772, "start": 2055.7200000000003, "end": 2059.8, "text": " Absolutely. Apple bring the same thing, you know, training a little models.", "tokens": [50764, 7021, 13, 6373, 1565, 264, 912, 551, 11, 291, 458, 11, 3097, 257, 707, 5245, 13, 50968], "temperature": 0.0, "avg_logprob": -0.29539410821322737, "compression_ratio": 1.7351778656126482, "no_speech_prob": 0.005214736331254244}, {"id": 456, "seek": 204772, "start": 2059.8, "end": 2065.0, "text": " Yeah. I've got to get through you know. Yeah. At some point, your research team only gets so", "tokens": [50968, 865, 13, 286, 600, 658, 281, 483, 807, 291, 458, 13, 865, 13, 1711, 512, 935, 11, 428, 2132, 1469, 787, 2170, 370, 51228], "temperature": 0.0, "avg_logprob": -0.29539410821322737, "compression_ratio": 1.7351778656126482, "no_speech_prob": 0.005214736331254244}, {"id": 457, "seek": 204772, "start": 2065.0, "end": 2070.84, "text": " much compute to play with, you know, that's right. Yeah. And one last tool app story,", "tokens": [51228, 709, 14722, 281, 862, 365, 11, 291, 458, 11, 300, 311, 558, 13, 865, 13, 400, 472, 1036, 2290, 724, 1657, 11, 51520], "temperature": 0.0, "avg_logprob": -0.29539410821322737, "compression_ratio": 1.7351778656126482, "no_speech_prob": 0.005214736331254244}, {"id": 458, "seek": 207084, "start": 2070.84, "end": 2077.8, "text": " Suno is leading into customization. We've V 5.5. We don't have that many stories about", "tokens": [50364, 318, 12638, 307, 5775, 666, 39387, 13, 492, 600, 691, 1025, 13, 20, 13, 492, 500, 380, 362, 300, 867, 3676, 466, 50712], "temperature": 0.0, "avg_logprob": -0.21345394913868238, "compression_ratio": 1.5485232067510548, "no_speech_prob": 0.014941398054361343}, {"id": 459, "seek": 207084, "start": 2077.8, "end": 2083.6400000000003, "text": " music generation these days, which is kind of surprising or interesting. Still, there's only one", "tokens": [50712, 1318, 5125, 613, 1708, 11, 597, 307, 733, 295, 8830, 420, 1880, 13, 8291, 11, 456, 311, 787, 472, 51004], "temperature": 0.0, "avg_logprob": -0.21345394913868238, "compression_ratio": 1.5485232067510548, "no_speech_prob": 0.014941398054361343}, {"id": 460, "seek": 207084, "start": 2083.6400000000003, "end": 2090.1200000000003, "text": " real leader in the space, which is Suno, the competitor, UDO, it has been a little quieter. And", "tokens": [51004, 957, 5263, 294, 264, 1901, 11, 597, 307, 318, 12638, 11, 264, 27266, 11, 624, 26649, 11, 309, 575, 668, 257, 707, 43339, 13, 400, 51328], "temperature": 0.0, "avg_logprob": -0.21345394913868238, "compression_ratio": 1.5485232067510548, "no_speech_prob": 0.014941398054361343}, {"id": 461, "seek": 207084, "start": 2091.0, "end": 2097.48, "text": " here, what they're highlighting is an ability to customize with free and user features.", "tokens": [51372, 510, 11, 437, 436, 434, 26551, 307, 364, 3485, 281, 19734, 365, 1737, 293, 4195, 4122, 13, 51696], "temperature": 0.0, "avg_logprob": -0.21345394913868238, "compression_ratio": 1.5485232067510548, "no_speech_prob": 0.014941398054361343}, {"id": 462, "seek": 209748, "start": 2097.48, "end": 2105.32, "text": " Voices might taste and custom models. So the kind of pitches, you can make it a much more", "tokens": [50364, 7518, 1473, 1062, 3939, 293, 2375, 5245, 13, 407, 264, 733, 295, 43110, 11, 291, 393, 652, 309, 257, 709, 544, 50756], "temperature": 0.0, "avg_logprob": -0.1323762469821506, "compression_ratio": 1.5726141078838174, "no_speech_prob": 0.26131966710090637}, {"id": 463, "seek": 209748, "start": 2105.32, "end": 2110.52, "text": " personalized output. You can actually make it have your voice as opposed to just prompting it to have", "tokens": [50756, 28415, 5598, 13, 509, 393, 767, 652, 309, 362, 428, 3177, 382, 8851, 281, 445, 12391, 278, 309, 281, 362, 51016], "temperature": 0.0, "avg_logprob": -0.1323762469821506, "compression_ratio": 1.5726141078838174, "no_speech_prob": 0.26131966710090637}, {"id": 464, "seek": 209748, "start": 2111.32, "end": 2115.8, "text": " the voice of some famous singer, which you're not supposed to do, but you could probably still do", "tokens": [51056, 264, 3177, 295, 512, 4618, 11564, 11, 597, 291, 434, 406, 3442, 281, 360, 11, 457, 291, 727, 1391, 920, 360, 51280], "temperature": 0.0, "avg_logprob": -0.1323762469821506, "compression_ratio": 1.5726141078838174, "no_speech_prob": 0.26131966710090637}, {"id": 465, "seek": 209748, "start": 2115.8, "end": 2122.28, "text": " via like clever wording. And similarly, my taste is going to learn your preferred genres,", "tokens": [51280, 5766, 411, 13494, 47602, 13, 400, 14138, 11, 452, 3939, 307, 516, 281, 1466, 428, 16494, 30057, 11, 51604], "temperature": 0.0, "avg_logprob": -0.1323762469821506, "compression_ratio": 1.5726141078838174, "no_speech_prob": 0.26131966710090637}, {"id": 466, "seek": 212228, "start": 2122.28, "end": 2129.88, "text": " moods and artists. And custom models allow you to train it on your own music catalog with a", "tokens": [50364, 9268, 82, 293, 6910, 13, 400, 2375, 5245, 2089, 291, 281, 3847, 309, 322, 428, 1065, 1318, 19746, 365, 257, 50744], "temperature": 0.0, "avg_logprob": -0.14374984040552255, "compression_ratio": 1.668122270742358, "no_speech_prob": 0.0036481383722275496}, {"id": 467, "seek": 212228, "start": 2129.88, "end": 2137.48, "text": " minimum of six tracks. So very interesting move to me from Suno as kind of a bet on if music", "tokens": [50744, 7285, 295, 2309, 10218, 13, 407, 588, 1880, 1286, 281, 385, 490, 318, 12638, 382, 733, 295, 257, 778, 322, 498, 1318, 51124], "temperature": 0.0, "avg_logprob": -0.14374984040552255, "compression_ratio": 1.668122270742358, "no_speech_prob": 0.0036481383722275496}, {"id": 468, "seek": 212228, "start": 2137.48, "end": 2144.0400000000004, "text": " generation becomes a thing, one way to frame it in a like nice way is, you know, these are music", "tokens": [51124, 5125, 3643, 257, 551, 11, 472, 636, 281, 3920, 309, 294, 257, 411, 1481, 636, 307, 11, 291, 458, 11, 613, 366, 1318, 51452], "temperature": 0.0, "avg_logprob": -0.14374984040552255, "compression_ratio": 1.668122270742358, "no_speech_prob": 0.0036481383722275496}, {"id": 469, "seek": 212228, "start": 2144.0400000000004, "end": 2150.84, "text": " things cater to your taste or if you're an artist, tater to your voice and the kind of musical style", "tokens": [51452, 721, 21557, 281, 428, 3939, 420, 498, 291, 434, 364, 5748, 11, 256, 771, 281, 428, 3177, 293, 264, 733, 295, 9165, 3758, 51792], "temperature": 0.0, "avg_logprob": -0.14374984040552255, "compression_ratio": 1.668122270742358, "no_speech_prob": 0.0036481383722275496}, {"id": 470, "seek": 215084, "start": 2150.84, "end": 2157.1600000000003, "text": " as opposed to just like with the spinning out slop and replacing real artists onto applications", "tokens": [50364, 382, 8851, 281, 445, 411, 365, 264, 15640, 484, 21254, 293, 19139, 957, 6910, 3911, 5821, 50680], "temperature": 0.0, "avg_logprob": -0.2878595239975873, "compression_ratio": 1.5975103734439835, "no_speech_prob": 0.005217777565121651}, {"id": 471, "seek": 215084, "start": 2157.1600000000003, "end": 2162.36, "text": " and business touching on and fronpa again related to that compete question we were just saying,", "tokens": [50680, 293, 1606, 11175, 322, 293, 431, 266, 4306, 797, 4077, 281, 300, 11831, 1168, 321, 645, 445, 1566, 11, 50940], "temperature": 0.0, "avg_logprob": -0.2878595239975873, "compression_ratio": 1.5975103734439835, "no_speech_prob": 0.005217777565121651}, {"id": 472, "seek": 215084, "start": 2163.08, "end": 2171.1600000000003, "text": " they announced first that they have a huge amount of revenue. So their revenue run rate has now", "tokens": [50976, 436, 7548, 700, 300, 436, 362, 257, 2603, 2372, 295, 9324, 13, 407, 641, 9324, 1190, 3314, 575, 586, 51380], "temperature": 0.0, "avg_logprob": -0.2878595239975873, "compression_ratio": 1.5975103734439835, "no_speech_prob": 0.005217777565121651}, {"id": 473, "seek": 215084, "start": 2171.1600000000003, "end": 2178.6800000000003, "text": " surpassed 30 billion dollars jumping from about 90 billion at the end of 2025. So they've tripled", "tokens": [51380, 27650, 292, 2217, 5218, 3808, 11233, 490, 466, 4289, 5218, 412, 264, 917, 295, 39209, 13, 407, 436, 600, 1376, 15551, 51756], "temperature": 0.0, "avg_logprob": -0.2878595239975873, "compression_ratio": 1.5975103734439835, "no_speech_prob": 0.005217777565121651}, {"id": 474, "seek": 217868, "start": 2178.68, "end": 2185.24, "text": " more ventrupled revenue in something like three months. That's insane. Yeah, if you look at the", "tokens": [50364, 544, 6931, 894, 15551, 9324, 294, 746, 411, 1045, 2493, 13, 663, 311, 10838, 13, 865, 11, 498, 291, 574, 412, 264, 50692], "temperature": 0.0, "avg_logprob": -0.3469276034954897, "compression_ratio": 1.556, "no_speech_prob": 0.003171806689351797}, {"id": 475, "seek": 217868, "start": 2185.24, "end": 2192.04, "text": " graph, it is insane. It looks like, you know, there is a marked shift in the slow for an profit", "tokens": [50692, 4295, 11, 309, 307, 10838, 13, 467, 1542, 411, 11, 291, 458, 11, 456, 307, 257, 12658, 5513, 294, 264, 2964, 337, 364, 7475, 51032], "temperature": 0.0, "avg_logprob": -0.3469276034954897, "compression_ratio": 1.556, "no_speech_prob": 0.003171806689351797}, {"id": 476, "seek": 217868, "start": 2192.04, "end": 2198.9199999999996, "text": " around van of 2025 when kind of hype for God goat starting kicking off. Clearly adoption has been", "tokens": [51032, 926, 3161, 295, 39209, 562, 733, 295, 24144, 337, 1265, 23608, 2891, 19137, 766, 13, 24120, 19215, 575, 668, 51376], "temperature": 0.0, "avg_logprob": -0.3469276034954897, "compression_ratio": 1.556, "no_speech_prob": 0.003171806689351797}, {"id": 477, "seek": 217868, "start": 2198.9199999999996, "end": 2205.3999999999996, "text": " accelerating and going to pay rapid pace, which is as we've said, probably why an profit has had to", "tokens": [51376, 34391, 293, 516, 281, 1689, 7558, 11638, 11, 597, 307, 382, 321, 600, 848, 11, 1391, 983, 364, 7475, 575, 632, 281, 51700], "temperature": 0.0, "avg_logprob": -0.3469276034954897, "compression_ratio": 1.556, "no_speech_prob": 0.003171806689351797}, {"id": 478, "seek": 220540, "start": 2206.04, "end": 2212.28, "text": " tighten up. So along with this announcement, they also have a new compute agreement with Google", "tokens": [50396, 17041, 493, 13, 407, 2051, 365, 341, 12847, 11, 436, 611, 362, 257, 777, 14722, 8106, 365, 3329, 50708], "temperature": 0.0, "avg_logprob": -0.13097178046383076, "compression_ratio": 1.5210526315789474, "no_speech_prob": 0.006385070271790028}, {"id": 479, "seek": 220540, "start": 2212.28, "end": 2220.36, "text": " and Broadcom, which will expand its access to Google TPU servers. This is an expansion of an", "tokens": [50708, 293, 14074, 1112, 11, 597, 486, 5268, 1080, 2105, 281, 3329, 314, 8115, 15909, 13, 639, 307, 364, 11260, 295, 364, 51112], "temperature": 0.0, "avg_logprob": -0.13097178046383076, "compression_ratio": 1.5210526315789474, "no_speech_prob": 0.006385070271790028}, {"id": 480, "seek": 220540, "start": 2220.36, "end": 2227.08, "text": " arrangement they had in October of 2025. So this will give them another gigawatt of compute capacity", "tokens": [51112, 17620, 436, 632, 294, 7617, 295, 39209, 13, 407, 341, 486, 976, 552, 1071, 8741, 1607, 1591, 295, 14722, 6042, 51448], "temperature": 0.0, "avg_logprob": -0.13097178046383076, "compression_ratio": 1.5210526315789474, "no_speech_prob": 0.006385070271790028}, {"id": 481, "seek": 222708, "start": 2227.08, "end": 2235.64, "text": " in 2026. So actually, that was a gigawatt originally now, this is giving them an additional 3.5", "tokens": [50364, 294, 945, 10880, 13, 407, 767, 11, 300, 390, 257, 8741, 1607, 1591, 7993, 586, 11, 341, 307, 2902, 552, 364, 4497, 805, 13, 20, 50792], "temperature": 0.0, "avg_logprob": -0.1950079110952524, "compression_ratio": 1.5495867768595042, "no_speech_prob": 0.003943168558180332}, {"id": 482, "seek": 222708, "start": 2235.64, "end": 2243.96, "text": " gigawatts of TPU based compute starting in 2027. So yeah, clearly on tropic making moves here.", "tokens": [50792, 8741, 38036, 1373, 295, 314, 8115, 2361, 14722, 2891, 294, 945, 10076, 13, 407, 1338, 11, 4448, 322, 9006, 299, 1455, 6067, 510, 13, 51208], "temperature": 0.0, "avg_logprob": -0.1950079110952524, "compression_ratio": 1.5495867768595042, "no_speech_prob": 0.003943168558180332}, {"id": 483, "seek": 222708, "start": 2244.68, "end": 2250.36, "text": " Yeah, and you know, you're so the increase in in an tropics run rate is insane by any measure.", "tokens": [51244, 865, 11, 293, 291, 458, 11, 291, 434, 370, 264, 3488, 294, 294, 364, 9006, 1167, 1190, 3314, 307, 10838, 538, 604, 3481, 13, 51528], "temperature": 0.0, "avg_logprob": -0.1950079110952524, "compression_ratio": 1.5495867768595042, "no_speech_prob": 0.003943168558180332}, {"id": 484, "seek": 222708, "start": 2250.36, "end": 2256.6, "text": " I'm not aware of any company in in human history that has grown that fast. Now, you might", "tokens": [51528, 286, 478, 406, 3650, 295, 604, 2237, 294, 294, 1952, 2503, 300, 575, 7709, 300, 2370, 13, 823, 11, 291, 1062, 51840], "temperature": 0.0, "avg_logprob": -0.1950079110952524, "compression_ratio": 1.5495867768595042, "no_speech_prob": 0.003943168558180332}, {"id": 485, "seek": 225660, "start": 2256.6, "end": 2261.3199999999997, "text": " say did they have a lucky quarter or is this a fluke? So when you dig into the numbers, there's", "tokens": [50364, 584, 630, 436, 362, 257, 6356, 6555, 420, 307, 341, 257, 5029, 330, 30, 407, 562, 291, 2528, 666, 264, 3547, 11, 456, 311, 50600], "temperature": 0.0, "avg_logprob": -0.14829088079518285, "compression_ratio": 1.67595818815331, "no_speech_prob": 0.009556176140904427}, {"id": 486, "seek": 225660, "start": 2261.3199999999997, "end": 2266.68, "text": " more than 1000 business customers that are now spending over a million dollars per year. That's more", "tokens": [50600, 544, 813, 9714, 1606, 4581, 300, 366, 586, 6434, 670, 257, 2459, 3808, 680, 1064, 13, 663, 311, 544, 50868], "temperature": 0.0, "avg_logprob": -0.14829088079518285, "compression_ratio": 1.67595818815331, "no_speech_prob": 0.009556176140904427}, {"id": 487, "seek": 225660, "start": 2266.68, "end": 2273.0, "text": " than doubled since February. So you're talking about doubling your $1 million plus per year", "tokens": [50868, 813, 24405, 1670, 8711, 13, 407, 291, 434, 1417, 466, 33651, 428, 1848, 16, 2459, 1804, 680, 1064, 51184], "temperature": 0.0, "avg_logprob": -0.14829088079518285, "compression_ratio": 1.67595818815331, "no_speech_prob": 0.009556176140904427}, {"id": 488, "seek": 225660, "start": 2273.0, "end": 2279.08, "text": " customer count in two months. That is not just a fluke thing. It's like actual stickiness here", "tokens": [51184, 5474, 1207, 294, 732, 2493, 13, 663, 307, 406, 445, 257, 5029, 330, 551, 13, 467, 311, 411, 3539, 2897, 1324, 510, 51488], "temperature": 0.0, "avg_logprob": -0.14829088079518285, "compression_ratio": 1.67595818815331, "no_speech_prob": 0.009556176140904427}, {"id": 489, "seek": 225660, "start": 2279.08, "end": 2283.96, "text": " with companies that that have real stakes stakes in this. So this is pretty wild. There's a whole", "tokens": [51488, 365, 3431, 300, 300, 362, 957, 28429, 28429, 294, 341, 13, 407, 341, 307, 1238, 4868, 13, 821, 311, 257, 1379, 51732], "temperature": 0.0, "avg_logprob": -0.14829088079518285, "compression_ratio": 1.67595818815331, "no_speech_prob": 0.009556176140904427}, {"id": 490, "seek": 228396, "start": 2283.96, "end": 2288.84, "text": " bunch of stuff to dig into here. I mean, so Broadcom's got an SEC filing that does say that the", "tokens": [50364, 3840, 295, 1507, 281, 2528, 666, 510, 13, 286, 914, 11, 370, 14074, 1112, 311, 658, 364, 22399, 26854, 300, 775, 584, 300, 264, 50608], "temperature": 0.0, "avg_logprob": -0.15941273988182866, "compression_ratio": 1.7886435331230284, "no_speech_prob": 0.002472399268299341}, {"id": 491, "seek": 228396, "start": 2288.84, "end": 2293.96, "text": " consumption of this expanded AI cloud compute capacity by a tropic is dependent on", "tokens": [50608, 12126, 295, 341, 14342, 7318, 4588, 14722, 6042, 538, 257, 9006, 299, 307, 12334, 322, 50864], "temperature": 0.0, "avg_logprob": -0.15941273988182866, "compression_ratio": 1.7886435331230284, "no_speech_prob": 0.002472399268299341}, {"id": 492, "seek": 228396, "start": 2293.96, "end": 2299.4, "text": " and tropics continued commercial success. So there's presumably conditions baked into that agreement", "tokens": [50864, 293, 9006, 1167, 7014, 6841, 2245, 13, 407, 456, 311, 26742, 4487, 19453, 666, 300, 8106, 51136], "temperature": 0.0, "avg_logprob": -0.15941273988182866, "compression_ratio": 1.7886435331230284, "no_speech_prob": 0.002472399268299341}, {"id": 493, "seek": 228396, "start": 2299.4, "end": 2303.8, "text": " that you know, and the topic has to continue to do this so that Broadcom continues to supply the", "tokens": [51136, 300, 291, 458, 11, 293, 264, 4829, 575, 281, 2354, 281, 360, 341, 370, 300, 14074, 1112, 6515, 281, 5847, 264, 51356], "temperature": 0.0, "avg_logprob": -0.15941273988182866, "compression_ratio": 1.7886435331230284, "no_speech_prob": 0.002472399268299341}, {"id": 494, "seek": 228396, "start": 2303.8, "end": 2307.8, "text": " chips. And that's, you know, what you would expect. I mean, there's so much volatility, so much", "tokens": [51356, 11583, 13, 400, 300, 311, 11, 291, 458, 11, 437, 291, 576, 2066, 13, 286, 914, 11, 456, 311, 370, 709, 25877, 11, 370, 709, 51556], "temperature": 0.0, "avg_logprob": -0.15941273988182866, "compression_ratio": 1.7886435331230284, "no_speech_prob": 0.002472399268299341}, {"id": 495, "seek": 228396, "start": 2307.8, "end": 2312.76, "text": " uncertainty here. But the other piece here is there is this broader thing to keep in mind like", "tokens": [51556, 15697, 510, 13, 583, 264, 661, 2522, 510, 307, 456, 307, 341, 13227, 551, 281, 1066, 294, 1575, 411, 51804], "temperature": 0.0, "avg_logprob": -0.15941273988182866, "compression_ratio": 1.7886435331230284, "no_speech_prob": 0.002472399268299341}, {"id": 496, "seek": 231276, "start": 2312.76, "end": 2317.32, "text": " Google and Broadcom are are locked together in a pretty deep supply chain partnership that goes", "tokens": [50364, 3329, 293, 14074, 1112, 366, 366, 9376, 1214, 294, 257, 1238, 2452, 5847, 5021, 9982, 300, 1709, 50592], "temperature": 0.0, "avg_logprob": -0.10129984638147187, "compression_ratio": 1.6735395189003437, "no_speech_prob": 0.0008425943087786436}, {"id": 497, "seek": 231276, "start": 2317.32, "end": 2324.76, "text": " out to 2030 or 2031. Basically, it means that Google is committing to using Broadcom for all its", "tokens": [50592, 484, 281, 28638, 420, 945, 12967, 13, 8537, 11, 309, 1355, 300, 3329, 307, 26659, 281, 1228, 14074, 1112, 337, 439, 1080, 50964], "temperature": 0.0, "avg_logprob": -0.10129984638147187, "compression_ratio": 1.6735395189003437, "no_speech_prob": 0.0008425943087786436}, {"id": 498, "seek": 231276, "start": 2324.76, "end": 2331.0800000000004, "text": " TPU related work. So famously, Broadcom was the the partner that Google chose to design the TPU", "tokens": [50964, 314, 8115, 4077, 589, 13, 407, 34360, 11, 14074, 1112, 390, 264, 264, 4975, 300, 3329, 5111, 281, 1715, 264, 314, 8115, 51280], "temperature": 0.0, "avg_logprob": -0.10129984638147187, "compression_ratio": 1.6735395189003437, "no_speech_prob": 0.0008425943087786436}, {"id": 499, "seek": 231276, "start": 2331.0800000000004, "end": 2336.0400000000004, "text": " in the first place. And they're sticking with Broadcom. And this is an incredible level of stickiness", "tokens": [51280, 294, 264, 700, 1081, 13, 400, 436, 434, 13465, 365, 14074, 1112, 13, 400, 341, 307, 364, 4651, 1496, 295, 2897, 1324, 51528], "temperature": 0.0, "avg_logprob": -0.10129984638147187, "compression_ratio": 1.6735395189003437, "no_speech_prob": 0.0008425943087786436}, {"id": 500, "seek": 231276, "start": 2336.0400000000004, "end": 2340.36, "text": " for something that you might have expected naively would end up getting taken at house. Broadcom", "tokens": [51528, 337, 746, 300, 291, 1062, 362, 5176, 1667, 3413, 576, 917, 493, 1242, 2726, 412, 1782, 13, 14074, 1112, 51744], "temperature": 0.0, "avg_logprob": -0.10129984638147187, "compression_ratio": 1.6735395189003437, "no_speech_prob": 0.0008425943087786436}, {"id": 501, "seek": 234036, "start": 2340.36, "end": 2346.6, "text": " strengths are on helping with design and also on navigating supply chains for chip manufacturers.", "tokens": [50364, 16986, 366, 322, 4315, 365, 1715, 293, 611, 322, 32054, 5847, 12626, 337, 11409, 18455, 13, 50676], "temperature": 0.0, "avg_logprob": -0.15126350469756544, "compression_ratio": 1.6931407942238268, "no_speech_prob": 0.014500994235277176}, {"id": 502, "seek": 234036, "start": 2346.6, "end": 2351.1600000000003, "text": " So they really kind of take the design off of Google's desk, makes them optimizations,", "tokens": [50676, 407, 436, 534, 733, 295, 747, 264, 1715, 766, 295, 3329, 311, 10026, 11, 1669, 552, 5028, 14455, 11, 50904], "temperature": 0.0, "avg_logprob": -0.15126350469756544, "compression_ratio": 1.6931407942238268, "no_speech_prob": 0.014500994235277176}, {"id": 503, "seek": 234036, "start": 2351.1600000000003, "end": 2354.36, "text": " and then basically take it from there and say, Hey, we'll handle the supply chains. You know,", "tokens": [50904, 293, 550, 1936, 747, 309, 490, 456, 293, 584, 11, 1911, 11, 321, 603, 4813, 264, 5847, 12626, 13, 509, 458, 11, 51064], "temperature": 0.0, "avg_logprob": -0.15126350469756544, "compression_ratio": 1.6931407942238268, "no_speech_prob": 0.014500994235277176}, {"id": 504, "seek": 234036, "start": 2354.36, "end": 2358.76, "text": " we'll we'll do the actual kind of manufacturing side as well. So there's a lot going on there.", "tokens": [51064, 321, 603, 321, 603, 360, 264, 3539, 733, 295, 11096, 1252, 382, 731, 13, 407, 456, 311, 257, 688, 516, 322, 456, 13, 51284], "temperature": 0.0, "avg_logprob": -0.15126350469756544, "compression_ratio": 1.6931407942238268, "no_speech_prob": 0.014500994235277176}, {"id": 505, "seek": 234036, "start": 2358.76, "end": 2363.2400000000002, "text": " Obviously Broadcom's talked popped on this news. No, no surprise there. Last thing to note too,", "tokens": [51284, 7580, 14074, 1112, 311, 2825, 21545, 322, 341, 2583, 13, 883, 11, 572, 6365, 456, 13, 5264, 551, 281, 3637, 886, 11, 51508], "temperature": 0.0, "avg_logprob": -0.15126350469756544, "compression_ratio": 1.6931407942238268, "no_speech_prob": 0.014500994235277176}, {"id": 506, "seek": 236324, "start": 2363.9599999999996, "end": 2370.7599999999998, "text": " you know, Google and Thropic, this is Anthropic basically proving out at scale that Google's stack,", "tokens": [50400, 291, 458, 11, 3329, 293, 334, 39173, 11, 341, 307, 12727, 39173, 1936, 27221, 484, 412, 4373, 300, 3329, 311, 8630, 11, 50740], "temperature": 0.0, "avg_logprob": -0.2018581292568109, "compression_ratio": 1.6379310344827587, "no_speech_prob": 0.006691013462841511}, {"id": 507, "seek": 236324, "start": 2370.7599999999998, "end": 2375.7999999999997, "text": " their TPU stack can compete with Nvidia at scale, right? That's a really, really big deal.", "tokens": [50740, 641, 314, 8115, 8630, 393, 11831, 365, 46284, 412, 4373, 11, 558, 30, 663, 311, 257, 534, 11, 534, 955, 2028, 13, 50992], "temperature": 0.0, "avg_logprob": -0.2018581292568109, "compression_ratio": 1.6379310344827587, "no_speech_prob": 0.006691013462841511}, {"id": 508, "seek": 236324, "start": 2375.7999999999997, "end": 2380.7599999999998, "text": " This is Google saying, Hey, you see that big juicy market chair in video being the world's", "tokens": [50992, 639, 307, 3329, 1566, 11, 1911, 11, 291, 536, 300, 955, 24696, 2142, 6090, 294, 960, 885, 264, 1002, 311, 51240], "temperature": 0.0, "avg_logprob": -0.2018581292568109, "compression_ratio": 1.6379310344827587, "no_speech_prob": 0.006691013462841511}, {"id": 509, "seek": 236324, "start": 2380.7599999999998, "end": 2386.2799999999997, "text": " most valuable company. Well, we can play that game too. And really the question is, you've got all", "tokens": [51240, 881, 8263, 2237, 13, 1042, 11, 321, 393, 862, 300, 1216, 886, 13, 400, 534, 264, 1168, 307, 11, 291, 600, 658, 439, 51516], "temperature": 0.0, "avg_logprob": -0.2018581292568109, "compression_ratio": 1.6379310344827587, "no_speech_prob": 0.006691013462841511}, {"id": 510, "seek": 236324, "start": 2386.2799999999997, "end": 2389.7999999999997, "text": " these agents running around all these model development companies like OpenAI, you know, like,", "tokens": [51516, 613, 12554, 2614, 926, 439, 613, 2316, 3250, 3431, 411, 7238, 48698, 11, 291, 458, 11, 411, 11, 51692], "temperature": 0.0, "avg_logprob": -0.2018581292568109, "compression_ratio": 1.6379310344827587, "no_speech_prob": 0.006691013462841511}, {"id": 511, "seek": 238980, "start": 2389.88, "end": 2394.76, "text": " well, Google actually, but you know, how many companies actually design and ship good chips,", "tokens": [50368, 731, 11, 3329, 767, 11, 457, 291, 458, 11, 577, 867, 3431, 767, 1715, 293, 5374, 665, 11583, 11, 50612], "temperature": 0.0, "avg_logprob": -0.16039616352803, "compression_ratio": 1.6215277777777777, "no_speech_prob": 0.002934211865067482}, {"id": 512, "seek": 238980, "start": 2394.76, "end": 2399.32, "text": " Google has been doing TPUs for a long time. They are performant. Total cost of ownership", "tokens": [50612, 3329, 575, 668, 884, 314, 8115, 82, 337, 257, 938, 565, 13, 814, 366, 2042, 394, 13, 23170, 2063, 295, 15279, 50840], "temperature": 0.0, "avg_logprob": -0.16039616352803, "compression_ratio": 1.6215277777777777, "no_speech_prob": 0.002934211865067482}, {"id": 513, "seek": 238980, "start": 2399.32, "end": 2404.28, "text": " looks good. Like, there's a lot of reasons to look at TPUs and Anthropic is just basically", "tokens": [50840, 1542, 665, 13, 1743, 11, 456, 311, 257, 688, 295, 4112, 281, 574, 412, 314, 8115, 82, 293, 12727, 39173, 307, 445, 1936, 51088], "temperature": 0.0, "avg_logprob": -0.16039616352803, "compression_ratio": 1.6215277777777777, "no_speech_prob": 0.002934211865067482}, {"id": 514, "seek": 238980, "start": 2404.28, "end": 2408.92, "text": " making that case at scale and allowing Google a really solid marketing win for more infrastructure", "tokens": [51088, 1455, 300, 1389, 412, 4373, 293, 8293, 3329, 257, 534, 5100, 6370, 1942, 337, 544, 6896, 51320], "temperature": 0.0, "avg_logprob": -0.16039616352803, "compression_ratio": 1.6215277777777777, "no_speech_prob": 0.002934211865067482}, {"id": 515, "seek": 238980, "start": 2408.92, "end": 2416.6800000000003, "text": " contract. Right. And in the blog post, they also do say that Amazon remains their primary cloud", "tokens": [51320, 4364, 13, 1779, 13, 400, 294, 264, 6968, 2183, 11, 436, 611, 360, 584, 300, 6795, 7023, 641, 6194, 4588, 51708], "temperature": 0.0, "avg_logprob": -0.16039616352803, "compression_ratio": 1.6215277777777777, "no_speech_prob": 0.002934211865067482}, {"id": 516, "seek": 241668, "start": 2416.68, "end": 2423.3999999999996, "text": " provider and training partner. So this is also kind of in a way similar to OpenAI where", "tokens": [50364, 12398, 293, 3097, 4975, 13, 407, 341, 307, 611, 733, 295, 294, 257, 636, 2531, 281, 7238, 48698, 689, 50700], "temperature": 0.0, "avg_logprob": -0.22685018498846826, "compression_ratio": 1.5365853658536586, "no_speech_prob": 0.007008688990026712}, {"id": 517, "seek": 241668, "start": 2423.3999999999996, "end": 2429.48, "text": " originally everybody, buddy, but even Microsoft and Thropic was buddy, buddy with Amazon now they", "tokens": [50700, 7993, 2201, 11, 10340, 11, 457, 754, 8116, 293, 334, 39173, 390, 10340, 11, 10340, 365, 6795, 586, 436, 51004], "temperature": 0.0, "avg_logprob": -0.22685018498846826, "compression_ratio": 1.5365853658536586, "no_speech_prob": 0.007008688990026712}, {"id": 518, "seek": 241668, "start": 2429.48, "end": 2436.04, "text": " need to expand out just to get access to more compute. And at the US, Amazon also has their whole", "tokens": [51004, 643, 281, 5268, 484, 445, 281, 483, 2105, 281, 544, 14722, 13, 400, 412, 264, 2546, 11, 6795, 611, 575, 641, 1379, 51332], "temperature": 0.0, "avg_logprob": -0.22685018498846826, "compression_ratio": 1.5365853658536586, "no_speech_prob": 0.007008688990026712}, {"id": 519, "seek": 241668, "start": 2436.04, "end": 2443.24, "text": " training. I'm hardware, which to my knowledge is not anywhere near where TPUs are at. So could", "tokens": [51332, 3097, 13, 286, 478, 1152, 3039, 11, 597, 281, 452, 3601, 307, 406, 4992, 2651, 689, 314, 8115, 82, 366, 412, 13, 407, 727, 51692], "temperature": 0.0, "avg_logprob": -0.22685018498846826, "compression_ratio": 1.5365853658536586, "no_speech_prob": 0.007008688990026712}, {"id": 520, "seek": 244324, "start": 2443.24, "end": 2449.8799999999997, "text": " you putting a little bit of pressure on Amazon to deliver on the hardware side as well, because", "tokens": [50364, 291, 3372, 257, 707, 857, 295, 3321, 322, 6795, 281, 4239, 322, 264, 8837, 1252, 382, 731, 11, 570, 50696], "temperature": 0.0, "avg_logprob": -0.22267673925026177, "compression_ratio": 1.5327868852459017, "no_speech_prob": 0.0068994928151369095}, {"id": 521, "seek": 244324, "start": 2449.8799999999997, "end": 2454.7599999999998, "text": " I'm sure they would be happy to give Anthropic all the computers so that they could", "tokens": [50696, 286, 478, 988, 436, 576, 312, 2055, 281, 976, 12727, 39173, 439, 264, 10807, 370, 300, 436, 727, 50940], "temperature": 0.0, "avg_logprob": -0.22267673925026177, "compression_ratio": 1.5327868852459017, "no_speech_prob": 0.0068994928151369095}, {"id": 522, "seek": 244324, "start": 2454.7599999999998, "end": 2462.3599999999997, "text": " array into cash. And now onto an OpenAI story, not new so much, but a worthwhile article to touch", "tokens": [50940, 10225, 666, 6388, 13, 400, 586, 3911, 364, 7238, 48698, 1657, 11, 406, 777, 370, 709, 11, 457, 257, 28159, 7222, 281, 2557, 51320], "temperature": 0.0, "avg_logprob": -0.22267673925026177, "compression_ratio": 1.5327868852459017, "no_speech_prob": 0.0068994928151369095}, {"id": 523, "seek": 244324, "start": 2462.3599999999997, "end": 2468.2799999999997, "text": " on. If it's just came out like a day or two ago in the New Yorker, there's a very, very detailed", "tokens": [51320, 322, 13, 759, 309, 311, 445, 1361, 484, 411, 257, 786, 420, 732, 2057, 294, 264, 1873, 3609, 260, 11, 456, 311, 257, 588, 11, 588, 9942, 51616], "temperature": 0.0, "avg_logprob": -0.22267673925026177, "compression_ratio": 1.5327868852459017, "no_speech_prob": 0.0068994928151369095}, {"id": 524, "seek": 246828, "start": 2468.28, "end": 2476.1200000000003, "text": " piece titled Sam Ottoman may control our future. Can he be trusted? And this is basically sort of a", "tokens": [50364, 2522, 19841, 4832, 33435, 815, 1969, 527, 2027, 13, 1664, 415, 312, 16034, 30, 400, 341, 307, 1936, 1333, 295, 257, 50756], "temperature": 0.0, "avg_logprob": -0.18969929218292236, "compression_ratio": 1.5026178010471205, "no_speech_prob": 0.03960077837109566}, {"id": 525, "seek": 246828, "start": 2476.1200000000003, "end": 2484.44, "text": " survey of impressions or first hand accounts of interactions with Sam Ottoman, particularly", "tokens": [50756, 8984, 295, 24245, 420, 700, 1011, 9402, 295, 13280, 365, 4832, 33435, 11, 4098, 51172], "temperature": 0.0, "avg_logprob": -0.18969929218292236, "compression_ratio": 1.5026178010471205, "no_speech_prob": 0.03960077837109566}, {"id": 526, "seek": 246828, "start": 2484.44, "end": 2492.52, "text": " focusing on the question of is he trustworthy? Does he lie all the time? Centering a lot around", "tokens": [51172, 8416, 322, 264, 1168, 295, 307, 415, 39714, 30, 4402, 415, 4544, 439, 264, 565, 30, 3408, 1794, 257, 688, 926, 51576], "temperature": 0.0, "avg_logprob": -0.18969929218292236, "compression_ratio": 1.5026178010471205, "no_speech_prob": 0.03960077837109566}, {"id": 527, "seek": 249252, "start": 2492.52, "end": 2499.56, "text": " his firing from OpenAI in late 2023. If people aren't aware of that story at the time, that was this", "tokens": [50364, 702, 16045, 490, 7238, 48698, 294, 3469, 44377, 13, 759, 561, 3212, 380, 3650, 295, 300, 1657, 412, 264, 565, 11, 300, 390, 341, 50716], "temperature": 0.0, "avg_logprob": -0.31122149417274875, "compression_ratio": 1.5450980392156863, "no_speech_prob": 0.20289146900177002}, {"id": 528, "seek": 249252, "start": 2499.56, "end": 2506.68, "text": " big, big, big drama where the opening I board fired Sam Ottoman as CEO, but he's closed like in", "tokens": [50716, 955, 11, 955, 11, 955, 9412, 689, 264, 5193, 286, 3150, 11777, 4832, 33435, 382, 9282, 11, 457, 415, 311, 5395, 411, 294, 51072], "temperature": 0.0, "avg_logprob": -0.31122149417274875, "compression_ratio": 1.5450980392156863, "no_speech_prob": 0.20289146900177002}, {"id": 529, "seek": 249252, "start": 2506.68, "end": 2512.36, "text": " this statement, they just said that he was not quote consistently candid in his communications or", "tokens": [51072, 341, 5629, 11, 436, 445, 848, 300, 415, 390, 406, 6513, 14961, 6268, 294, 702, 15163, 420, 51356], "temperature": 0.0, "avg_logprob": -0.31122149417274875, "compression_ratio": 1.5450980392156863, "no_speech_prob": 0.20289146900177002}, {"id": 530, "seek": 249252, "start": 2512.36, "end": 2518.12, "text": " something like that. And it was a very sort of mysterious thing of like, very fighting him for what", "tokens": [51356, 746, 411, 300, 13, 400, 309, 390, 257, 588, 1333, 295, 13831, 551, 295, 411, 11, 588, 5237, 796, 337, 437, 51644], "temperature": 0.0, "avg_logprob": -0.31122149417274875, "compression_ratio": 1.5450980392156863, "no_speech_prob": 0.20289146900177002}, {"id": 531, "seek": 251812, "start": 2518.52, "end": 2524.2799999999997, "text": " like not being consistently honest at the time, it was like always this political maneuvering.", "tokens": [50384, 411, 406, 885, 14961, 3245, 412, 264, 565, 11, 309, 390, 411, 1009, 341, 3905, 25976, 278, 13, 50672], "temperature": 0.0, "avg_logprob": -0.11141917217208679, "compression_ratio": 1.651063829787234, "no_speech_prob": 0.008962421678006649}, {"id": 532, "seek": 251812, "start": 2524.2799999999997, "end": 2532.44, "text": " What came out since then has painted a picture of him being a manipulative kind of business person", "tokens": [50672, 708, 1361, 484, 1670, 550, 575, 11797, 257, 3036, 295, 796, 885, 257, 9258, 22678, 733, 295, 1606, 954, 51080], "temperature": 0.0, "avg_logprob": -0.11141917217208679, "compression_ratio": 1.651063829787234, "no_speech_prob": 0.008962421678006649}, {"id": 533, "seek": 251812, "start": 2532.44, "end": 2537.72, "text": " where he says different things to different people depending on the context. He says things that", "tokens": [51080, 689, 415, 1619, 819, 721, 281, 819, 561, 5413, 322, 264, 4319, 13, 634, 1619, 721, 300, 51344], "temperature": 0.0, "avg_logprob": -0.11141917217208679, "compression_ratio": 1.651063829787234, "no_speech_prob": 0.008962421678006649}, {"id": 534, "seek": 251812, "start": 2537.72, "end": 2546.44, "text": " may not be entirely true or exaggerations. And this piece basically adds in to that picture where", "tokens": [51344, 815, 406, 312, 7696, 2074, 420, 19123, 763, 13, 400, 341, 2522, 1936, 10860, 294, 281, 300, 3036, 689, 51780], "temperature": 0.0, "avg_logprob": -0.11141917217208679, "compression_ratio": 1.651063829787234, "no_speech_prob": 0.008962421678006649}, {"id": 535, "seek": 254644, "start": 2546.44, "end": 2554.36, "text": " if you go back to his time as CEO of a startup, if you go back to him leading white combinator,", "tokens": [50364, 498, 291, 352, 646, 281, 702, 565, 382, 9282, 295, 257, 18578, 11, 498, 291, 352, 646, 281, 796, 5775, 2418, 2512, 31927, 11, 50760], "temperature": 0.0, "avg_logprob": -0.11726973807975037, "compression_ratio": 1.5502645502645502, "no_speech_prob": 0.0023567734751850367}, {"id": 536, "seek": 254644, "start": 2554.36, "end": 2560.92, "text": " if you go to recent years, there is a pattern of Sam Ottoman by many accounts of different people", "tokens": [50760, 498, 291, 352, 281, 5162, 924, 11, 456, 307, 257, 5102, 295, 4832, 33435, 538, 867, 9402, 295, 819, 561, 51088], "temperature": 0.0, "avg_logprob": -0.11726973807975037, "compression_ratio": 1.5502645502645502, "no_speech_prob": 0.0023567734751850367}, {"id": 537, "seek": 254644, "start": 2561.8, "end": 2570.92, "text": " not being honest, like just saying things that aren't true to gain advantage or to gain more power.", "tokens": [51132, 406, 885, 3245, 11, 411, 445, 1566, 721, 300, 3212, 380, 2074, 281, 6052, 5002, 420, 281, 6052, 544, 1347, 13, 51588], "temperature": 0.0, "avg_logprob": -0.11726973807975037, "compression_ratio": 1.5502645502645502, "no_speech_prob": 0.0023567734751850367}, {"id": 538, "seek": 257092, "start": 2571.4, "end": 2577.16, "text": " Another kind of part of this is questioning whether Sam Ottoman's bribe is to accumulate power", "tokens": [50388, 3996, 733, 295, 644, 295, 341, 307, 21257, 1968, 4832, 33435, 311, 33713, 650, 307, 281, 33384, 1347, 50676], "temperature": 0.0, "avg_logprob": -0.22879171932444853, "compression_ratio": 1.5819672131147542, "no_speech_prob": 0.009843138046562672}, {"id": 539, "seek": 257092, "start": 2577.16, "end": 2584.44, "text": " essentially. So very, very detailed, deeply researched piece, I would recommend reading it if you", "tokens": [50676, 4476, 13, 407, 588, 11, 588, 9942, 11, 8760, 37098, 2522, 11, 286, 576, 2748, 3760, 309, 498, 291, 51040], "temperature": 0.0, "avg_logprob": -0.22879171932444853, "compression_ratio": 1.5819672131147542, "no_speech_prob": 0.009843138046562672}, {"id": 540, "seek": 257092, "start": 2584.44, "end": 2591.0, "text": " find this interesting, not much new in terms of like actual news reporting, where some tidbits", "tokens": [51040, 915, 341, 1880, 11, 406, 709, 777, 294, 2115, 295, 411, 3539, 2583, 10031, 11, 689, 512, 9422, 34010, 51368], "temperature": 0.0, "avg_logprob": -0.22879171932444853, "compression_ratio": 1.5819672131147542, "no_speech_prob": 0.009843138046562672}, {"id": 541, "seek": 257092, "start": 2591.0, "end": 2596.92, "text": " are sort of at the picture that was already present at least for many of Sam Ottoman clearly being", "tokens": [51368, 366, 1333, 295, 412, 264, 3036, 300, 390, 1217, 1974, 412, 1935, 337, 867, 295, 4832, 33435, 4448, 885, 51664], "temperature": 0.0, "avg_logprob": -0.22879171932444853, "compression_ratio": 1.5819672131147542, "no_speech_prob": 0.009843138046562672}, {"id": 542, "seek": 259692, "start": 2597.56, "end": 2603.8, "text": " flexible with troops depending on context. Moving on, a story where OpenAI and", "tokens": [50396, 11358, 365, 11522, 5413, 322, 4319, 13, 14242, 322, 11, 257, 1657, 689, 7238, 48698, 293, 50708], "temperature": 0.0, "avg_logprob": -0.31252665519714357, "compression_ratio": 1.5128205128205128, "no_speech_prob": 0.020586906000971794}, {"id": 543, "seek": 259692, "start": 2603.8, "end": 2611.88, "text": " Fropik are working together and Google, they're uniting to combat model copying in China. So they're", "tokens": [50708, 479, 1513, 1035, 366, 1364, 1214, 293, 3329, 11, 436, 434, 517, 1748, 281, 8361, 2316, 27976, 294, 3533, 13, 407, 436, 434, 51112], "temperature": 0.0, "avg_logprob": -0.31252665519714357, "compression_ratio": 1.5128205128205128, "no_speech_prob": 0.020586906000971794}, {"id": 544, "seek": 259692, "start": 2612.52, "end": 2618.44, "text": " apparently working together to fight against this adversarial distillation. They have", "tokens": [51144, 7970, 1364, 1214, 281, 2092, 1970, 341, 17641, 44745, 42923, 399, 13, 814, 362, 51440], "temperature": 0.0, "avg_logprob": -0.31252665519714357, "compression_ratio": 1.5128205128205128, "no_speech_prob": 0.020586906000971794}, {"id": 545, "seek": 259692, "start": 2618.44, "end": 2625.88, "text": " frontier model orm and industry nonprofits that both three companies co-founded in 2023.", "tokens": [51440, 35853, 2316, 420, 76, 293, 3518, 42851, 300, 1293, 1045, 3431, 598, 12, 49547, 294, 44377, 13, 51812], "temperature": 0.0, "avg_logprob": -0.31252665519714357, "compression_ratio": 1.5128205128205128, "no_speech_prob": 0.020586906000971794}, {"id": 546, "seek": 262588, "start": 2626.52, "end": 2634.2000000000003, "text": " And they essentially are seemingly going to share intelligence and coordinate to somehow avoid", "tokens": [50396, 400, 436, 4476, 366, 18709, 516, 281, 2073, 7599, 293, 15670, 281, 6063, 5042, 50780], "temperature": 0.0, "avg_logprob": -0.14274446463879245, "compression_ratio": 1.5387755102040817, "no_speech_prob": 0.006369606591761112}, {"id": 547, "seek": 262588, "start": 2634.2000000000003, "end": 2639.88, "text": " this happening we saw in Fropik announcing what seemed to be pretty large scale. You could", "tokens": [50780, 341, 2737, 321, 1866, 294, 479, 1513, 1035, 28706, 437, 6576, 281, 312, 1238, 2416, 4373, 13, 509, 727, 51064], "temperature": 0.0, "avg_logprob": -0.14274446463879245, "compression_ratio": 1.5387755102040817, "no_speech_prob": 0.006369606591761112}, {"id": 548, "seek": 262588, "start": 2639.88, "end": 2646.12, "text": " characterize them as attacks attempts to distill models by extracting outputs. You know, if it doesn't", "tokens": [51064, 38463, 552, 382, 8122, 15257, 281, 42923, 5245, 538, 49844, 23930, 13, 509, 458, 11, 498, 309, 1177, 380, 51376], "temperature": 0.0, "avg_logprob": -0.14274446463879245, "compression_ratio": 1.5387755102040817, "no_speech_prob": 0.006369606591761112}, {"id": 549, "seek": 262588, "start": 2646.12, "end": 2652.92, "text": " fall in line with their terms of use. So an interesting development here of the US-based", "tokens": [51376, 2100, 294, 1622, 365, 641, 2115, 295, 764, 13, 407, 364, 1880, 3250, 510, 295, 264, 2546, 12, 6032, 51716], "temperature": 0.0, "avg_logprob": -0.14274446463879245, "compression_ratio": 1.5387755102040817, "no_speech_prob": 0.006369606591761112}, {"id": 550, "seek": 265292, "start": 2652.92, "end": 2658.36, "text": " companies coordinating on this particular problem. Yeah, the whole idea here is basically just", "tokens": [50364, 3431, 37824, 322, 341, 1729, 1154, 13, 865, 11, 264, 1379, 1558, 510, 307, 1936, 445, 50636], "temperature": 0.0, "avg_logprob": -0.14620071544981839, "compression_ratio": 1.6109215017064846, "no_speech_prob": 0.010161945596337318}, {"id": 551, "seek": 265292, "start": 2659.08, "end": 2663.4, "text": " flagging, you know, when one company detects some kind of attack pattern, they flag it for the", "tokens": [50672, 7166, 3249, 11, 291, 458, 11, 562, 472, 2237, 5531, 82, 512, 733, 295, 2690, 5102, 11, 436, 7166, 309, 337, 264, 50888], "temperature": 0.0, "avg_logprob": -0.14620071544981839, "compression_ratio": 1.6109215017064846, "no_speech_prob": 0.010161945596337318}, {"id": 552, "seek": 265292, "start": 2663.4, "end": 2668.12, "text": " others, right? So nice and simple, very concrete. And well, I mean, it's concrete because the", "tokens": [50888, 2357, 11, 558, 30, 407, 1481, 293, 2199, 11, 588, 9859, 13, 400, 731, 11, 286, 914, 11, 309, 311, 9859, 570, 264, 51124], "temperature": 0.0, "avg_logprob": -0.14620071544981839, "compression_ratio": 1.6109215017064846, "no_speech_prob": 0.010161945596337318}, {"id": 553, "seek": 265292, "start": 2668.12, "end": 2673.32, "text": " incentives are so so aligned here. It's worth noting that the FNF, the frontier model forum,", "tokens": [51124, 23374, 366, 370, 370, 17962, 510, 13, 467, 311, 3163, 26801, 300, 264, 479, 45, 37, 11, 264, 35853, 2316, 17542, 11, 51384], "temperature": 0.0, "avg_logprob": -0.14620071544981839, "compression_ratio": 1.6109215017064846, "no_speech_prob": 0.010161945596337318}, {"id": 554, "seek": 265292, "start": 2673.96, "end": 2681.2400000000002, "text": " kind of had been quite a toothless coordinating body. And at least for the safety function that", "tokens": [51416, 733, 295, 632, 668, 1596, 257, 11680, 1832, 37824, 1772, 13, 400, 412, 1935, 337, 264, 4514, 2445, 300, 51780], "temperature": 0.0, "avg_logprob": -0.14620071544981839, "compression_ratio": 1.6109215017064846, "no_speech_prob": 0.010161945596337318}, {"id": 555, "seek": 268124, "start": 2681.24, "end": 2685.56, "text": " so many people were excited about. But at least on this one, it seems like it's actually going", "tokens": [50364, 370, 867, 561, 645, 2919, 466, 13, 583, 412, 1935, 322, 341, 472, 11, 309, 2544, 411, 309, 311, 767, 516, 50580], "temperature": 0.0, "avg_logprob": -0.1677430163147629, "compression_ratio": 1.5317460317460319, "no_speech_prob": 0.0018285824917256832}, {"id": 556, "seek": 268124, "start": 2685.56, "end": 2691.4799999999996, "text": " places and doing things. So that's kind of an interesting update. Next on to chips. Chinese", "tokens": [50580, 3190, 293, 884, 721, 13, 407, 300, 311, 733, 295, 364, 1880, 5623, 13, 3087, 322, 281, 11583, 13, 4649, 50876], "temperature": 0.0, "avg_logprob": -0.1677430163147629, "compression_ratio": 1.5317460317460319, "no_speech_prob": 0.0018285824917256832}, {"id": 557, "seek": 268124, "start": 2691.4799999999996, "end": 2699.16, "text": " chipmakers claim nearly half of local market as Nvidia's lead shrinks. So the numbers here are", "tokens": [50876, 11409, 76, 19552, 3932, 6217, 1922, 295, 2654, 2142, 382, 46284, 311, 1477, 9884, 16431, 13, 407, 264, 3547, 510, 366, 51260], "temperature": 0.0, "avg_logprob": -0.1677430163147629, "compression_ratio": 1.5317460317460319, "no_speech_prob": 0.0018285824917256832}, {"id": 558, "seek": 268124, "start": 2699.16, "end": 2707.4799999999996, "text": " that Chinese GPU and AI chip makers captured nearly 41% of China's AI accelerator server market in 2025.", "tokens": [51260, 300, 4649, 18407, 293, 7318, 11409, 19323, 11828, 6217, 18173, 4, 295, 3533, 311, 7318, 39889, 7154, 2142, 294, 39209, 13, 51676], "temperature": 0.0, "avg_logprob": -0.1677430163147629, "compression_ratio": 1.5317460317460319, "no_speech_prob": 0.0018285824917256832}, {"id": 559, "seek": 270748, "start": 2707.56, "end": 2716.28, "text": " According to an IDC report reviewed by Reuters here, this is as Chinese companies have continued", "tokens": [50368, 7328, 281, 364, 7348, 34, 2275, 18429, 538, 1300, 48396, 510, 11, 341, 307, 382, 4649, 3431, 362, 7014, 50804], "temperature": 0.0, "avg_logprob": -0.13697313458732005, "compression_ratio": 1.4827586206896552, "no_speech_prob": 0.014839128591120243}, {"id": 560, "seek": 270748, "start": 2716.28, "end": 2723.08, "text": " to try to purchase Nvidia chips despite expert controls and kind of inconsistent policy on this", "tokens": [50804, 281, 853, 281, 8110, 46284, 11583, 7228, 5844, 9003, 293, 733, 295, 36891, 3897, 322, 341, 51144], "temperature": 0.0, "avg_logprob": -0.13697313458732005, "compression_ratio": 1.4827586206896552, "no_speech_prob": 0.014839128591120243}, {"id": 561, "seek": 270748, "start": 2723.08, "end": 2730.76, "text": " front. And Huawei, of course, is leading a pack with about half of all the Chinese vendors being", "tokens": [51144, 1868, 13, 400, 28542, 11, 295, 1164, 11, 307, 5775, 257, 2844, 365, 466, 1922, 295, 439, 264, 4649, 22056, 885, 51528], "temperature": 0.0, "avg_logprob": -0.13697313458732005, "compression_ratio": 1.4827586206896552, "no_speech_prob": 0.014839128591120243}, {"id": 562, "seek": 270748, "start": 2730.76, "end": 2737.08, "text": " shipped. AMD holding just 4% of a market, apparently, which I found interesting. But I'm sure you", "tokens": [51528, 25312, 13, 34808, 5061, 445, 1017, 4, 295, 257, 2142, 11, 7970, 11, 597, 286, 1352, 1880, 13, 583, 286, 478, 988, 291, 51844], "temperature": 0.0, "avg_logprob": -0.13697313458732005, "compression_ratio": 1.4827586206896552, "no_speech_prob": 0.014839128591120243}, {"id": 563, "seek": 273708, "start": 2737.08, "end": 2741.96, "text": " can say more on this journey. Yeah, I mean, well, so first of all, I think there's a risk that", "tokens": [50364, 393, 584, 544, 322, 341, 4671, 13, 865, 11, 286, 914, 11, 731, 11, 370, 700, 295, 439, 11, 286, 519, 456, 311, 257, 3148, 300, 50608], "temperature": 0.0, "avg_logprob": -0.08541365920520219, "compression_ratio": 1.6526315789473685, "no_speech_prob": 0.001080924877896905}, {"id": 564, "seek": 273708, "start": 2742.52, "end": 2747.64, "text": " this gets taken to be yet another one of those arguments for why it was bad to have export", "tokens": [50636, 341, 2170, 2726, 281, 312, 1939, 1071, 472, 295, 729, 12869, 337, 983, 309, 390, 1578, 281, 362, 10725, 50892], "temperature": 0.0, "avg_logprob": -0.08541365920520219, "compression_ratio": 1.6526315789473685, "no_speech_prob": 0.001080924877896905}, {"id": 565, "seek": 273708, "start": 2747.64, "end": 2753.0, "text": " controls. Obviously, this was always going to be the result of export controls, right? You tell", "tokens": [50892, 9003, 13, 7580, 11, 341, 390, 1009, 516, 281, 312, 264, 1874, 295, 10725, 9003, 11, 558, 30, 509, 980, 51160], "temperature": 0.0, "avg_logprob": -0.08541365920520219, "compression_ratio": 1.6526315789473685, "no_speech_prob": 0.001080924877896905}, {"id": 566, "seek": 273708, "start": 2753.0, "end": 2757.64, "text": " Nvidia they can't sell GPUs, the Chinese market, or at least that they can't sell their top line", "tokens": [51160, 46284, 436, 393, 380, 3607, 18407, 82, 11, 264, 4649, 2142, 11, 420, 412, 1935, 300, 436, 393, 380, 3607, 641, 1192, 1622, 51392], "temperature": 0.0, "avg_logprob": -0.08541365920520219, "compression_ratio": 1.6526315789473685, "no_speech_prob": 0.001080924877896905}, {"id": 567, "seek": 273708, "start": 2757.64, "end": 2762.6, "text": " GPUs. Eventually, whatever the bar is that you set for how good those GPUs have to be before", "tokens": [51392, 18407, 82, 13, 17586, 11, 2035, 264, 2159, 307, 300, 291, 992, 337, 577, 665, 729, 18407, 82, 362, 281, 312, 949, 51640], "temperature": 0.0, "avg_logprob": -0.08541365920520219, "compression_ratio": 1.6526315789473685, "no_speech_prob": 0.001080924877896905}, {"id": 568, "seek": 276260, "start": 2762.6, "end": 2767.3199999999997, "text": " they can be shipped, Huawei is going to slowly and then eventually incrementally exceed it, right?", "tokens": [50364, 436, 393, 312, 25312, 11, 28542, 307, 516, 281, 5692, 293, 550, 4728, 26200, 379, 14048, 309, 11, 558, 30, 50600], "temperature": 0.0, "avg_logprob": -0.1555152136432238, "compression_ratio": 1.599337748344371, "no_speech_prob": 0.00512502109631896}, {"id": 569, "seek": 276260, "start": 2767.3199999999997, "end": 2773.08, "text": " So we were always going to get here. There's also this issue just of capacity. So Huawei has SMIC,", "tokens": [50600, 407, 321, 645, 1009, 516, 281, 483, 510, 13, 821, 311, 611, 341, 2734, 445, 295, 6042, 13, 407, 28542, 575, 13115, 2532, 11, 50888], "temperature": 0.0, "avg_logprob": -0.1555152136432238, "compression_ratio": 1.599337748344371, "no_speech_prob": 0.00512502109631896}, {"id": 570, "seek": 276260, "start": 2773.08, "end": 2778.2799999999997, "text": " which is China's version of TSMC basically the chip that is native to China that's helping them", "tokens": [50888, 597, 307, 3533, 311, 3037, 295, 314, 26693, 34, 1936, 264, 11409, 300, 307, 8470, 281, 3533, 300, 311, 4315, 552, 51148], "temperature": 0.0, "avg_logprob": -0.1555152136432238, "compression_ratio": 1.599337748344371, "no_speech_prob": 0.00512502109631896}, {"id": 571, "seek": 276260, "start": 2778.2799999999997, "end": 2782.12, "text": " pump out these chips. The yields are kind of shit, but Huawei's really good at chip design kind", "tokens": [51148, 5889, 484, 613, 11583, 13, 440, 32168, 366, 733, 295, 4611, 11, 457, 28542, 311, 534, 665, 412, 11409, 1715, 733, 51340], "temperature": 0.0, "avg_logprob": -0.1555152136432238, "compression_ratio": 1.599337748344371, "no_speech_prob": 0.00512502109631896}, {"id": 572, "seek": 276260, "start": 2782.12, "end": 2786.44, "text": " of makes up for it somewhat. And that's why you're seeing them hinge away. Now Nvidia has 55%", "tokens": [51340, 295, 1669, 493, 337, 309, 8344, 13, 400, 300, 311, 983, 291, 434, 2577, 552, 28822, 1314, 13, 823, 46284, 575, 12330, 4, 51556], "temperature": 0.0, "avg_logprob": -0.1555152136432238, "compression_ratio": 1.599337748344371, "no_speech_prob": 0.00512502109631896}, {"id": 573, "seek": 278644, "start": 2786.44, "end": 2792.2000000000003, "text": " market share now, but it's been, you know, that their market lead here has been whittled down to", "tokens": [50364, 2142, 2073, 586, 11, 457, 309, 311, 668, 11, 291, 458, 11, 300, 641, 2142, 1477, 510, 575, 668, 315, 593, 1493, 760, 281, 50652], "temperature": 0.0, "avg_logprob": -0.14233226463442944, "compression_ratio": 1.6166666666666667, "no_speech_prob": 0.01714351214468479}, {"id": 574, "seek": 278644, "start": 2792.2000000000003, "end": 2796.76, "text": " basically nearly half when they once were extremely dominant. Huawei is the runner up, right? So", "tokens": [50652, 1936, 6217, 1922, 562, 436, 1564, 645, 4664, 15657, 13, 28542, 307, 264, 24376, 493, 11, 558, 30, 407, 50880], "temperature": 0.0, "avg_logprob": -0.14233226463442944, "compression_ratio": 1.6166666666666667, "no_speech_prob": 0.01714351214468479}, {"id": 575, "seek": 278644, "start": 2796.76, "end": 2801.8, "text": " no surprise there. The current situation in China, there's a whole bunch of like just for China", "tokens": [50880, 572, 6365, 456, 13, 440, 2190, 2590, 294, 3533, 11, 456, 311, 257, 1379, 3840, 295, 411, 445, 337, 3533, 51132], "temperature": 0.0, "avg_logprob": -0.14233226463442944, "compression_ratio": 1.6166666666666667, "no_speech_prob": 0.01714351214468479}, {"id": 576, "seek": 278644, "start": 2801.8, "end": 2807.4, "text": " chips that had been launched, you know, the H20, the H800. More recently, Nvidia actually will be", "tokens": [51132, 11583, 300, 632, 668, 8730, 11, 291, 458, 11, 264, 389, 2009, 11, 264, 389, 14423, 13, 5048, 3938, 11, 46284, 767, 486, 312, 51412], "temperature": 0.0, "avg_logprob": -0.14233226463442944, "compression_ratio": 1.6166666666666667, "no_speech_prob": 0.01714351214468479}, {"id": 577, "seek": 278644, "start": 2807.4, "end": 2812.6, "text": " putting out a new one called the B30. So this is actually the black well, the black well made for", "tokens": [51412, 3372, 484, 257, 777, 472, 1219, 264, 363, 3446, 13, 407, 341, 307, 767, 264, 2211, 731, 11, 264, 2211, 731, 1027, 337, 51672], "temperature": 0.0, "avg_logprob": -0.14233226463442944, "compression_ratio": 1.6166666666666667, "no_speech_prob": 0.01714351214468479}, {"id": 578, "seek": 281260, "start": 2812.6, "end": 2819.0, "text": " China chip. But of course, the H200 now, the kind of not quite top line, but pretty damn good", "tokens": [50364, 3533, 11409, 13, 583, 295, 1164, 11, 264, 389, 7629, 586, 11, 264, 733, 295, 406, 1596, 1192, 1622, 11, 457, 1238, 8151, 665, 50684], "temperature": 0.0, "avg_logprob": -0.1502040227254232, "compression_ratio": 1.6379310344827587, "no_speech_prob": 0.0021816566586494446}, {"id": 579, "seek": 281260, "start": 2819.0, "end": 2823.88, "text": " chip that once was export control is now free to flow to China. So there's a, you know, some", "tokens": [50684, 11409, 300, 1564, 390, 10725, 1969, 307, 586, 1737, 281, 3095, 281, 3533, 13, 407, 456, 311, 257, 11, 291, 458, 11, 512, 50928], "temperature": 0.0, "avg_logprob": -0.1502040227254232, "compression_ratio": 1.6379310344827587, "no_speech_prob": 0.0021816566586494446}, {"id": 580, "seek": 281260, "start": 2823.88, "end": 2828.6, "text": " more significant room for Nvidia to grow there, especially given that that's going to be competing", "tokens": [50928, 544, 4776, 1808, 337, 46284, 281, 1852, 456, 11, 2318, 2212, 300, 300, 311, 516, 281, 312, 15439, 51164], "temperature": 0.0, "avg_logprob": -0.1502040227254232, "compression_ratio": 1.6379310344827587, "no_speech_prob": 0.0021816566586494446}, {"id": 581, "seek": 281260, "start": 2828.6, "end": 2834.7599999999998, "text": " with a less on paper capable chip, which is the Ascend 910C. So you think about, you know, the", "tokens": [51164, 365, 257, 1570, 322, 3035, 8189, 11409, 11, 597, 307, 264, 1018, 21153, 1722, 3279, 34, 13, 407, 291, 519, 466, 11, 291, 458, 11, 264, 51472], "temperature": 0.0, "avg_logprob": -0.1502040227254232, "compression_ratio": 1.6379310344827587, "no_speech_prob": 0.0021816566586494446}, {"id": 582, "seek": 281260, "start": 2834.7599999999998, "end": 2840.6, "text": " battle in China right now, it's largely between the Nvidia H200 and the B30 that's going to be", "tokens": [51472, 4635, 294, 3533, 558, 586, 11, 309, 311, 11611, 1296, 264, 46284, 389, 7629, 293, 264, 363, 3446, 300, 311, 516, 281, 312, 51764], "temperature": 0.0, "avg_logprob": -0.1502040227254232, "compression_ratio": 1.6379310344827587, "no_speech_prob": 0.0021816566586494446}, {"id": 583, "seek": 284060, "start": 2840.6, "end": 2846.2799999999997, "text": " coming out soon. And then the Ascend 910C or current Huawei flagship at 10 910C, by the way,", "tokens": [50364, 1348, 484, 2321, 13, 400, 550, 264, 1018, 21153, 1722, 3279, 34, 420, 2190, 28542, 30400, 412, 1266, 1722, 3279, 34, 11, 538, 264, 636, 11, 50648], "temperature": 0.0, "avg_logprob": -0.17960886316975272, "compression_ratio": 1.6, "no_speech_prob": 0.0009386790916323662}, {"id": 584, "seek": 284060, "start": 2846.2799999999997, "end": 2853.24, "text": " is stuck on the SMIC 7 nanometer process, whereas the H200 is looking at like more like a,", "tokens": [50648, 307, 5541, 322, 264, 13115, 2532, 1614, 14067, 13606, 1399, 11, 9735, 264, 389, 7629, 307, 1237, 412, 411, 544, 411, 257, 11, 50996], "temperature": 0.0, "avg_logprob": -0.17960886316975272, "compression_ratio": 1.6, "no_speech_prob": 0.0009386790916323662}, {"id": 585, "seek": 284060, "start": 2853.24, "end": 2857.4, "text": " I guess a five or four nanometer process. It's a more advanced node that comes out from TSMC. So", "tokens": [50996, 286, 2041, 257, 1732, 420, 1451, 14067, 13606, 1399, 13, 467, 311, 257, 544, 7339, 9984, 300, 1487, 484, 490, 314, 26693, 34, 13, 407, 51204], "temperature": 0.0, "avg_logprob": -0.17960886316975272, "compression_ratio": 1.6, "no_speech_prob": 0.0009386790916323662}, {"id": 586, "seek": 284060, "start": 2857.4, "end": 2862.12, "text": " we're already seeing the actual chip fab stealing kind of really have an effect here.", "tokens": [51204, 321, 434, 1217, 2577, 264, 3539, 11409, 5355, 19757, 733, 295, 534, 362, 364, 1802, 510, 13, 51440], "temperature": 0.0, "avg_logprob": -0.17960886316975272, "compression_ratio": 1.6, "no_speech_prob": 0.0009386790916323662}, {"id": 587, "seek": 284060, "start": 2862.12, "end": 2867.16, "text": " They're all kinds of interesting comparisons that you can make, you know, 910C versus H20. That's", "tokens": [51440, 814, 434, 439, 3685, 295, 1880, 33157, 300, 291, 393, 652, 11, 291, 458, 11, 1722, 3279, 34, 5717, 389, 2009, 13, 663, 311, 51692], "temperature": 0.0, "avg_logprob": -0.17960886316975272, "compression_ratio": 1.6, "no_speech_prob": 0.0009386790916323662}, {"id": 588, "seek": 286716, "start": 2867.3199999999997, "end": 2873.24, "text": " actually quite relevant as well. It's not terribly surprising. I mean, you just have this, this issue", "tokens": [50372, 767, 1596, 7340, 382, 731, 13, 467, 311, 406, 22903, 8830, 13, 286, 914, 11, 291, 445, 362, 341, 11, 341, 2734, 50668], "temperature": 0.0, "avg_logprob": -0.1158820330086401, "compression_ratio": 1.6724738675958188, "no_speech_prob": 0.00096962257521227}, {"id": 589, "seek": 286716, "start": 2873.24, "end": 2879.3199999999997, "text": " with like capacity and the ability to compete in a market where you're being blocked from,", "tokens": [50668, 365, 411, 6042, 293, 264, 3485, 281, 11831, 294, 257, 2142, 689, 291, 434, 885, 15470, 490, 11, 50972], "temperature": 0.0, "avg_logprob": -0.1158820330086401, "compression_ratio": 1.6724738675958188, "no_speech_prob": 0.00096962257521227}, {"id": 590, "seek": 286716, "start": 2879.3199999999997, "end": 2883.7999999999997, "text": " from actually doing this. So yeah, expect more of this, expect Nvidia's market share to a road.", "tokens": [50972, 490, 767, 884, 341, 13, 407, 1338, 11, 2066, 544, 295, 341, 11, 2066, 46284, 311, 2142, 2073, 281, 257, 3060, 13, 51196], "temperature": 0.0, "avg_logprob": -0.1158820330086401, "compression_ratio": 1.6724738675958188, "no_speech_prob": 0.00096962257521227}, {"id": 591, "seek": 286716, "start": 2884.52, "end": 2888.52, "text": " That's not a bad thing in and of itself. The question is, what's your goal? Is your goal for", "tokens": [51232, 663, 311, 406, 257, 1578, 551, 294, 293, 295, 2564, 13, 440, 1168, 307, 11, 437, 311, 428, 3387, 30, 1119, 428, 3387, 337, 51432], "temperature": 0.0, "avg_logprob": -0.1158820330086401, "compression_ratio": 1.6724738675958188, "no_speech_prob": 0.00096962257521227}, {"id": 592, "seek": 286716, "start": 2888.52, "end": 2893.8799999999997, "text": " Nvidia to maximize its market cap? Where is your goal for America to retain an AI advantage? Those", "tokens": [51432, 46284, 281, 19874, 1080, 2142, 1410, 30, 2305, 307, 428, 3387, 337, 3374, 281, 18340, 364, 7318, 5002, 30, 3950, 51700], "temperature": 0.0, "avg_logprob": -0.1158820330086401, "compression_ratio": 1.6724738675958188, "no_speech_prob": 0.00096962257521227}, {"id": 593, "seek": 289388, "start": 2893.88, "end": 2899.6400000000003, "text": " two things cannot co-exist in the same universe. So you got to pick one and, you know, we'll see", "tokens": [50364, 732, 721, 2644, 598, 12, 18217, 294, 264, 912, 6445, 13, 407, 291, 658, 281, 1888, 472, 293, 11, 291, 458, 11, 321, 603, 536, 50652], "temperature": 0.0, "avg_logprob": -0.21748867440730968, "compression_ratio": 1.5296610169491525, "no_speech_prob": 0.003823481732979417}, {"id": 594, "seek": 289388, "start": 2899.6400000000003, "end": 2904.76, "text": " which one the Trump administration is picking one one. Next story on OpenAI,", "tokens": [50652, 597, 472, 264, 3899, 7236, 307, 8867, 472, 472, 13, 3087, 1657, 322, 7238, 48698, 11, 50908], "temperature": 0.0, "avg_logprob": -0.21748867440730968, "compression_ratio": 1.5296610169491525, "no_speech_prob": 0.003823481732979417}, {"id": 595, "seek": 289388, "start": 2904.76, "end": 2914.04, "text": " Southbank has secured a $40 billion loan to boost OpenAI investments. So this is a 12 month", "tokens": [50908, 4242, 25423, 575, 22905, 257, 1848, 5254, 5218, 10529, 281, 9194, 7238, 48698, 13784, 13, 407, 341, 307, 257, 2272, 1618, 51372], "temperature": 0.0, "avg_logprob": -0.21748867440730968, "compression_ratio": 1.5296610169491525, "no_speech_prob": 0.003823481732979417}, {"id": 596, "seek": 289388, "start": 2914.04, "end": 2920.84, "text": " term that is going to help cover Southbank's $30 billion commitment to OpenAI, which is part of", "tokens": [51372, 1433, 300, 307, 516, 281, 854, 2060, 4242, 25423, 311, 1848, 3446, 5218, 8371, 281, 7238, 48698, 11, 597, 307, 644, 295, 51712], "temperature": 0.0, "avg_logprob": -0.21748867440730968, "compression_ratio": 1.5296610169491525, "no_speech_prob": 0.003823481732979417}, {"id": 597, "seek": 292084, "start": 2920.92, "end": 2932.04, "text": " recently closed 110, 120 billion of last track around for OpenAI. It could be an indication of OpenAI", "tokens": [50368, 3938, 5395, 20154, 11, 10411, 5218, 295, 1036, 2837, 926, 337, 7238, 48698, 13, 467, 727, 312, 364, 18877, 295, 7238, 48698, 50924], "temperature": 0.0, "avg_logprob": -0.22050162156422934, "compression_ratio": 1.5185185185185186, "no_speech_prob": 0.003118807915598154}, {"id": 598, "seek": 292084, "start": 2932.04, "end": 2938.04, "text": " really aggressively striving to IPO so that reinvestment for Southbank pays off.", "tokens": [50924, 534, 32024, 36582, 281, 50220, 370, 300, 6561, 5571, 518, 337, 4242, 25423, 10604, 766, 13, 51224], "temperature": 0.0, "avg_logprob": -0.22050162156422934, "compression_ratio": 1.5185185185185186, "no_speech_prob": 0.003118807915598154}, {"id": 599, "seek": 292084, "start": 2939.08, "end": 2944.36, "text": " Yeah, so this is being lent to Southbank by a whole bunch of banks, you know, Goldman Sachs,", "tokens": [51276, 865, 11, 370, 341, 307, 885, 23556, 281, 4242, 25423, 538, 257, 1379, 3840, 295, 10237, 11, 291, 458, 11, 45378, 25626, 82, 11, 51540], "temperature": 0.0, "avg_logprob": -0.22050162156422934, "compression_ratio": 1.5185185185185186, "no_speech_prob": 0.003118807915598154}, {"id": 600, "seek": 292084, "start": 2944.36, "end": 2950.04, "text": " JP Morgan, a whole bunch of Japanese banks. I didn't know about Mitsuho Bank. Anyway, a whole", "tokens": [51540, 34336, 16724, 11, 257, 1379, 3840, 295, 5433, 10237, 13, 286, 994, 380, 458, 466, 376, 35711, 1289, 8915, 13, 5684, 11, 257, 1379, 51824], "temperature": 0.0, "avg_logprob": -0.22050162156422934, "compression_ratio": 1.5185185185185186, "no_speech_prob": 0.003118807915598154}, {"id": 601, "seek": 295004, "start": 2950.04, "end": 2954.6, "text": " bunch of others. So first of all, this is the largest loan that Southbank has ever borrowed", "tokens": [50364, 3840, 295, 2357, 13, 407, 700, 295, 439, 11, 341, 307, 264, 6443, 10529, 300, 4242, 25423, 575, 1562, 26805, 50592], "temperature": 0.0, "avg_logprob": -0.10935397480809411, "compression_ratio": 1.6224489795918366, "no_speech_prob": 0.002082540886476636}, {"id": 602, "seek": 295004, "start": 2954.6, "end": 2961.24, "text": " that's denominated entirely in dollars. The loan itself is unsecured. It has a 12 month term,", "tokens": [50592, 300, 311, 1441, 298, 5410, 7696, 294, 3808, 13, 440, 10529, 2564, 307, 517, 8159, 3831, 13, 467, 575, 257, 2272, 1618, 1433, 11, 50924], "temperature": 0.0, "avg_logprob": -0.10935397480809411, "compression_ratio": 1.6224489795918366, "no_speech_prob": 0.002082540886476636}, {"id": 603, "seek": 295004, "start": 2961.24, "end": 2967.8, "text": " and that means it has to be repaid or refinanced within a year. And that's weird for such a big", "tokens": [50924, 293, 300, 1355, 309, 575, 281, 312, 1085, 17810, 420, 44395, 4864, 1951, 257, 1064, 13, 400, 300, 311, 3657, 337, 1270, 257, 955, 51252], "temperature": 0.0, "avg_logprob": -0.10935397480809411, "compression_ratio": 1.6224489795918366, "no_speech_prob": 0.002082540886476636}, {"id": 604, "seek": 295004, "start": 2967.8, "end": 2972.68, "text": " amount of money, right? Normally you'd expect a kind of long-term loan for long-term investment.", "tokens": [51252, 2372, 295, 1460, 11, 558, 30, 17424, 291, 1116, 2066, 257, 733, 295, 938, 12, 7039, 10529, 337, 938, 12, 7039, 6078, 13, 51496], "temperature": 0.0, "avg_logprob": -0.10935397480809411, "compression_ratio": 1.6224489795918366, "no_speech_prob": 0.002082540886476636}, {"id": 605, "seek": 295004, "start": 2972.68, "end": 2978.2, "text": " And so the question is, why is it so short-term? Basically, as you said, this is a big signal that", "tokens": [51496, 400, 370, 264, 1168, 307, 11, 983, 307, 309, 370, 2099, 12, 7039, 30, 8537, 11, 382, 291, 848, 11, 341, 307, 257, 955, 6358, 300, 51772], "temperature": 0.0, "avg_logprob": -0.10935397480809411, "compression_ratio": 1.6224489795918366, "no_speech_prob": 0.002082540886476636}, {"id": 606, "seek": 297820, "start": 2978.2, "end": 2983.0, "text": " this is about an OpenAI IPO, right? They expect in the next 12 months, at least as a telegraphing", "tokens": [50364, 341, 307, 466, 364, 7238, 48698, 50220, 11, 558, 30, 814, 2066, 294, 264, 958, 2272, 2493, 11, 412, 1935, 382, 257, 4304, 34091, 278, 50604], "temperature": 0.0, "avg_logprob": -0.19232437427227314, "compression_ratio": 1.6220735785953178, "no_speech_prob": 0.0009691856103017926}, {"id": 607, "seek": 297820, "start": 2983.0, "end": 2988.6, "text": " that they expect that they're going to have liquidity come in through an IPO that's going to allow", "tokens": [50604, 300, 436, 2066, 300, 436, 434, 516, 281, 362, 33131, 808, 294, 807, 364, 50220, 300, 311, 516, 281, 2089, 50884], "temperature": 0.0, "avg_logprob": -0.19232437427227314, "compression_ratio": 1.6220735785953178, "no_speech_prob": 0.0009691856103017926}, {"id": 608, "seek": 297820, "start": 2988.6, "end": 2994.6, "text": " then Southbank to pay back on those loans. And so that's maybe not surprising. And obviously,", "tokens": [50884, 550, 4242, 25423, 281, 1689, 646, 322, 729, 15443, 13, 400, 370, 300, 311, 1310, 406, 8830, 13, 400, 2745, 11, 51184], "temperature": 0.0, "avg_logprob": -0.19232437427227314, "compression_ratio": 1.6220735785953178, "no_speech_prob": 0.0009691856103017926}, {"id": 609, "seek": 297820, "start": 2994.6, "end": 2999.16, "text": " there's $20 billion and you'll run rate right now that OpenAI has that's right on track. They've", "tokens": [51184, 456, 311, 1848, 2009, 5218, 293, 291, 603, 1190, 3314, 558, 586, 300, 7238, 48698, 575, 300, 311, 558, 322, 2837, 13, 814, 600, 51412], "temperature": 0.0, "avg_logprob": -0.19232437427227314, "compression_ratio": 1.6220735785953178, "no_speech_prob": 0.0009691856103017926}, {"id": 610, "seek": 297820, "start": 2999.16, "end": 3006.3599999999997, "text": " message 2027 or late 2026 as the IPO time forizons. So, you know, not a huge shock in that sense.", "tokens": [51412, 3636, 945, 10076, 420, 3469, 945, 10880, 382, 264, 50220, 565, 337, 590, 892, 13, 407, 11, 291, 458, 11, 406, 257, 2603, 5588, 294, 300, 2020, 13, 51772], "temperature": 0.0, "avg_logprob": -0.19232437427227314, "compression_ratio": 1.6220735785953178, "no_speech_prob": 0.0009691856103017926}, {"id": 611, "seek": 300636, "start": 3006.36, "end": 3011.4, "text": " But it is a big bet. It's yet another big bet by Softbank on OpenAI. I'm sure,", "tokens": [50364, 583, 309, 307, 257, 955, 778, 13, 467, 311, 1939, 1071, 955, 778, 538, 16985, 25423, 322, 7238, 48698, 13, 286, 478, 988, 11, 50616], "temperature": 0.0, "avg_logprob": -0.14802916687313872, "compression_ratio": 1.5122950819672132, "no_speech_prob": 0.003536602482199669}, {"id": 612, "seek": 300636, "start": 3011.4, "end": 3014.92, "text": " remember if it was this article or somewhere else that I read, I think Softbank has something", "tokens": [50616, 1604, 498, 309, 390, 341, 7222, 420, 4079, 1646, 300, 286, 1401, 11, 286, 519, 16985, 25423, 575, 746, 50792], "temperature": 0.0, "avg_logprob": -0.14802916687313872, "compression_ratio": 1.5122950819672132, "no_speech_prob": 0.003536602482199669}, {"id": 613, "seek": 300636, "start": 3014.92, "end": 3023.08, "text": " like a 1.5X multiple on their OpenAI investment so far, which seems pretty low to me, but I mean,", "tokens": [50792, 411, 257, 502, 13, 20, 55, 3866, 322, 641, 7238, 48698, 6078, 370, 1400, 11, 597, 2544, 1238, 2295, 281, 385, 11, 457, 286, 914, 11, 51200], "temperature": 0.0, "avg_logprob": -0.14802916687313872, "compression_ratio": 1.5122950819672132, "no_speech_prob": 0.003536602482199669}, {"id": 614, "seek": 300636, "start": 3023.08, "end": 3029.8, "text": " yeah, we'll see what the valuation looks like going forward. Next story of funding, we haven't had", "tokens": [51200, 1338, 11, 321, 603, 536, 437, 264, 38546, 1542, 411, 516, 2128, 13, 3087, 1657, 295, 6137, 11, 321, 2378, 380, 632, 51536], "temperature": 0.0, "avg_logprob": -0.14802916687313872, "compression_ratio": 1.5122950819672132, "no_speech_prob": 0.003536602482199669}, {"id": 615, "seek": 302980, "start": 3029.8, "end": 3037.1600000000003, "text": " a billion dollar valuation this episode yet. So, Granola has raised $125 million in their", "tokens": [50364, 257, 5218, 7241, 38546, 341, 3500, 1939, 13, 407, 11, 23554, 4711, 575, 6005, 1848, 48804, 2459, 294, 641, 50732], "temperature": 0.0, "avg_logprob": -0.17315014203389487, "compression_ratio": 1.5062240663900415, "no_speech_prob": 0.048051267862319946}, {"id": 616, "seek": 302980, "start": 3037.1600000000003, "end": 3044.6000000000004, "text": " CVC round and now have a valuation of 1.5 billion. Granola is perhaps the market leader in AI", "tokens": [50732, 22995, 34, 3098, 293, 586, 362, 257, 38546, 295, 502, 13, 20, 5218, 13, 23554, 4711, 307, 4317, 264, 2142, 5263, 294, 7318, 51104], "temperature": 0.0, "avg_logprob": -0.17315014203389487, "compression_ratio": 1.5062240663900415, "no_speech_prob": 0.048051267862319946}, {"id": 617, "seek": 302980, "start": 3044.6000000000004, "end": 3049.96, "text": " note-taking that I'm aware of. You launch it as you have a meeting, it listens in and takes", "tokens": [51104, 3637, 12, 48625, 300, 286, 478, 3650, 295, 13, 509, 4025, 309, 382, 291, 362, 257, 3440, 11, 309, 35959, 294, 293, 2516, 51372], "temperature": 0.0, "avg_logprob": -0.17315014203389487, "compression_ratio": 1.5062240663900415, "no_speech_prob": 0.048051267862319946}, {"id": 618, "seek": 302980, "start": 3049.96, "end": 3056.36, "text": " notes and prescribes. Apparently, that revenue has grown by 250% over this quarter. So,", "tokens": [51372, 5570, 293, 1183, 1142, 6446, 13, 16755, 11, 300, 9324, 575, 7709, 538, 11650, 4, 670, 341, 6555, 13, 407, 11, 51692], "temperature": 0.0, "avg_logprob": -0.17315014203389487, "compression_ratio": 1.5062240663900415, "no_speech_prob": 0.048051267862319946}, {"id": 619, "seek": 305636, "start": 3057.0, "end": 3062.6, "text": " if you're in a business world, clearly AI note-taking is a massive, massive market and so far,", "tokens": [50396, 498, 291, 434, 294, 257, 1606, 1002, 11, 4448, 7318, 3637, 12, 48625, 307, 257, 5994, 11, 5994, 2142, 293, 370, 1400, 11, 50676], "temperature": 0.0, "avg_logprob": -0.21422523718613845, "compression_ratio": 1.5897435897435896, "no_speech_prob": 0.0377371571958065}, {"id": 620, "seek": 305636, "start": 3062.6, "end": 3070.6, "text": " Granola appears to be poised to perhaps take lead. We get so bored of these 3X, 3 months,", "tokens": [50676, 23554, 4711, 7038, 281, 312, 714, 2640, 281, 4317, 747, 1477, 13, 492, 483, 370, 13521, 295, 613, 805, 55, 11, 805, 2493, 11, 51076], "temperature": 0.0, "avg_logprob": -0.21422523718613845, "compression_ratio": 1.5897435897435896, "no_speech_prob": 0.0377371571958065}, {"id": 621, "seek": 305636, "start": 3070.6, "end": 3076.6, "text": " right before you run rate increases. I mean, come on, AI note-taking, that's not exciting, but", "tokens": [51076, 558, 949, 291, 1190, 3314, 8637, 13, 286, 914, 11, 808, 322, 11, 7318, 3637, 12, 48625, 11, 300, 311, 406, 4670, 11, 457, 51376], "temperature": 0.0, "avg_logprob": -0.21422523718613845, "compression_ratio": 1.5897435897435896, "no_speech_prob": 0.0377371571958065}, {"id": 622, "seek": 305636, "start": 3076.6, "end": 3082.28, "text": " it's a big deal, you know, that's where you print the money. And speaking of business deals,", "tokens": [51376, 309, 311, 257, 955, 2028, 11, 291, 458, 11, 300, 311, 689, 291, 4482, 264, 1460, 13, 400, 4124, 295, 1606, 11215, 11, 51660], "temperature": 0.0, "avg_logprob": -0.21422523718613845, "compression_ratio": 1.5897435897435896, "no_speech_prob": 0.0377371571958065}, {"id": 623, "seek": 308228, "start": 3082.36, "end": 3089.0800000000004, "text": " next up, Unphropic is acquiring Stealth Startup coefficient bio in a $400 million deal.", "tokens": [50368, 958, 493, 11, 1156, 950, 39173, 307, 37374, 3592, 1302, 6481, 1010, 17619, 12198, 294, 257, 1848, 13741, 2459, 2028, 13, 50704], "temperature": 0.0, "avg_logprob": -0.18191118673844772, "compression_ratio": 1.5060728744939271, "no_speech_prob": 0.016372378915548325}, {"id": 624, "seek": 308228, "start": 3089.7200000000003, "end": 3097.0, "text": " This is a pretty small young startup only founded eight months ago, had fewer than 10 employees,", "tokens": [50736, 639, 307, 257, 1238, 1359, 2037, 18578, 787, 13234, 3180, 2493, 2057, 11, 632, 13366, 813, 1266, 6619, 11, 51100], "temperature": 0.0, "avg_logprob": -0.18191118673844772, "compression_ratio": 1.5060728744939271, "no_speech_prob": 0.016372378915548325}, {"id": 625, "seek": 308228, "start": 3097.0, "end": 3103.7200000000003, "text": " almost all of them from computational biology research backgrounds. So, interesting, I wasn't", "tokens": [51100, 1920, 439, 295, 552, 490, 28270, 14956, 2132, 17336, 13, 407, 11, 1880, 11, 286, 2067, 380, 51436], "temperature": 0.0, "avg_logprob": -0.18191118673844772, "compression_ratio": 1.5060728744939271, "no_speech_prob": 0.016372378915548325}, {"id": 626, "seek": 308228, "start": 3103.7200000000003, "end": 3110.84, "text": " even aware that Unphropic has a healthcare life sciences team, but it does, and it looks like", "tokens": [51436, 754, 3650, 300, 1156, 950, 39173, 575, 257, 8884, 993, 17677, 1469, 11, 457, 309, 775, 11, 293, 309, 1542, 411, 51792], "temperature": 0.0, "avg_logprob": -0.18191118673844772, "compression_ratio": 1.5060728744939271, "no_speech_prob": 0.016372378915548325}, {"id": 627, "seek": 311084, "start": 3110.84, "end": 3116.6800000000003, "text": " Unphropic is acquiring more people to join that team. Yeah, I mean, Dario comes from a, I think,", "tokens": [50364, 1156, 950, 39173, 307, 37374, 544, 561, 281, 3917, 300, 1469, 13, 865, 11, 286, 914, 11, 413, 4912, 1487, 490, 257, 11, 286, 519, 11, 50656], "temperature": 0.0, "avg_logprob": -0.1837446055281053, "compression_ratio": 1.583673469387755, "no_speech_prob": 0.0009109246893785894}, {"id": 628, "seek": 311084, "start": 3116.6800000000003, "end": 3123.96, "text": " biophysics background, right, or biochemistry background, but yeah, I mean, look, $400 million is a lot", "tokens": [50656, 3228, 5317, 41732, 3678, 11, 558, 11, 420, 12198, 48353, 3678, 11, 457, 1338, 11, 286, 914, 11, 574, 11, 1848, 13741, 2459, 307, 257, 688, 51020], "temperature": 0.0, "avg_logprob": -0.1837446055281053, "compression_ratio": 1.583673469387755, "no_speech_prob": 0.0009109246893785894}, {"id": 629, "seek": 311084, "start": 3123.96, "end": 3130.76, "text": " for nine people. So, that's quite a big thing, but it definitely does imply that there's this,", "tokens": [51020, 337, 4949, 561, 13, 407, 11, 300, 311, 1596, 257, 955, 551, 11, 457, 309, 2138, 775, 33616, 300, 456, 311, 341, 11, 51360], "temperature": 0.0, "avg_logprob": -0.1837446055281053, "compression_ratio": 1.583673469387755, "no_speech_prob": 0.0009109246893785894}, {"id": 630, "seek": 311084, "start": 3130.76, "end": 3136.36, "text": " you know, big shift in emphasis or kind of doubling down on the biotech angle. Yeah, I mean,", "tokens": [51360, 291, 458, 11, 955, 5513, 294, 16271, 420, 733, 295, 33651, 760, 322, 264, 3228, 1370, 339, 5802, 13, 865, 11, 286, 914, 11, 51640], "temperature": 0.0, "avg_logprob": -0.1837446055281053, "compression_ratio": 1.583673469387755, "no_speech_prob": 0.0009109246893785894}, {"id": 631, "seek": 313636, "start": 3136.44, "end": 3142.84, "text": " the VC math, by the way, for this is like ridiculously good. So, there's like this New York-based VC", "tokens": [50368, 264, 41922, 5221, 11, 538, 264, 636, 11, 337, 341, 307, 411, 41358, 665, 13, 407, 11, 456, 311, 411, 341, 1873, 3609, 12, 6032, 41922, 50688], "temperature": 0.0, "avg_logprob": -0.20079978438448315, "compression_ratio": 1.5506756756756757, "no_speech_prob": 0.0057281190529465675}, {"id": 632, "seek": 313636, "start": 3142.84, "end": 3147.56, "text": " firm called Dimension that owned like half the company. And so, they're going to make", "tokens": [50688, 6174, 1219, 20975, 3378, 300, 11684, 411, 1922, 264, 2237, 13, 400, 370, 11, 436, 434, 516, 281, 652, 50924], "temperature": 0.0, "avg_logprob": -0.20079978438448315, "compression_ratio": 1.5506756756756757, "no_speech_prob": 0.0057281190529465675}, {"id": 633, "seek": 313636, "start": 3148.1200000000003, "end": 3154.44, "text": " it's actually 40,000 percent IRR on the investments. That's pretty decent. And that's just pretty", "tokens": [50952, 309, 311, 767, 3356, 11, 1360, 3043, 16486, 49, 322, 264, 13784, 13, 663, 311, 1238, 8681, 13, 400, 300, 311, 445, 1238, 51268], "temperature": 0.0, "avg_logprob": -0.20079978438448315, "compression_ratio": 1.5506756756756757, "no_speech_prob": 0.0057281190529465675}, {"id": 634, "seek": 313636, "start": 3154.44, "end": 3159.08, "text": " wild indication of how fast AI is blazing through the bio-medical field right now. But,", "tokens": [51268, 4868, 18877, 295, 577, 2370, 7318, 307, 16379, 8781, 807, 264, 12198, 12, 1912, 804, 2519, 558, 586, 13, 583, 11, 51500], "temperature": 0.0, "avg_logprob": -0.20079978438448315, "compression_ratio": 1.5506756756756757, "no_speech_prob": 0.0057281190529465675}, {"id": 635, "seek": 313636, "start": 3159.08, "end": 3163.4, "text": " anyway, curious, I wonder if this tied as well, the concerns too, over where where the", "tokens": [51500, 4033, 11, 6369, 11, 286, 2441, 498, 341, 9601, 382, 731, 11, 264, 7389, 886, 11, 670, 689, 689, 264, 51716], "temperature": 0.0, "avg_logprob": -0.20079978438448315, "compression_ratio": 1.5506756756756757, "no_speech_prob": 0.0057281190529465675}, {"id": 636, "seek": 316340, "start": 3163.4, "end": 3168.04, "text": " biocide might go, you know, on the safety, safety dimension as well, but we'll see, especially with", "tokens": [50364, 3228, 27791, 1062, 352, 11, 291, 458, 11, 322, 264, 4514, 11, 4514, 10139, 382, 731, 11, 457, 321, 603, 536, 11, 2318, 365, 50596], "temperature": 0.0, "avg_logprob": -0.24700611712885837, "compression_ratio": 1.4923664122137406, "no_speech_prob": 0.0017262265319004655}, {"id": 637, "seek": 316340, "start": 3168.04, "end": 3176.2000000000003, "text": " Methos. Yeah, I a bit more background, proper data amounts, CLAWD for life sciences initiative", "tokens": [50596, 376, 3293, 329, 13, 865, 11, 286, 257, 857, 544, 3678, 11, 2296, 1412, 11663, 11, 383, 11435, 54, 35, 337, 993, 17677, 11552, 51004], "temperature": 0.0, "avg_logprob": -0.24700611712885837, "compression_ratio": 1.4923664122137406, "no_speech_prob": 0.0017262265319004655}, {"id": 638, "seek": 316340, "start": 3176.2000000000003, "end": 3183.0, "text": " back in October of 2025, earlier this year, just in January, they launched CLAWD for health care,", "tokens": [51004, 646, 294, 7617, 295, 39209, 11, 3071, 341, 1064, 11, 445, 294, 7061, 11, 436, 8730, 383, 11435, 54, 35, 337, 1585, 1127, 11, 51344], "temperature": 0.0, "avg_logprob": -0.24700611712885837, "compression_ratio": 1.4923664122137406, "no_speech_prob": 0.0017262265319004655}, {"id": 639, "seek": 316340, "start": 3183.0, "end": 3189.32, "text": " which is more for healthcare providers. So, you could read this Iver as going deeper into research", "tokens": [51344, 597, 307, 544, 337, 8884, 11330, 13, 407, 11, 291, 727, 1401, 341, 286, 331, 382, 516, 7731, 666, 2132, 51660], "temperature": 0.0, "avg_logprob": -0.24700611712885837, "compression_ratio": 1.4923664122137406, "no_speech_prob": 0.0017262265319004655}, {"id": 640, "seek": 318932, "start": 3190.2000000000003, "end": 3196.04, "text": " on, you know, the biocide or as them angling from the healthcare market, which presumably is a", "tokens": [50408, 322, 11, 291, 458, 11, 264, 3228, 27791, 420, 382, 552, 2562, 1688, 490, 264, 8884, 2142, 11, 597, 26742, 307, 257, 50700], "temperature": 0.0, "avg_logprob": -0.2818771563078228, "compression_ratio": 1.4641509433962263, "no_speech_prob": 0.014030794613063335}, {"id": 641, "seek": 318932, "start": 3196.04, "end": 3201.56, "text": " very, very big lucrative opportunity if they can actually be hip-hop applied and all these kind of", "tokens": [50700, 588, 11, 588, 955, 21296, 30457, 2650, 498, 436, 393, 767, 312, 8103, 12, 9050, 6456, 293, 439, 613, 733, 295, 50976], "temperature": 0.0, "avg_logprob": -0.2818771563078228, "compression_ratio": 1.4641509433962263, "no_speech_prob": 0.014030794613063335}, {"id": 642, "seek": 318932, "start": 3201.56, "end": 3208.28, "text": " considerations. Last story. And this is really just an odd one I wanted to throw in because it's a", "tokens": [50976, 24070, 13, 5264, 1657, 13, 400, 341, 307, 534, 445, 364, 7401, 472, 286, 1415, 281, 3507, 294, 570, 309, 311, 257, 51312], "temperature": 0.0, "avg_logprob": -0.2818771563078228, "compression_ratio": 1.4641509433962263, "no_speech_prob": 0.014030794613063335}, {"id": 643, "seek": 318932, "start": 3208.28, "end": 3215.0800000000004, "text": " bizarre business development. Opening AI has acquired the TBPN, the Budley Founder-led business", "tokens": [51312, 18265, 1606, 3250, 13, 41137, 7318, 575, 17554, 264, 29711, 15466, 11, 264, 6384, 3420, 8207, 260, 12, 1493, 1606, 51652], "temperature": 0.0, "avg_logprob": -0.2818771563078228, "compression_ratio": 1.4641509433962263, "no_speech_prob": 0.014030794613063335}, {"id": 644, "seek": 321508, "start": 3215.56, "end": 3220.2799999999997, "text": " talk show. So, if you are Twitter and you're in the AI world, the tech world, you may have seen", "tokens": [50388, 751, 855, 13, 407, 11, 498, 291, 366, 5794, 293, 291, 434, 294, 264, 7318, 1002, 11, 264, 7553, 1002, 11, 291, 815, 362, 1612, 50624], "temperature": 0.0, "avg_logprob": -0.27481103450693983, "compression_ratio": 1.6103896103896105, "no_speech_prob": 0.0355292372405529}, {"id": 645, "seek": 321508, "start": 3220.2799999999997, "end": 3227.16, "text": " the technology business programming network, which is a daily, free hour live talk show, where they", "tokens": [50624, 264, 2899, 1606, 9410, 3209, 11, 597, 307, 257, 5212, 11, 1737, 1773, 1621, 751, 855, 11, 689, 436, 50968], "temperature": 0.0, "avg_logprob": -0.27481103450693983, "compression_ratio": 1.6103896103896105, "no_speech_prob": 0.0355292372405529}, {"id": 646, "seek": 321508, "start": 3227.16, "end": 3234.36, "text": " have a lot of tech leaders and a lot of like a little bit of an antics vibe, discussion, news.", "tokens": [50968, 362, 257, 688, 295, 7553, 3523, 293, 257, 688, 295, 411, 257, 707, 857, 295, 364, 2511, 1167, 14606, 11, 5017, 11, 2583, 13, 51328], "temperature": 0.0, "avg_logprob": -0.27481103450693983, "compression_ratio": 1.6103896103896105, "no_speech_prob": 0.0355292372405529}, {"id": 647, "seek": 321508, "start": 3235.4, "end": 3240.68, "text": " Opening I acquired them, acquired like a podcast essentially. I don't understand.", "tokens": [51380, 41137, 286, 17554, 552, 11, 17554, 411, 257, 7367, 4476, 13, 286, 500, 380, 1223, 13, 51644], "temperature": 0.0, "avg_logprob": -0.27481103450693983, "compression_ratio": 1.6103896103896105, "no_speech_prob": 0.0355292372405529}, {"id": 648, "seek": 324068, "start": 3240.7599999999998, "end": 3245.96, "text": " Million, right? I think my understanding was it was like an eight-figure acquisition.", "tokens": [50368, 7190, 313, 11, 558, 30, 286, 519, 452, 3701, 390, 309, 390, 411, 364, 3180, 12, 20646, 540, 21668, 13, 50628], "temperature": 0.0, "avg_logprob": -0.2779062727223272, "compression_ratio": 1.5294117647058822, "no_speech_prob": 0.018517469987273216}, {"id": 649, "seek": 324068, "start": 3246.52, "end": 3253.7999999999997, "text": " Yeah, I don't actually know the numbers in this new story, but yeah, obviously people are like,", "tokens": [50656, 865, 11, 286, 500, 380, 767, 458, 264, 3547, 294, 341, 777, 1657, 11, 457, 1338, 11, 2745, 561, 366, 411, 11, 51020], "temperature": 0.0, "avg_logprob": -0.2779062727223272, "compression_ratio": 1.5294117647058822, "no_speech_prob": 0.018517469987273216}, {"id": 650, "seek": 324068, "start": 3253.7999999999997, "end": 3260.04, "text": " well, so much for them covering opening AI, fair-ging, or objectively, they were like,", "tokens": [51020, 731, 11, 370, 709, 337, 552, 10322, 5193, 7318, 11, 3143, 12, 3249, 11, 420, 46067, 11, 436, 645, 411, 11, 51332], "temperature": 0.0, "avg_logprob": -0.2779062727223272, "compression_ratio": 1.5294117647058822, "no_speech_prob": 0.018517469987273216}, {"id": 651, "seek": 324068, "start": 3260.04, "end": 3266.52, "text": " oh, our editorial independence will remain, you know, whatever, obviously no one believes that.", "tokens": [51332, 1954, 11, 527, 33412, 14640, 486, 6222, 11, 291, 458, 11, 2035, 11, 2745, 572, 472, 12307, 300, 13, 51656], "temperature": 0.0, "avg_logprob": -0.2779062727223272, "compression_ratio": 1.5294117647058822, "no_speech_prob": 0.018517469987273216}, {"id": 652, "seek": 326652, "start": 3266.6, "end": 3274.2, "text": " So, I don't know if opening AI is just like angry about all the PR, nightmares, things", "tokens": [50368, 407, 11, 286, 500, 380, 458, 498, 5193, 7318, 307, 445, 411, 6884, 466, 439, 264, 11568, 11, 36911, 11, 721, 50748], "temperature": 0.0, "avg_logprob": -0.18448230832122092, "compression_ratio": 1.6182432432432432, "no_speech_prob": 0.0013248467585071921}, {"id": 653, "seek": 326652, "start": 3274.2, "end": 3279.72, "text": " they keep getting into or what, but it's, I've seen some really bullish analysis on this too. I", "tokens": [50748, 436, 1066, 1242, 666, 420, 437, 11, 457, 309, 311, 11, 286, 600, 1612, 512, 534, 38692, 5215, 322, 341, 886, 13, 286, 51024], "temperature": 0.0, "avg_logprob": -0.18448230832122092, "compression_ratio": 1.6182432432432432, "no_speech_prob": 0.0013248467585071921}, {"id": 654, "seek": 326652, "start": 3279.72, "end": 3283.8, "text": " guess I struggle to see it a little bit just because I mean, it's really see it for TBPN, it's", "tokens": [51024, 2041, 286, 7799, 281, 536, 309, 257, 707, 857, 445, 570, 286, 914, 11, 309, 311, 534, 536, 309, 337, 29711, 15466, 11, 309, 311, 51228], "temperature": 0.0, "avg_logprob": -0.18448230832122092, "compression_ratio": 1.6182432432432432, "no_speech_prob": 0.0013248467585071921}, {"id": 655, "seek": 326652, "start": 3283.8, "end": 3289.64, "text": " just a lot of money. Okay, cool. But the challenge is if you're going to start to make acquisitions to", "tokens": [51228, 445, 257, 688, 295, 1460, 13, 1033, 11, 1627, 13, 583, 264, 3430, 307, 498, 291, 434, 516, 281, 722, 281, 652, 17883, 2451, 281, 51520], "temperature": 0.0, "avg_logprob": -0.18448230832122092, "compression_ratio": 1.6182432432432432, "no_speech_prob": 0.0013248467585071921}, {"id": 656, "seek": 326652, "start": 3289.64, "end": 3295.64, "text": " kind of turn public opinion ahead of an IPO, it's not obvious to me that TBPN is your acquisition,", "tokens": [51520, 733, 295, 1261, 1908, 4800, 2286, 295, 364, 50220, 11, 309, 311, 406, 6322, 281, 385, 300, 29711, 15466, 307, 428, 21668, 11, 51820], "temperature": 0.0, "avg_logprob": -0.18448230832122092, "compression_ratio": 1.6182432432432432, "no_speech_prob": 0.0013248467585071921}, {"id": 657, "seek": 329564, "start": 3296.12, "end": 3300.3599999999997, "text": " I'm an idiot and I'm like, by the way, I'm so far to my depth and the quality of people who will", "tokens": [50388, 286, 478, 364, 14270, 293, 286, 478, 411, 11, 538, 264, 636, 11, 286, 478, 370, 1400, 281, 452, 7161, 293, 264, 3125, 295, 561, 567, 486, 50600], "temperature": 0.0, "avg_logprob": -0.18932770164149582, "compression_ratio": 1.7987616099071206, "no_speech_prob": 0.01016752514988184}, {"id": 658, "seek": 329564, "start": 3300.3599999999997, "end": 3304.3599999999997, "text": " have waited on this acquisition, unless they just came in and kibosh the whole thing and said,", "tokens": [50600, 362, 15240, 322, 341, 21668, 11, 5969, 436, 445, 1361, 294, 293, 350, 897, 3019, 264, 1379, 551, 293, 848, 11, 50800], "temperature": 0.0, "avg_logprob": -0.18932770164149582, "compression_ratio": 1.7987616099071206, "no_speech_prob": 0.01016752514988184}, {"id": 659, "seek": 329564, "start": 3304.6, "end": 3309.08, "text": " I just really want this, which I suspect didn't happen here, but the quality of people they will", "tokens": [50812, 286, 445, 534, 528, 341, 11, 597, 286, 9091, 994, 380, 1051, 510, 11, 457, 264, 3125, 295, 561, 436, 486, 51036], "temperature": 0.0, "avg_logprob": -0.18932770164149582, "compression_ratio": 1.7987616099071206, "no_speech_prob": 0.01016752514988184}, {"id": 660, "seek": 329564, "start": 3309.08, "end": 3313.72, "text": " have had looking at this, like, Chris Lahane, like these dudes know what's up. If they did this,", "tokens": [51036, 362, 632, 1237, 412, 341, 11, 411, 11, 6688, 45862, 1929, 11, 411, 613, 27717, 458, 437, 311, 493, 13, 759, 436, 630, 341, 11, 51268], "temperature": 0.0, "avg_logprob": -0.18932770164149582, "compression_ratio": 1.7987616099071206, "no_speech_prob": 0.01016752514988184}, {"id": 661, "seek": 329564, "start": 3313.72, "end": 3318.7599999999998, "text": " they have a plan. I just don't see it. That's it. I mean, like, ultimately, these are techies,", "tokens": [51268, 436, 362, 257, 1393, 13, 286, 445, 500, 380, 536, 309, 13, 663, 311, 309, 13, 286, 914, 11, 411, 11, 6284, 11, 613, 366, 7553, 530, 11, 51520], "temperature": 0.0, "avg_logprob": -0.18932770164149582, "compression_ratio": 1.7987616099071206, "no_speech_prob": 0.01016752514988184}, {"id": 662, "seek": 329564, "start": 3318.7599999999998, "end": 3324.52, "text": " talking other techies could be a recruitment play. Ultimately, I'm not going to be putting that much", "tokens": [51520, 1417, 661, 7553, 530, 727, 312, 257, 28240, 862, 13, 23921, 11, 286, 478, 406, 516, 281, 312, 3372, 300, 709, 51808], "temperature": 0.0, "avg_logprob": -0.18932770164149582, "compression_ratio": 1.7987616099071206, "no_speech_prob": 0.01016752514988184}, {"id": 663, "seek": 332452, "start": 3324.52, "end": 3329.56, "text": " stock in like the kind of reporting that I like, why would anybody, you're an opening eye mouthpiece", "tokens": [50364, 4127, 294, 411, 264, 733, 295, 10031, 300, 286, 411, 11, 983, 576, 4472, 11, 291, 434, 364, 5193, 3313, 4525, 15281, 50616], "temperature": 0.0, "avg_logprob": -0.1937999412661693, "compression_ratio": 1.6904761904761905, "no_speech_prob": 0.004006827715784311}, {"id": 664, "seek": 332452, "start": 3329.56, "end": 3334.92, "text": " now, which is fine. But the point of the show was certainly to kind of offer a broader perspective.", "tokens": [50616, 586, 11, 597, 307, 2489, 13, 583, 264, 935, 295, 264, 855, 390, 3297, 281, 733, 295, 2626, 257, 13227, 4585, 13, 50884], "temperature": 0.0, "avg_logprob": -0.1937999412661693, "compression_ratio": 1.6904761904761905, "no_speech_prob": 0.004006827715784311}, {"id": 665, "seek": 332452, "start": 3334.92, "end": 3339.0, "text": " It's worth noting it was a positive show to begin with, right? It's not like they were ripping on", "tokens": [50884, 467, 311, 3163, 26801, 309, 390, 257, 3353, 855, 281, 1841, 365, 11, 558, 30, 467, 311, 406, 411, 436, 645, 38776, 322, 51088], "temperature": 0.0, "avg_logprob": -0.1937999412661693, "compression_ratio": 1.6904761904761905, "no_speech_prob": 0.004006827715784311}, {"id": 666, "seek": 332452, "start": 3339.0, "end": 3345.16, "text": " opening eye, pro tech broadly speaking anyway. Right. So the editorial line wouldn't even have to", "tokens": [51088, 5193, 3313, 11, 447, 7553, 19511, 4124, 4033, 13, 1779, 13, 407, 264, 33412, 1622, 2759, 380, 754, 362, 281, 51396], "temperature": 0.0, "avg_logprob": -0.1937999412661693, "compression_ratio": 1.6904761904761905, "no_speech_prob": 0.004006827715784311}, {"id": 667, "seek": 332452, "start": 3345.16, "end": 3350.84, "text": " change for Sam to not a lot. And so it's plausible that nothing will change. But if nothing changes,", "tokens": [51396, 1319, 337, 4832, 281, 406, 257, 688, 13, 400, 370, 309, 311, 39925, 300, 1825, 486, 1319, 13, 583, 498, 1825, 2962, 11, 51680], "temperature": 0.0, "avg_logprob": -0.1937999412661693, "compression_ratio": 1.6904761904761905, "no_speech_prob": 0.004006827715784311}, {"id": 668, "seek": 335084, "start": 3350.84, "end": 3354.6800000000003, "text": " then I'm wondering what's in it for opening eye of the acquisition. So anyway, there's there's", "tokens": [50364, 550, 286, 478, 6359, 437, 311, 294, 309, 337, 5193, 3313, 295, 264, 21668, 13, 407, 4033, 11, 456, 311, 456, 311, 50556], "temperature": 0.0, "avg_logprob": -0.27516592822028596, "compression_ratio": 1.4732510288065843, "no_speech_prob": 0.012423688545823097}, {"id": 669, "seek": 335084, "start": 3354.6800000000003, "end": 3361.08, "text": " got to be some quid pro quo. I just it's about my favorite. It's a weird move is my take away. Like,", "tokens": [50556, 658, 281, 312, 512, 421, 327, 447, 28425, 13, 286, 445, 309, 311, 466, 452, 2954, 13, 467, 311, 257, 3657, 1286, 307, 452, 747, 1314, 13, 1743, 11, 50876], "temperature": 0.0, "avg_logprob": -0.27516592822028596, "compression_ratio": 1.4732510288065843, "no_speech_prob": 0.012423688545823097}, {"id": 670, "seek": 335084, "start": 3361.08, "end": 3365.32, "text": " why? Who? Yes, the DPPN people benefit. Why does opening eye needless?", "tokens": [50876, 983, 30, 2102, 30, 1079, 11, 264, 413, 17755, 45, 561, 5121, 13, 1545, 775, 5193, 3313, 643, 1832, 30, 51088], "temperature": 0.0, "avg_logprob": -0.27516592822028596, "compression_ratio": 1.4732510288065843, "no_speech_prob": 0.012423688545823097}, {"id": 671, "seek": 335084, "start": 3367.48, "end": 3374.2000000000003, "text": " Onto projects and open source. We've got a couple notable advancements here. First z.ai has", "tokens": [51196, 16980, 78, 4455, 293, 1269, 4009, 13, 492, 600, 658, 257, 1916, 22556, 7295, 1117, 510, 13, 2386, 710, 13, 1301, 575, 51532], "temperature": 0.0, "avg_logprob": -0.27516592822028596, "compression_ratio": 1.4732510288065843, "no_speech_prob": 0.012423688545823097}, {"id": 672, "seek": 337420, "start": 3374.2, "end": 3384.2, "text": " released GLM 5.1 a 754 billion parameter. Make sure of experts model completely available. Open", "tokens": [50364, 4736, 16225, 44, 1025, 13, 16, 257, 9562, 19, 5218, 13075, 13, 4387, 988, 295, 8572, 2316, 2584, 2435, 13, 7238, 50864], "temperature": 0.0, "avg_logprob": -0.2767910512288411, "compression_ratio": 1.3744075829383886, "no_speech_prob": 0.022946855053305626}, {"id": 673, "seek": 337420, "start": 3384.2, "end": 3396.2799999999997, "text": " weight under the MIT license and also via their PI. And on the SB bench pro benchmark, they claim kind", "tokens": [50864, 3364, 833, 264, 13100, 10476, 293, 611, 5766, 641, 27176, 13, 400, 322, 264, 26944, 10638, 447, 18927, 11, 436, 3932, 733, 51468], "temperature": 0.0, "avg_logprob": -0.2767910512288411, "compression_ratio": 1.3744075829383886, "no_speech_prob": 0.022946855053305626}, {"id": 674, "seek": 337420, "start": 3396.2799999999997, "end": 3402.7599999999998, "text": " of very, very solid performance, perhaps even doing better than GP 5.4 and Opus 4.6 and all", "tokens": [51468, 295, 588, 11, 588, 5100, 3389, 11, 4317, 754, 884, 1101, 813, 26039, 1025, 13, 19, 293, 12011, 301, 1017, 13, 21, 293, 439, 51792], "temperature": 0.0, "avg_logprob": -0.2767910512288411, "compression_ratio": 1.3744075829383886, "no_speech_prob": 0.022946855053305626}, {"id": 675, "seek": 340276, "start": 3403.7200000000003, "end": 3411.32, "text": " leading models. So yeah, another very, very strong open source completely open weight model", "tokens": [50412, 5775, 5245, 13, 407, 1338, 11, 1071, 588, 11, 588, 2068, 1269, 4009, 2584, 1269, 3364, 2316, 50792], "temperature": 0.0, "avg_logprob": -0.1592527117047991, "compression_ratio": 1.4818652849740932, "no_speech_prob": 0.006285720970481634}, {"id": 676, "seek": 340276, "start": 3411.32, "end": 3419.7200000000003, "text": " out there. Now quite a big one at 454 billion parameters. They highlight specifically long task", "tokens": [50792, 484, 456, 13, 823, 1596, 257, 955, 472, 412, 6905, 19, 5218, 9834, 13, 814, 5078, 4682, 938, 5633, 51212], "temperature": 0.0, "avg_logprob": -0.1592527117047991, "compression_ratio": 1.4818652849740932, "no_speech_prob": 0.006285720970481634}, {"id": 677, "seek": 340276, "start": 3419.7200000000003, "end": 3426.5200000000004, "text": " execution. So they talk about being able autonomous execution for up to eight hours. And they have", "tokens": [51212, 15058, 13, 407, 436, 751, 466, 885, 1075, 23797, 15058, 337, 493, 281, 3180, 2496, 13, 400, 436, 362, 51552], "temperature": 0.0, "avg_logprob": -0.1592527117047991, "compression_ratio": 1.4818652849740932, "no_speech_prob": 0.006285720970481634}, {"id": 678, "seek": 342652, "start": 3426.52, "end": 3433.64, "text": " some demonstrations of capabilities like doing a vector database tasks to improve performance,", "tokens": [50364, 512, 34714, 295, 10862, 411, 884, 257, 8062, 8149, 9608, 281, 3470, 3389, 11, 50720], "temperature": 0.0, "avg_logprob": -0.3910143310959275, "compression_ratio": 1.6163793103448276, "no_speech_prob": 0.006092308554798365}, {"id": 679, "seek": 342652, "start": 3433.64, "end": 3440.92, "text": " optimizing critical kernel basically vibes. This is like another move towards autonomous", "tokens": [50720, 40425, 4924, 28256, 1936, 27636, 13, 639, 307, 411, 1071, 1286, 3030, 23797, 51084], "temperature": 0.0, "avg_logprob": -0.3910143310959275, "compression_ratio": 1.6163793103448276, "no_speech_prob": 0.006092308554798365}, {"id": 680, "seek": 342652, "start": 3440.92, "end": 3446.84, "text": " agent execution in line with what on froth. It hasn't been demonstrating on opening. I have been", "tokens": [51084, 9461, 15058, 294, 1622, 365, 437, 322, 431, 900, 13, 467, 6132, 380, 668, 29889, 322, 5193, 13, 286, 362, 668, 51380], "temperature": 0.0, "avg_logprob": -0.3910143310959275, "compression_ratio": 1.6163793103448276, "no_speech_prob": 0.006092308554798365}, {"id": 681, "seek": 342652, "start": 3446.84, "end": 3452.28, "text": " demonstrating with their cutting edge models. But these are fully agent things very capable of", "tokens": [51380, 29889, 365, 641, 6492, 4691, 5245, 13, 583, 613, 366, 4498, 9461, 721, 588, 8189, 295, 51652], "temperature": 0.0, "avg_logprob": -0.3910143310959275, "compression_ratio": 1.6163793103448276, "no_speech_prob": 0.006092308554798365}, {"id": 682, "seek": 345228, "start": 3452.28, "end": 3458.44, "text": " coding and very capable of achieving things fully independently without human support.", "tokens": [50364, 17720, 293, 588, 8189, 295, 19626, 721, 4498, 21761, 1553, 1952, 1406, 13, 50672], "temperature": 0.0, "avg_logprob": -0.18810754730587914, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.009996472857892513}, {"id": 683, "seek": 345228, "start": 3459.1600000000003, "end": 3465.0, "text": " Yeah. So just is seemingly GLM 5 already very impressive. This is a little incremental. Like if", "tokens": [50708, 865, 13, 407, 445, 307, 18709, 16225, 44, 1025, 1217, 588, 8992, 13, 639, 307, 257, 707, 35759, 13, 1743, 498, 51000], "temperature": 0.0, "avg_logprob": -0.18810754730587914, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.009996472857892513}, {"id": 684, "seek": 345228, "start": 3465.0, "end": 3473.88, "text": " you look at the benchmarks, it's a jump on benchmarks that is giving you like a 5 10% boost. But", "tokens": [51000, 291, 574, 412, 264, 43751, 11, 309, 311, 257, 3012, 322, 43751, 300, 307, 2902, 291, 411, 257, 1025, 1266, 4, 9194, 13, 583, 51444], "temperature": 0.0, "avg_logprob": -0.18810754730587914, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.009996472857892513}, {"id": 685, "seek": 345228, "start": 3473.88, "end": 3479.96, "text": " altogether it points to where continue to train and continue to get advancements beyond what", "tokens": [51444, 19051, 309, 2793, 281, 689, 2354, 281, 3847, 293, 2354, 281, 483, 7295, 1117, 4399, 437, 51748], "temperature": 0.0, "avg_logprob": -0.18810754730587914, "compression_ratio": 1.5121951219512195, "no_speech_prob": 0.009996472857892513}, {"id": 686, "seek": 347996, "start": 3479.96, "end": 3485.2400000000002, "text": " they already had. And GLM is a very, very powerful model. And it's all like kind of built on", "tokens": [50364, 436, 1217, 632, 13, 400, 16225, 44, 307, 257, 588, 11, 588, 4005, 2316, 13, 400, 309, 311, 439, 411, 733, 295, 3094, 322, 50628], "temperature": 0.0, "avg_logprob": -0.17674383569936283, "compression_ratio": 1.7048611111111112, "no_speech_prob": 0.0013454685686156154}, {"id": 687, "seek": 347996, "start": 3485.2400000000002, "end": 3489.2400000000002, "text": " something very similar to the deep seek stack. Right. So you can think this is like further validation", "tokens": [50628, 746, 588, 2531, 281, 264, 2452, 8075, 8630, 13, 1779, 13, 407, 291, 393, 519, 341, 307, 411, 3052, 24071, 50828], "temperature": 0.0, "avg_logprob": -0.17674383569936283, "compression_ratio": 1.7048611111111112, "no_speech_prob": 0.0013454685686156154}, {"id": 688, "seek": 347996, "start": 3489.2400000000002, "end": 3493.8, "text": " to the deep seek sparse attention approach, you know, all the kind of foundational pieces that", "tokens": [50828, 281, 264, 2452, 8075, 637, 11668, 3202, 3109, 11, 291, 458, 11, 439, 264, 733, 295, 32195, 3755, 300, 51056], "temperature": 0.0, "avg_logprob": -0.17674383569936283, "compression_ratio": 1.7048611111111112, "no_speech_prob": 0.0013454685686156154}, {"id": 689, "seek": 347996, "start": 3493.8, "end": 3499.64, "text": " they've been using that's you know, part of what this shows. And back to the US next we have Google", "tokens": [51056, 436, 600, 668, 1228, 300, 311, 291, 458, 11, 644, 295, 437, 341, 3110, 13, 400, 646, 281, 264, 2546, 958, 321, 362, 3329, 51348], "temperature": 0.0, "avg_logprob": -0.17674383569936283, "compression_ratio": 1.7048611111111112, "no_speech_prob": 0.0013454685686156154}, {"id": 690, "seek": 347996, "start": 3499.64, "end": 3507.7200000000003, "text": " announcing the Gemma for family of models. They have a few of them. So they have the effective to be", "tokens": [51348, 28706, 264, 22894, 1696, 337, 1605, 295, 5245, 13, 814, 362, 257, 1326, 295, 552, 13, 407, 436, 362, 264, 4942, 281, 312, 51752], "temperature": 0.0, "avg_logprob": -0.17674383569936283, "compression_ratio": 1.7048611111111112, "no_speech_prob": 0.0013454685686156154}, {"id": 691, "seek": 350772, "start": 3508.2799999999997, "end": 3514.2, "text": " effective for B. So these are tiny models that use Routh with your weights if you run on a single", "tokens": [50392, 4942, 337, 363, 13, 407, 613, 366, 5870, 5245, 300, 764, 497, 2178, 365, 428, 17443, 498, 291, 1190, 322, 257, 2167, 50688], "temperature": 0.0, "avg_logprob": -0.31348481339015316, "compression_ratio": 1.63135593220339, "no_speech_prob": 0.00857153907418251}, {"id": 692, "seek": 350772, "start": 3514.2, "end": 3522.12, "text": " which you also have a 26 billion mixture of experts model and a 31 billion dense model. This", "tokens": [50688, 597, 291, 611, 362, 257, 7551, 5218, 9925, 295, 8572, 2316, 293, 257, 10353, 5218, 18011, 2316, 13, 639, 51084], "temperature": 0.0, "avg_logprob": -0.31348481339015316, "compression_ratio": 1.63135593220339, "no_speech_prob": 0.00857153907418251}, {"id": 693, "seek": 350772, "start": 3522.12, "end": 3526.6, "text": " Gemma is the family of models that Google has developed for a while that has tenders to be on the", "tokens": [51084, 22894, 1696, 307, 264, 1605, 295, 5245, 300, 3329, 575, 4743, 337, 257, 1339, 300, 575, 3928, 433, 281, 312, 322, 264, 51308], "temperature": 0.0, "avg_logprob": -0.31348481339015316, "compression_ratio": 1.63135593220339, "no_speech_prob": 0.00857153907418251}, {"id": 694, "seek": 350772, "start": 3526.6, "end": 3533.3999999999996, "text": " smaller side 31 billion dense parameters is actually pretty large. They also released this under", "tokens": [51308, 4356, 1252, 10353, 5218, 18011, 9834, 307, 767, 1238, 2416, 13, 814, 611, 4736, 341, 833, 51648], "temperature": 0.0, "avg_logprob": -0.31348481339015316, "compression_ratio": 1.63135593220339, "no_speech_prob": 0.00857153907418251}, {"id": 695, "seek": 353340, "start": 3533.56, "end": 3539.7200000000003, "text": " Apache 2.0 license. They dropped their custom Gemma license which has various restrictions.", "tokens": [50372, 46597, 568, 13, 15, 10476, 13, 814, 8119, 641, 2375, 22894, 1696, 10476, 597, 575, 3683, 14191, 13, 50680], "temperature": 0.0, "avg_logprob": -0.1637007201590189, "compression_ratio": 1.608695652173913, "no_speech_prob": 0.004825454670935869}, {"id": 696, "seek": 353340, "start": 3540.28, "end": 3545.4, "text": " Apache 2.0 basically says you can do whatever you want as long as you acknowledge that you're", "tokens": [50708, 46597, 568, 13, 15, 1936, 1619, 291, 393, 360, 2035, 291, 528, 382, 938, 382, 291, 10692, 300, 291, 434, 50964], "temperature": 0.0, "avg_logprob": -0.1637007201590189, "compression_ratio": 1.608695652173913, "no_speech_prob": 0.004825454670935869}, {"id": 697, "seek": 353340, "start": 3545.4, "end": 3550.6800000000003, "text": " using this model. And it has some interesting. I don't want to get into technical details but", "tokens": [50964, 1228, 341, 2316, 13, 400, 309, 575, 512, 1880, 13, 286, 500, 380, 528, 281, 483, 666, 6191, 4365, 457, 51228], "temperature": 0.0, "avg_logprob": -0.1637007201590189, "compression_ratio": 1.608695652173913, "no_speech_prob": 0.004825454670935869}, {"id": 698, "seek": 353340, "start": 3550.6800000000003, "end": 3556.84, "text": " I've seen some analysis pointing to architecturally this making some interesting decisions", "tokens": [51228, 286, 600, 1612, 512, 5215, 12166, 281, 6331, 6512, 341, 1455, 512, 1880, 5327, 51536], "temperature": 0.0, "avg_logprob": -0.1637007201590189, "compression_ratio": 1.608695652173913, "no_speech_prob": 0.004825454670935869}, {"id": 699, "seek": 355684, "start": 3557.48, "end": 3565.08, "text": " with regards to how to set up a consumer etc. So if you look at your performance relative to the size", "tokens": [50396, 365, 14258, 281, 577, 281, 992, 493, 257, 9711, 5183, 13, 407, 498, 291, 574, 412, 428, 3389, 4972, 281, 264, 2744, 50776], "temperature": 0.0, "avg_logprob": -0.13951361456582712, "compression_ratio": 1.5661157024793388, "no_speech_prob": 0.01938389614224434}, {"id": 700, "seek": 355684, "start": 3565.88, "end": 3572.28, "text": " it seems to be doing quite an impressive job potentially because of these more like technical", "tokens": [50816, 309, 2544, 281, 312, 884, 1596, 364, 8992, 1691, 7263, 570, 295, 613, 544, 411, 6191, 51136], "temperature": 0.0, "avg_logprob": -0.13951361456582712, "compression_ratio": 1.5661157024793388, "no_speech_prob": 0.01938389614224434}, {"id": 701, "seek": 355684, "start": 3572.28, "end": 3577.8, "text": " minigree details. Yeah when the main philosophy here seems to be they're kind of saying like", "tokens": [51136, 923, 328, 701, 4365, 13, 865, 562, 264, 2135, 10675, 510, 2544, 281, 312, 436, 434, 733, 295, 1566, 411, 51412], "temperature": 0.0, "avg_logprob": -0.13951361456582712, "compression_ratio": 1.5661157024793388, "no_speech_prob": 0.01938389614224434}, {"id": 702, "seek": 355684, "start": 3578.6800000000003, "end": 3584.2000000000003, "text": " in previous versions of Gemma we had a whole bunch of really complex features that we were", "tokens": [51456, 294, 3894, 9606, 295, 22894, 1696, 321, 632, 257, 1379, 3840, 295, 534, 3997, 4122, 300, 321, 645, 51732], "temperature": 0.0, "avg_logprob": -0.13951361456582712, "compression_ratio": 1.5661157024793388, "no_speech_prob": 0.01938389614224434}, {"id": 703, "seek": 358420, "start": 3584.2, "end": 3589.16, "text": " baking into our architecture. And these include features like so one one that they've ripped out is", "tokens": [50364, 12102, 666, 527, 9482, 13, 400, 613, 4090, 4122, 411, 370, 472, 472, 300, 436, 600, 22780, 484, 307, 50612], "temperature": 0.0, "avg_logprob": -0.1452188050305402, "compression_ratio": 1.8549618320610688, "no_speech_prob": 0.005552619230002165}, {"id": 704, "seek": 358420, "start": 3589.16, "end": 3595.8799999999997, "text": " the thing called Altup where like you take a vector that comes into a layer of the model and well", "tokens": [50612, 264, 551, 1219, 15992, 1010, 689, 411, 291, 747, 257, 8062, 300, 1487, 666, 257, 4583, 295, 264, 2316, 293, 731, 50948], "temperature": 0.0, "avg_logprob": -0.1452188050305402, "compression_ratio": 1.8549618320610688, "no_speech_prob": 0.005552619230002165}, {"id": 705, "seek": 358420, "start": 3595.8799999999997, "end": 3601.3999999999996, "text": " the traditionally in a transformer every layer would chew on that that vector the residual stream", "tokens": [50948, 264, 19067, 294, 257, 31782, 633, 4583, 576, 21200, 322, 300, 300, 8062, 264, 27980, 4309, 51224], "temperature": 0.0, "avg_logprob": -0.1452188050305402, "compression_ratio": 1.8549618320610688, "no_speech_prob": 0.005552619230002165}, {"id": 706, "seek": 358420, "start": 3601.3999999999996, "end": 3607.08, "text": " and then spit out a new version of that whole vector what they do here is in an Altup they'll", "tokens": [51224, 293, 550, 22127, 484, 257, 777, 3037, 295, 300, 1379, 8062, 437, 436, 360, 510, 307, 294, 364, 15992, 1010, 436, 603, 51508], "temperature": 0.0, "avg_logprob": -0.1452188050305402, "compression_ratio": 1.8549618320610688, "no_speech_prob": 0.005552619230002165}, {"id": 707, "seek": 358420, "start": 3607.08, "end": 3612.2, "text": " like separate that vector into chunks and you know every every layer will only work on one chunk", "tokens": [51508, 411, 4994, 300, 8062, 666, 24004, 293, 291, 458, 633, 633, 4583, 486, 787, 589, 322, 472, 16635, 51764], "temperature": 0.0, "avg_logprob": -0.1452188050305402, "compression_ratio": 1.8549618320610688, "no_speech_prob": 0.005552619230002165}, {"id": 708, "seek": 361220, "start": 3612.2, "end": 3617.7999999999997, "text": " and the other part of the vector will proceed unimpeded. So that way the model kind of focuses more", "tokens": [50364, 293, 264, 661, 644, 295, 264, 8062, 486, 8991, 517, 332, 3452, 292, 13, 407, 300, 636, 264, 2316, 733, 295, 16109, 544, 50644], "temperature": 0.0, "avg_logprob": -0.10613135109960506, "compression_ratio": 1.7535714285714286, "no_speech_prob": 0.0005192525568418205}, {"id": 709, "seek": 361220, "start": 3617.7999999999997, "end": 3622.6, "text": " on one part of the representation than another at any given layer and lets you kind of make deeper", "tokens": [50644, 322, 472, 644, 295, 264, 10290, 813, 1071, 412, 604, 2212, 4583, 293, 6653, 291, 733, 295, 652, 7731, 50884], "temperature": 0.0, "avg_logprob": -0.10613135109960506, "compression_ratio": 1.7535714285714286, "no_speech_prob": 0.0005192525568418205}, {"id": 710, "seek": 361220, "start": 3622.6, "end": 3627.24, "text": " transformers than you otherwise would be able to. So they're throwing that out basically they just", "tokens": [50884, 4088, 433, 813, 291, 5911, 576, 312, 1075, 281, 13, 407, 436, 434, 10238, 300, 484, 1936, 436, 445, 51116], "temperature": 0.0, "avg_logprob": -0.10613135109960506, "compression_ratio": 1.7535714285714286, "no_speech_prob": 0.0005192525568418205}, {"id": 711, "seek": 361220, "start": 3627.24, "end": 3632.3599999999997, "text": " did they feel that it was inconclusive whether that actually helped or it wasn't conclusive enough", "tokens": [51116, 630, 436, 841, 300, 309, 390, 20972, 66, 7233, 1968, 300, 767, 4254, 420, 309, 2067, 380, 1588, 7233, 1547, 51372], "temperature": 0.0, "avg_logprob": -0.10613135109960506, "compression_ratio": 1.7535714285714286, "no_speech_prob": 0.0005192525568418205}, {"id": 712, "seek": 361220, "start": 3632.3599999999997, "end": 3637.16, "text": " and and their point here is really to take a step back and regularize their approach a bit say", "tokens": [51372, 293, 293, 641, 935, 510, 307, 534, 281, 747, 257, 1823, 646, 293, 3890, 1125, 641, 3109, 257, 857, 584, 51612], "temperature": 0.0, "avg_logprob": -0.10613135109960506, "compression_ratio": 1.7535714285714286, "no_speech_prob": 0.0005192525568418205}, {"id": 713, "seek": 363716, "start": 3637.16, "end": 3642.68, "text": " let's use a less complex approach that's just make it easier for people to work with this models", "tokens": [50364, 718, 311, 764, 257, 1570, 3997, 3109, 300, 311, 445, 652, 309, 3571, 337, 561, 281, 589, 365, 341, 5245, 50640], "temperature": 0.0, "avg_logprob": -0.1272987105629661, "compression_ratio": 1.7509025270758123, "no_speech_prob": 0.005909763742238283}, {"id": 714, "seek": 363716, "start": 3642.68, "end": 3648.2, "text": " let's janky and it's more compatible across libraries across devices more efficient and so on. So", "tokens": [50640, 718, 311, 361, 657, 88, 293, 309, 311, 544, 18218, 2108, 15148, 2108, 5759, 544, 7148, 293, 370, 322, 13, 407, 50916], "temperature": 0.0, "avg_logprob": -0.1272987105629661, "compression_ratio": 1.7509025270758123, "no_speech_prob": 0.005909763742238283}, {"id": 715, "seek": 363716, "start": 3648.2, "end": 3652.2799999999997, "text": " you're going to see them ditch a lot of those complicated approaches they do have this shared", "tokens": [50916, 291, 434, 516, 281, 536, 552, 25325, 257, 688, 295, 729, 6179, 11587, 436, 360, 362, 341, 5507, 51120], "temperature": 0.0, "avg_logprob": -0.1272987105629661, "compression_ratio": 1.7509025270758123, "no_speech_prob": 0.005909763742238283}, {"id": 716, "seek": 363716, "start": 3652.2799999999997, "end": 3658.68, "text": " kvcash where the last few layers of the model are going to reuse keys and value states from earlier", "tokens": [51120, 350, 85, 66, 1299, 689, 264, 1036, 1326, 7914, 295, 264, 2316, 366, 516, 281, 26225, 9317, 293, 2158, 4368, 490, 3071, 51440], "temperature": 0.0, "avg_logprob": -0.1272987105629661, "compression_ratio": 1.7509025270758123, "no_speech_prob": 0.005909763742238283}, {"id": 717, "seek": 363716, "start": 3658.68, "end": 3663.72, "text": " layers instead of computing that their own key and value projections. So basically that you know", "tokens": [51440, 7914, 2602, 295, 15866, 300, 641, 1065, 2141, 293, 2158, 32371, 13, 407, 1936, 300, 291, 458, 51692], "temperature": 0.0, "avg_logprob": -0.1272987105629661, "compression_ratio": 1.7509025270758123, "no_speech_prob": 0.005909763742238283}, {"id": 718, "seek": 366372, "start": 3663.72, "end": 3669.24, "text": " the key is the thing that tells the model hey this is the information that this token can offer.", "tokens": [50364, 264, 2141, 307, 264, 551, 300, 5112, 264, 2316, 4177, 341, 307, 264, 1589, 300, 341, 14862, 393, 2626, 13, 50640], "temperature": 0.0, "avg_logprob": -0.12101771036783854, "compression_ratio": 2.008298755186722, "no_speech_prob": 0.00669194757938385}, {"id": 719, "seek": 366372, "start": 3669.24, "end": 3673.08, "text": " So if you're trying to analyze the text and decide you know how much should I pay attention to", "tokens": [50640, 407, 498, 291, 434, 1382, 281, 12477, 264, 2487, 293, 4536, 291, 458, 577, 709, 820, 286, 1689, 3202, 281, 50832], "temperature": 0.0, "avg_logprob": -0.12101771036783854, "compression_ratio": 2.008298755186722, "no_speech_prob": 0.00669194757938385}, {"id": 720, "seek": 366372, "start": 3673.08, "end": 3678.2, "text": " this token the key says hey this is the kind of information this token contains the value", "tokens": [50832, 341, 14862, 264, 2141, 1619, 4177, 341, 307, 264, 733, 295, 1589, 341, 14862, 8306, 264, 2158, 51088], "temperature": 0.0, "avg_logprob": -0.12101771036783854, "compression_ratio": 2.008298755186722, "no_speech_prob": 0.00669194757938385}, {"id": 721, "seek": 366372, "start": 3678.2, "end": 3683.0, "text": " one information that the token contains both of those things are being frozen basically for the last", "tokens": [51088, 472, 1589, 300, 264, 14862, 8306, 1293, 295, 729, 721, 366, 885, 12496, 1936, 337, 264, 1036, 51328], "temperature": 0.0, "avg_logprob": -0.12101771036783854, "compression_ratio": 2.008298755186722, "no_speech_prob": 0.00669194757938385}, {"id": 722, "seek": 366372, "start": 3683.0, "end": 3688.7599999999998, "text": " few layers they don't evolve what does evolve is the query right the thing that says what information", "tokens": [51328, 1326, 7914, 436, 500, 380, 16693, 437, 775, 16693, 307, 264, 14581, 558, 264, 551, 300, 1619, 437, 1589, 51616], "temperature": 0.0, "avg_logprob": -0.12101771036783854, "compression_ratio": 2.008298755186722, "no_speech_prob": 0.00669194757938385}, {"id": 723, "seek": 368876, "start": 3688.76, "end": 3694.0400000000004, "text": " am I looking for to basically pump out my output at any given layer and so they're doing that", "tokens": [50364, 669, 286, 1237, 337, 281, 1936, 5889, 484, 452, 5598, 412, 604, 2212, 4583, 293, 370, 436, 434, 884, 300, 50628], "temperature": 0.0, "avg_logprob": -0.09795393155315729, "compression_ratio": 1.7545454545454546, "no_speech_prob": 0.004681129474192858}, {"id": 724, "seek": 368876, "start": 3694.0400000000004, "end": 3700.36, "text": " should kvcash and this is really just like focusing down on and it has basically no effect when they", "tokens": [50628, 820, 350, 85, 66, 1299, 293, 341, 307, 534, 445, 411, 8416, 760, 322, 293, 309, 575, 1936, 572, 1802, 562, 436, 50944], "temperature": 0.0, "avg_logprob": -0.09795393155315729, "compression_ratio": 1.7545454545454546, "no_speech_prob": 0.004681129474192858}, {"id": 725, "seek": 368876, "start": 3700.36, "end": 3703.8, "text": " when they do that which is quite remarkable makes you realize how much compute use during training", "tokens": [50944, 562, 436, 360, 300, 597, 307, 1596, 12802, 1669, 291, 4325, 577, 709, 14722, 764, 1830, 3097, 51116], "temperature": 0.0, "avg_logprob": -0.09795393155315729, "compression_ratio": 1.7545454545454546, "no_speech_prob": 0.004681129474192858}, {"id": 726, "seek": 368876, "start": 3703.8, "end": 3708.2000000000003, "text": " is probably being wasted there's just so much software based optimization like that that's left to do", "tokens": [51116, 307, 1391, 885, 19496, 456, 311, 445, 370, 709, 4722, 2361, 19618, 411, 300, 300, 311, 1411, 281, 360, 51336], "temperature": 0.0, "avg_logprob": -0.09795393155315729, "compression_ratio": 1.7545454545454546, "no_speech_prob": 0.004681129474192858}, {"id": 727, "seek": 368876, "start": 3708.2000000000003, "end": 3712.2000000000003, "text": " but yeah so there are a bunch of things like that one thing of note here is that the 31 billion", "tokens": [51336, 457, 1338, 370, 456, 366, 257, 3840, 295, 721, 411, 300, 472, 551, 295, 3637, 510, 307, 300, 264, 10353, 5218, 51536], "temperature": 0.0, "avg_logprob": -0.09795393155315729, "compression_ratio": 1.7545454545454546, "no_speech_prob": 0.004681129474192858}, {"id": 728, "seek": 368876, "start": 3712.2000000000003, "end": 3718.0400000000004, "text": " parameter model currently ranks third among open source models globally on the arena AI", "tokens": [51536, 13075, 2316, 4362, 21406, 2636, 3654, 1269, 4009, 5245, 18958, 322, 264, 18451, 7318, 51828], "temperature": 0.0, "avg_logprob": -0.09795393155315729, "compression_ratio": 1.7545454545454546, "no_speech_prob": 0.004681129474192858}, {"id": 729, "seek": 371804, "start": 3718.04, "end": 3724.68, "text": " text leaderboard so the the number one and number two slots there go to gllm5 which is an MOE", "tokens": [50364, 2487, 5263, 3787, 370, 264, 264, 1230, 472, 293, 1230, 732, 24266, 456, 352, 281, 290, 285, 76, 20, 597, 307, 364, 19290, 36, 50696], "temperature": 0.0, "avg_logprob": -0.2034155038686899, "compression_ratio": 1.6893617021276597, "no_speech_prob": 0.0033761111553758383}, {"id": 730, "seek": 371804, "start": 3724.68, "end": 3731.32, "text": " model so it's actually like way bigger on nominal parameter count 744 billion kineke 2.5 thinking is", "tokens": [50696, 2316, 370, 309, 311, 767, 411, 636, 3801, 322, 41641, 13075, 1207, 1614, 13912, 5218, 350, 533, 330, 568, 13, 20, 1953, 307, 51028], "temperature": 0.0, "avg_logprob": -0.2034155038686899, "compression_ratio": 1.6893617021276597, "no_speech_prob": 0.0033761111553758383}, {"id": 731, "seek": 371804, "start": 3731.32, "end": 3736.7599999999998, "text": " number two that's a trillion parameter model as well but both of those have between 30 and 40 billion", "tokens": [51028, 1230, 732, 300, 311, 257, 18723, 13075, 2316, 382, 731, 457, 1293, 295, 729, 362, 1296, 2217, 293, 3356, 5218, 51300], "temperature": 0.0, "avg_logprob": -0.2034155038686899, "compression_ratio": 1.6893617021276597, "no_speech_prob": 0.0033761111553758383}, {"id": 732, "seek": 371804, "start": 3736.7599999999998, "end": 3741.72, "text": " active parameters during inference so actually from an active parameter standpoint pretty similar to", "tokens": [51300, 4967, 9834, 1830, 38253, 370, 767, 490, 364, 4967, 13075, 15827, 1238, 2531, 281, 51548], "temperature": 0.0, "avg_logprob": -0.2034155038686899, "compression_ratio": 1.6893617021276597, "no_speech_prob": 0.0033761111553758383}, {"id": 733, "seek": 374172, "start": 3741.7999999999997, "end": 3748.9199999999996, "text": " to jema 431b so you know in that sense maybe not such a such a crazy crazy delta but again jema 4", "tokens": [50368, 281, 361, 5619, 1017, 12967, 65, 370, 291, 458, 294, 300, 2020, 1310, 406, 1270, 257, 1270, 257, 3219, 3219, 8289, 457, 797, 361, 5619, 1017, 50724], "temperature": 0.0, "avg_logprob": -0.15525969752558955, "compression_ratio": 1.7835820895522387, "no_speech_prob": 0.0015245843678712845}, {"id": 734, "seek": 374172, "start": 3748.9199999999996, "end": 3753.48, "text": " is just a 31 billion parameter model you don't need the memory to to hold on to everything so", "tokens": [50724, 307, 445, 257, 10353, 5218, 13075, 2316, 291, 500, 380, 643, 264, 4675, 281, 281, 1797, 322, 281, 1203, 370, 50952], "temperature": 0.0, "avg_logprob": -0.15525969752558955, "compression_ratio": 1.7835820895522387, "no_speech_prob": 0.0015245843678712845}, {"id": 735, "seek": 374172, "start": 3753.48, "end": 3758.04, "text": " kind of interesting in that respect it is pound per pound or parameter for parameter you know", "tokens": [50952, 733, 295, 1880, 294, 300, 3104, 309, 307, 12013, 680, 12013, 420, 13075, 337, 13075, 291, 458, 51180], "temperature": 0.0, "avg_logprob": -0.15525969752558955, "compression_ratio": 1.7835820895522387, "no_speech_prob": 0.0015245843678712845}, {"id": 736, "seek": 374172, "start": 3758.04, "end": 3763.3199999999997, "text": " certainly the most intelligence we've seen so far it seems on on that leaderboard and through", "tokens": [51180, 3297, 264, 881, 7599, 321, 600, 1612, 370, 1400, 309, 2544, 322, 322, 300, 5263, 3787, 293, 807, 51444], "temperature": 0.0, "avg_logprob": -0.15525969752558955, "compression_ratio": 1.7835820895522387, "no_speech_prob": 0.0015245843678712845}, {"id": 737, "seek": 374172, "start": 3763.3199999999997, "end": 3770.04, "text": " other benchmarks right and in particular also the two billion and four billion effective parameter", "tokens": [51444, 661, 43751, 558, 293, 294, 1729, 611, 264, 732, 5218, 293, 1451, 5218, 4942, 13075, 51780], "temperature": 0.0, "avg_logprob": -0.15525969752558955, "compression_ratio": 1.7835820895522387, "no_speech_prob": 0.0015245843678712845}, {"id": 738, "seek": 377004, "start": 3770.04, "end": 3777.08, "text": " models are ones that seemingly could be used on your phone like truly truly device local yes", "tokens": [50364, 5245, 366, 2306, 300, 18709, 727, 312, 1143, 322, 428, 2593, 411, 4908, 4908, 4302, 2654, 2086, 50716], "temperature": 0.0, "avg_logprob": -0.15528305278104895, "compression_ratio": 1.6863636363636363, "no_speech_prob": 0.01000172458589077}, {"id": 739, "seek": 377004, "start": 3777.08, "end": 3783.0, "text": " and that is something we highlight in the blog post and i've seen some discussions on reddit", "tokens": [50716, 293, 300, 307, 746, 321, 5078, 294, 264, 6968, 2183, 293, 741, 600, 1612, 512, 11088, 322, 2182, 17975, 51012], "temperature": 0.0, "avg_logprob": -0.15528305278104895, "compression_ratio": 1.6863636363636363, "no_speech_prob": 0.01000172458589077}, {"id": 740, "seek": 377004, "start": 3783.0, "end": 3788.6, "text": " and elsewhere for people who are into local lm's that this actually seems to work well and practice", "tokens": [51012, 293, 14517, 337, 561, 567, 366, 666, 2654, 287, 76, 311, 300, 341, 767, 2544, 281, 589, 731, 293, 3124, 51292], "temperature": 0.0, "avg_logprob": -0.15528305278104895, "compression_ratio": 1.6863636363636363, "no_speech_prob": 0.01000172458589077}, {"id": 741, "seek": 377004, "start": 3788.6, "end": 3797.0, "text": " so thus yeah seem like a pretty good step for local AI as something you can try to do", "tokens": [51292, 370, 8807, 1338, 1643, 411, 257, 1238, 665, 1823, 337, 2654, 7318, 382, 746, 291, 393, 853, 281, 360, 51712], "temperature": 0.0, "avg_logprob": -0.15528305278104895, "compression_ratio": 1.6863636363636363, "no_speech_prob": 0.01000172458589077}, {"id": 742, "seek": 379700, "start": 3797.16, "end": 3801.72, "text": " well one of the key things too for those those smaller models is they do use this thing called", "tokens": [50372, 731, 472, 295, 264, 2141, 721, 886, 337, 729, 729, 4356, 5245, 307, 436, 360, 764, 341, 551, 1219, 50600], "temperature": 0.0, "avg_logprob": -0.12637539280271068, "compression_ratio": 1.9019607843137254, "no_speech_prob": 0.005467538721859455}, {"id": 743, "seek": 379700, "start": 3801.72, "end": 3807.24, "text": " per layer embeddings which is actually worth mentioning very briefly you're typically when you feed", "tokens": [50600, 680, 4583, 12240, 29432, 597, 307, 767, 3163, 18315, 588, 10515, 291, 434, 5850, 562, 291, 3154, 50876], "temperature": 0.0, "avg_logprob": -0.12637539280271068, "compression_ratio": 1.9019607843137254, "no_speech_prob": 0.005467538721859455}, {"id": 744, "seek": 379700, "start": 3807.24, "end": 3813.8, "text": " your your text to a model you basically turn each token into an embedding right and then you have", "tokens": [50876, 428, 428, 2487, 281, 257, 2316, 291, 1936, 1261, 1184, 14862, 666, 364, 12240, 3584, 558, 293, 550, 291, 362, 51204], "temperature": 0.0, "avg_logprob": -0.12637539280271068, "compression_ratio": 1.9019607843137254, "no_speech_prob": 0.005467538721859455}, {"id": 745, "seek": 379700, "start": 3813.8, "end": 3818.2, "text": " a fixed embedding per token and then those embeddings get chewed on through all the layers and", "tokens": [51204, 257, 6806, 12240, 3584, 680, 14862, 293, 550, 729, 12240, 29432, 483, 21200, 292, 322, 807, 439, 264, 7914, 293, 51424], "temperature": 0.0, "avg_logprob": -0.12637539280271068, "compression_ratio": 1.9019607843137254, "no_speech_prob": 0.005467538721859455}, {"id": 746, "seek": 379700, "start": 3818.2, "end": 3824.52, "text": " modified to produce your output the problem is that different layers might actually be interested", "tokens": [51424, 15873, 281, 5258, 428, 5598, 264, 1154, 307, 300, 819, 7914, 1062, 767, 312, 3102, 51740], "temperature": 0.0, "avg_logprob": -0.12637539280271068, "compression_ratio": 1.9019607843137254, "no_speech_prob": 0.005467538721859455}, {"id": 747, "seek": 382452, "start": 3824.68, "end": 3829.32, "text": " in pulling out different information from a token and if you only have one embedding at the", "tokens": [50372, 294, 8407, 484, 819, 1589, 490, 257, 14862, 293, 498, 291, 787, 362, 472, 12240, 3584, 412, 264, 50604], "temperature": 0.0, "avg_logprob": -0.10683635462110287, "compression_ratio": 1.7785977859778597, "no_speech_prob": 0.003883680095896125}, {"id": 748, "seek": 382452, "start": 3829.32, "end": 3833.96, "text": " beginning the embedding has to carry all the information that'll ever be required at any layer", "tokens": [50604, 2863, 264, 12240, 3584, 575, 281, 3985, 439, 264, 1589, 300, 603, 1562, 312, 4739, 412, 604, 4583, 50836], "temperature": 0.0, "avg_logprob": -0.10683635462110287, "compression_ratio": 1.7785977859778597, "no_speech_prob": 0.003883680095896125}, {"id": 749, "seek": 382452, "start": 3833.96, "end": 3838.04, "text": " of the network going for it's it's got to be an embedding that is simultaneously built to fit the", "tokens": [50836, 295, 264, 3209, 516, 337, 309, 311, 309, 311, 658, 281, 312, 364, 12240, 3584, 300, 307, 16561, 3094, 281, 3318, 264, 51040], "temperature": 0.0, "avg_logprob": -0.10683635462110287, "compression_ratio": 1.7785977859778597, "no_speech_prob": 0.003883680095896125}, {"id": 750, "seek": 382452, "start": 3838.04, "end": 3843.64, "text": " needs of every subsequent layer in the network and so what they're doing here is this PLE approach", "tokens": [51040, 2203, 295, 633, 19962, 4583, 294, 264, 3209, 293, 370, 437, 436, 434, 884, 510, 307, 341, 6999, 36, 3109, 51320], "temperature": 0.0, "avg_logprob": -0.10683635462110287, "compression_ratio": 1.7785977859778597, "no_speech_prob": 0.003883680095896125}, {"id": 751, "seek": 382452, "start": 3843.64, "end": 3850.7599999999998, "text": " basically gives every layer its own dedicated little chunk of embedding space to represent its own", "tokens": [51320, 1936, 2709, 633, 4583, 1080, 1065, 8374, 707, 16635, 295, 12240, 3584, 1901, 281, 2906, 1080, 1065, 51676], "temperature": 0.0, "avg_logprob": -0.10683635462110287, "compression_ratio": 1.7785977859778597, "no_speech_prob": 0.003883680095896125}, {"id": 752, "seek": 385076, "start": 3850.76, "end": 3855.48, "text": " little part of the embedding that's customized to its needs so you know feed and you token in", "tokens": [50364, 707, 644, 295, 264, 12240, 3584, 300, 311, 30581, 281, 1080, 2203, 370, 291, 458, 3154, 293, 291, 14862, 294, 50600], "temperature": 0.0, "avg_logprob": -0.10596371533577903, "compression_ratio": 1.7657992565055762, "no_speech_prob": 0.002510941121727228}, {"id": 753, "seek": 385076, "start": 3855.48, "end": 3859.32, "text": " you have the embedding for that token at the bottom the kind of universal part of it but then", "tokens": [50600, 291, 362, 264, 12240, 3584, 337, 300, 14862, 412, 264, 2767, 264, 733, 295, 11455, 644, 295, 309, 457, 550, 50792], "temperature": 0.0, "avg_logprob": -0.10596371533577903, "compression_ratio": 1.7657992565055762, "no_speech_prob": 0.002510941121727228}, {"id": 754, "seek": 385076, "start": 3859.32, "end": 3864.44, "text": " every layer also has a an embedding value associated with it and that's that's used only to as", "tokens": [50792, 633, 4583, 611, 575, 257, 364, 12240, 3584, 2158, 6615, 365, 309, 293, 300, 311, 300, 311, 1143, 787, 281, 382, 51048], "temperature": 0.0, "avg_logprob": -0.10596371533577903, "compression_ratio": 1.7657992565055762, "no_speech_prob": 0.002510941121727228}, {"id": 755, "seek": 385076, "start": 3864.44, "end": 3868.44, "text": " an optimization for these smaller models and it's a big part of the the success case for this model", "tokens": [51048, 364, 19618, 337, 613, 4356, 5245, 293, 309, 311, 257, 955, 644, 295, 264, 264, 2245, 1389, 337, 341, 2316, 51248], "temperature": 0.0, "avg_logprob": -0.10596371533577903, "compression_ratio": 1.7657992565055762, "no_speech_prob": 0.002510941121727228}, {"id": 756, "seek": 385076, "start": 3869.0800000000004, "end": 3876.76, "text": " and one last open source story we covered glem 5.1 about the same time I think just slightly", "tokens": [51280, 293, 472, 1036, 1269, 4009, 1657, 321, 5343, 290, 10386, 1025, 13, 16, 466, 264, 912, 565, 286, 519, 445, 4748, 51664], "temperature": 0.0, "avg_logprob": -0.10596371533577903, "compression_ratio": 1.7657992565055762, "no_speech_prob": 0.002510941121727228}, {"id": 757, "seek": 387676, "start": 3876.76, "end": 3887.2400000000002, "text": " earlier Zidari also launched glem 5b turbo which perv there is multimodal model it is a step away", "tokens": [50364, 3071, 1176, 327, 3504, 611, 8730, 290, 10386, 1025, 65, 20902, 597, 680, 85, 456, 307, 32972, 378, 304, 2316, 309, 307, 257, 1823, 1314, 50888], "temperature": 0.0, "avg_logprob": -0.25009788261665095, "compression_ratio": 1.6666666666666667, "no_speech_prob": 0.008561640046536922}, {"id": 758, "seek": 387676, "start": 3887.2400000000002, "end": 3893.1600000000003, "text": " from to get silky technical basically it has a native multimodal fusion which means that", "tokens": [50888, 490, 281, 483, 3425, 4133, 6191, 1936, 309, 575, 257, 8470, 32972, 378, 304, 23100, 597, 1355, 300, 51184], "temperature": 0.0, "avg_logprob": -0.25009788261665095, "compression_ratio": 1.6666666666666667, "no_speech_prob": 0.008561640046536922}, {"id": 759, "seek": 387676, "start": 3893.1600000000003, "end": 3899.96, "text": " text and images and so on are just fed into it kind of in the same way without having separate modules", "tokens": [51184, 2487, 293, 5267, 293, 370, 322, 366, 445, 4636, 666, 309, 733, 295, 294, 264, 912, 636, 1553, 1419, 4994, 16679, 51524], "temperature": 0.0, "avg_logprob": -0.25009788261665095, "compression_ratio": 1.6666666666666667, "no_speech_prob": 0.008561640046536922}, {"id": 760, "seek": 387676, "start": 3899.96, "end": 3906.1200000000003, "text": " and this is sort of the way things were going in many different models that originally are", "tokens": [51524, 293, 341, 307, 1333, 295, 264, 636, 721, 645, 516, 294, 867, 819, 5245, 300, 7993, 366, 51832], "temperature": 0.0, "avg_logprob": -0.25009788261665095, "compression_ratio": 1.6666666666666667, "no_speech_prob": 0.008561640046536922}, {"id": 761, "seek": 390612, "start": 3906.12, "end": 3911.88, "text": " different encoders and you have to sort of merge them and a simplified kind of just basic", "tokens": [50364, 819, 2058, 378, 433, 293, 291, 362, 281, 1333, 295, 22183, 552, 293, 257, 26335, 733, 295, 445, 3875, 50652], "temperature": 0.0, "avg_logprob": -0.14038019056444045, "compression_ratio": 1.7616822429906542, "no_speech_prob": 0.0015480377478525043}, {"id": 762, "seek": 390612, "start": 3911.88, "end": 3918.7599999999998, "text": " transformer with token stream appears to work better this is in that family and appears to work", "tokens": [50652, 31782, 365, 14862, 4309, 7038, 281, 589, 1101, 341, 307, 294, 300, 1605, 293, 7038, 281, 589, 50996], "temperature": 0.0, "avg_logprob": -0.14038019056444045, "compression_ratio": 1.7616822429906542, "no_speech_prob": 0.0015480377478525043}, {"id": 763, "seek": 390612, "start": 3918.7599999999998, "end": 3925.7999999999997, "text": " quite well for things that require screenshots or things that we I believe covered also cloud and", "tokens": [50996, 1596, 731, 337, 721, 300, 3651, 40661, 420, 721, 300, 321, 286, 1697, 5343, 611, 4588, 293, 51348], "temperature": 0.0, "avg_logprob": -0.14038019056444045, "compression_ratio": 1.7616822429906542, "no_speech_prob": 0.0015480377478525043}, {"id": 764, "seek": 390612, "start": 3925.7999999999997, "end": 3931.64, "text": " hopefully I also highlighting like working with images and screenshots and screen sharing and", "tokens": [51348, 4696, 286, 611, 26551, 411, 1364, 365, 5267, 293, 40661, 293, 2568, 5414, 293, 51640], "temperature": 0.0, "avg_logprob": -0.14038019056444045, "compression_ratio": 1.7616822429906542, "no_speech_prob": 0.0015480377478525043}, {"id": 765, "seek": 393164, "start": 3931.64, "end": 3937.24, "text": " so on this would be capable of yeah and in that multimodality so important for computer usage", "tokens": [50364, 370, 322, 341, 576, 312, 8189, 295, 1338, 293, 294, 300, 32972, 378, 304, 507, 370, 1021, 337, 3820, 14924, 50644], "temperature": 0.0, "avg_logprob": -0.10629337205799348, "compression_ratio": 1.9090909090909092, "no_speech_prob": 0.002590435091406107}, {"id": 766, "seek": 393164, "start": 3937.24, "end": 3940.92, "text": " where you know as you say you want to be able to take a screenshot and then turn that into code", "tokens": [50644, 689, 291, 458, 382, 291, 584, 291, 528, 281, 312, 1075, 281, 747, 257, 27712, 293, 550, 1261, 300, 666, 3089, 50828], "temperature": 0.0, "avg_logprob": -0.10629337205799348, "compression_ratio": 1.9090909090909092, "no_speech_prob": 0.002590435091406107}, {"id": 767, "seek": 393164, "start": 3940.92, "end": 3948.2, "text": " and vice versa the challenges historically been when you optimize for one capability say multimodality", "tokens": [50828, 293, 11964, 25650, 264, 4759, 16180, 668, 562, 291, 19719, 337, 472, 13759, 584, 32972, 378, 1860, 51192], "temperature": 0.0, "avg_logprob": -0.10629337205799348, "compression_ratio": 1.9090909090909092, "no_speech_prob": 0.002590435091406107}, {"id": 768, "seek": 393164, "start": 3948.2, "end": 3953.24, "text": " you end up optimizing against the other one would say coding right so if you want to coding", "tokens": [51192, 291, 917, 493, 40425, 1970, 264, 661, 472, 576, 584, 17720, 558, 370, 498, 291, 528, 281, 17720, 51444], "temperature": 0.0, "avg_logprob": -0.10629337205799348, "compression_ratio": 1.9090909090909092, "no_speech_prob": 0.002590435091406107}, {"id": 769, "seek": 393164, "start": 3953.24, "end": 3958.3599999999997, "text": " maximize model you're going to have one that tends to suck at multimodality and vice versa because", "tokens": [51444, 19874, 2316, 291, 434, 516, 281, 362, 472, 300, 12258, 281, 9967, 412, 32972, 378, 304, 507, 293, 11964, 25650, 570, 51700], "temperature": 0.0, "avg_logprob": -0.10629337205799348, "compression_ratio": 1.9090909090909092, "no_speech_prob": 0.002590435091406107}, {"id": 770, "seek": 395836, "start": 3958.36, "end": 3963.0, "text": " of catastrophic forgetting right we talked about that to death on the show and so the the achievement", "tokens": [50364, 295, 34915, 25428, 558, 321, 2825, 466, 300, 281, 2966, 322, 264, 855, 293, 370, 264, 264, 15838, 50596], "temperature": 0.0, "avg_logprob": -0.08404816521538629, "compression_ratio": 1.880184331797235, "no_speech_prob": 0.0028891349211335182}, {"id": 771, "seek": 395836, "start": 3963.0, "end": 3968.6, "text": " here is to say well we can actually do both at the same time so this isn't so much about any particular", "tokens": [50596, 510, 307, 281, 584, 731, 321, 393, 767, 360, 1293, 412, 264, 912, 565, 370, 341, 1943, 380, 370, 709, 466, 604, 1729, 50876], "temperature": 0.0, "avg_logprob": -0.08404816521538629, "compression_ratio": 1.880184331797235, "no_speech_prob": 0.0028891349211335182}, {"id": 772, "seek": 395836, "start": 3968.6, "end": 3976.76, "text": " benchmark as is nominally or as it should be nominally the combination of a proof point on say design", "tokens": [50876, 18927, 382, 307, 5369, 19801, 420, 382, 309, 820, 312, 5369, 19801, 264, 6562, 295, 257, 8177, 935, 322, 584, 1715, 51284], "temperature": 0.0, "avg_logprob": -0.08404816521538629, "compression_ratio": 1.880184331797235, "no_speech_prob": 0.0028891349211335182}, {"id": 773, "seek": 395836, "start": 3976.76, "end": 3981.4, "text": " capability and a proof point on code capability and the proof point on design capability they have a", "tokens": [51284, 13759, 293, 257, 8177, 935, 322, 3089, 13759, 293, 264, 8177, 935, 322, 1715, 13759, 436, 362, 257, 51516], "temperature": 0.0, "avg_logprob": -0.08404816521538629, "compression_ratio": 1.880184331797235, "no_speech_prob": 0.0028891349211335182}, {"id": 774, "seek": 398140, "start": 3981.4, "end": 3990.76, "text": " self reported design to code benchmark score of 94.8 versus quad opus 4.6 is 77.3 that is a huge gap", "tokens": [50364, 2698, 7055, 1715, 281, 3089, 18927, 6175, 295, 30849, 13, 23, 5717, 10787, 999, 301, 1017, 13, 21, 307, 25546, 13, 18, 300, 307, 257, 2603, 7417, 50832], "temperature": 0.0, "avg_logprob": -0.0866690096647843, "compression_ratio": 1.6363636363636365, "no_speech_prob": 0.0019266061717644334}, {"id": 775, "seek": 398140, "start": 3990.76, "end": 3996.76, "text": " just to give you a sense that benchmark basically takes a whole bunch of manually curated web pages", "tokens": [50832, 445, 281, 976, 291, 257, 2020, 300, 18927, 1936, 2516, 257, 1379, 3840, 295, 16945, 47851, 3670, 7183, 51132], "temperature": 0.0, "avg_logprob": -0.0866690096647843, "compression_ratio": 1.6363636363636365, "no_speech_prob": 0.0019266061717644334}, {"id": 776, "seek": 398140, "start": 3996.76, "end": 4002.28, "text": " and you give the model a screenshot of those websites and you ask it to generate the HTML CSS code", "tokens": [51132, 293, 291, 976, 264, 2316, 257, 27712, 295, 729, 12891, 293, 291, 1029, 309, 281, 8460, 264, 17995, 24387, 3089, 51408], "temperature": 0.0, "avg_logprob": -0.0866690096647843, "compression_ratio": 1.6363636363636365, "no_speech_prob": 0.0019266061717644334}, {"id": 777, "seek": 398140, "start": 4002.28, "end": 4006.52, "text": " that when you render it should reproduce the original page so basically like here's a screenshot", "tokens": [51408, 300, 562, 291, 15529, 309, 820, 29501, 264, 3380, 3028, 370, 1936, 411, 510, 311, 257, 27712, 51620], "temperature": 0.0, "avg_logprob": -0.0866690096647843, "compression_ratio": 1.6363636363636365, "no_speech_prob": 0.0019266061717644334}, {"id": 778, "seek": 400652, "start": 4006.6, "end": 4012.44, "text": " reproduce the code behind this website and again on that benchmark it just crushes quad opus 4.6", "tokens": [50368, 29501, 264, 3089, 2261, 341, 3144, 293, 797, 322, 300, 18927, 309, 445, 10321, 279, 10787, 999, 301, 1017, 13, 21, 50660], "temperature": 0.0, "avg_logprob": -0.06059294009427412, "compression_ratio": 1.7330960854092528, "no_speech_prob": 0.006588107906281948}, {"id": 779, "seek": 400652, "start": 4012.44, "end": 4018.84, "text": " really really big deal the question is not though can you kind of beat quad on that particular", "tokens": [50660, 534, 534, 955, 2028, 264, 1168, 307, 406, 1673, 393, 291, 733, 295, 4224, 10787, 322, 300, 1729, 50980], "temperature": 0.0, "avg_logprob": -0.06059294009427412, "compression_ratio": 1.7330960854092528, "no_speech_prob": 0.006588107906281948}, {"id": 780, "seek": 400652, "start": 4018.84, "end": 4024.92, "text": " benchmark it's can you do it while also keeping your performance on coding really high that's where", "tokens": [50980, 18927, 309, 311, 393, 291, 360, 309, 1339, 611, 5145, 428, 3389, 322, 17720, 534, 1090, 300, 311, 689, 51284], "temperature": 0.0, "avg_logprob": -0.06059294009427412, "compression_ratio": 1.7330960854092528, "no_speech_prob": 0.006588107906281948}, {"id": 781, "seek": 400652, "start": 4024.92, "end": 4029.96, "text": " things get a little bit more ambiguous they don't report the kinds of benchmarks at least in this", "tokens": [51284, 721, 483, 257, 707, 857, 544, 39465, 436, 500, 380, 2275, 264, 3685, 295, 43751, 412, 1935, 294, 341, 51536], "temperature": 0.0, "avg_logprob": -0.06059294009427412, "compression_ratio": 1.7330960854092528, "no_speech_prob": 0.006588107906281948}, {"id": 782, "seek": 400652, "start": 4029.96, "end": 4034.44, "text": " report that I would expect to see when we're talking about code we don't see sweet bench verified", "tokens": [51536, 2275, 300, 286, 576, 2066, 281, 536, 562, 321, 434, 1417, 466, 3089, 321, 500, 380, 536, 3844, 10638, 31197, 51760], "temperature": 0.0, "avg_logprob": -0.06059294009427412, "compression_ratio": 1.7330960854092528, "no_speech_prob": 0.006588107906281948}, {"id": 783, "seek": 403444, "start": 4034.44, "end": 4041.2400000000002, "text": " for example that's kind of odd they cite this kind of internal cc bench v2 coding benchmark that we", "tokens": [50364, 337, 1365, 300, 311, 733, 295, 7401, 436, 37771, 341, 733, 295, 6920, 269, 66, 10638, 371, 17, 17720, 18927, 300, 321, 50704], "temperature": 0.0, "avg_logprob": -0.10162187877454255, "compression_ratio": 1.8863636363636365, "no_speech_prob": 0.003944660536944866}, {"id": 784, "seek": 403444, "start": 4041.2400000000002, "end": 4046.2000000000003, "text": " don't get to see and they say that looks just as good as it did for you know earlier versions that", "tokens": [50704, 500, 380, 483, 281, 536, 293, 436, 584, 300, 1542, 445, 382, 665, 382, 309, 630, 337, 291, 458, 3071, 9606, 300, 50952], "temperature": 0.0, "avg_logprob": -0.10162187877454255, "compression_ratio": 1.8863636363636365, "no_speech_prob": 0.003944660536944866}, {"id": 785, "seek": 403444, "start": 4046.2000000000003, "end": 4051.88, "text": " were kind of more code oriented so maybe good but there's like there's something sus here about not", "tokens": [50952, 645, 733, 295, 544, 3089, 21841, 370, 1310, 665, 457, 456, 311, 411, 456, 311, 746, 3291, 510, 466, 406, 51236], "temperature": 0.0, "avg_logprob": -0.10162187877454255, "compression_ratio": 1.8863636363636365, "no_speech_prob": 0.003944660536944866}, {"id": 786, "seek": 403444, "start": 4051.88, "end": 4057.48, "text": " being able to see the kind of standard sweet bench or similar or similar coding benchmark so we'll see", "tokens": [51236, 885, 1075, 281, 536, 264, 733, 295, 3832, 3844, 10638, 420, 2531, 420, 2531, 17720, 18927, 370, 321, 603, 536, 51516], "temperature": 0.0, "avg_logprob": -0.10162187877454255, "compression_ratio": 1.8863636363636365, "no_speech_prob": 0.003944660536944866}, {"id": 787, "seek": 403444, "start": 4057.48, "end": 4061.8, "text": " you know take all this with the grain of salt until we see independent validation of these these", "tokens": [51516, 291, 458, 747, 439, 341, 365, 264, 12837, 295, 5139, 1826, 321, 536, 6695, 24071, 295, 613, 613, 51732], "temperature": 0.0, "avg_logprob": -0.10162187877454255, "compression_ratio": 1.8863636363636365, "no_speech_prob": 0.003944660536944866}, {"id": 788, "seek": 406180, "start": 4061.8, "end": 4066.84, "text": " numbers think of them as preliminary but so far it seems pretty impressive just based on these numbers", "tokens": [50364, 3547, 519, 295, 552, 382, 28817, 457, 370, 1400, 309, 2544, 1238, 8992, 445, 2361, 322, 613, 3547, 50616], "temperature": 0.0, "avg_logprob": -0.15632079368413881, "compression_ratio": 1.6794871794871795, "no_speech_prob": 0.0032210061326622963}, {"id": 789, "seek": 406180, "start": 4067.6400000000003, "end": 4074.28, "text": " moving on to policy and safety a bit of a catch up story that we missed from the prior week", "tokens": [50656, 2684, 322, 281, 3897, 293, 4514, 257, 857, 295, 257, 3745, 493, 1657, 300, 321, 6721, 490, 264, 4059, 1243, 50988], "temperature": 0.0, "avg_logprob": -0.15632079368413881, "compression_ratio": 1.6794871794871795, "no_speech_prob": 0.0032210061326622963}, {"id": 790, "seek": 406180, "start": 4074.28, "end": 4080.6000000000004, "text": " a judge has blocked the pentagon's effort to punish on traffic by labeling it as a supply chain risk", "tokens": [50988, 257, 6995, 575, 15470, 264, 16834, 6709, 311, 4630, 281, 9842, 322, 6419, 538, 40244, 309, 382, 257, 5847, 5021, 3148, 51304], "temperature": 0.0, "avg_logprob": -0.15632079368413881, "compression_ratio": 1.6794871794871795, "no_speech_prob": 0.0032210061326622963}, {"id": 791, "seek": 406180, "start": 4080.6000000000004, "end": 4086.92, "text": " so a federal judge in California has indefinitely blocked this effort saying that it violated the", "tokens": [51304, 370, 257, 6019, 6995, 294, 5384, 575, 24162, 10925, 15470, 341, 4630, 1566, 300, 309, 33239, 264, 51620], "temperature": 0.0, "avg_logprob": -0.15632079368413881, "compression_ratio": 1.6794871794871795, "no_speech_prob": 0.0032210061326622963}, {"id": 792, "seek": 408692, "start": 4086.92, "end": 4093.32, "text": " company's first amendment right to do process so basically we covered this a couple episodes ago", "tokens": [50364, 2237, 311, 700, 17920, 558, 281, 360, 1399, 370, 1936, 321, 5343, 341, 257, 1916, 9313, 2057, 50684], "temperature": 0.0, "avg_logprob": -0.18230713187874137, "compression_ratio": 1.668122270742358, "no_speech_prob": 0.0007550045265816152}, {"id": 793, "seek": 408692, "start": 4093.32, "end": 4099.0, "text": " on traffic had a big fight with the pentagon after which they were labeled as supply chain risk and", "tokens": [50684, 322, 6419, 632, 257, 955, 2092, 365, 264, 16834, 6709, 934, 597, 436, 645, 21335, 382, 5847, 5021, 3148, 293, 50968], "temperature": 0.0, "avg_logprob": -0.18230713187874137, "compression_ratio": 1.668122270742358, "no_speech_prob": 0.0007550045265816152}, {"id": 794, "seek": 408692, "start": 4099.56, "end": 4105.8, "text": " the executive department basically told anyone affiliated with government and all of the federal", "tokens": [50996, 264, 10140, 5882, 1936, 1907, 2878, 42174, 365, 2463, 293, 439, 295, 264, 6019, 51308], "temperature": 0.0, "avg_logprob": -0.18230713187874137, "compression_ratio": 1.668122270742358, "no_speech_prob": 0.0007550045265816152}, {"id": 795, "seek": 408692, "start": 4105.8, "end": 4112.92, "text": " agencies to not work with on traffic here judge we tell lean ruled that that designation", "tokens": [51308, 9504, 281, 406, 589, 365, 322, 6419, 510, 6995, 321, 980, 11659, 20077, 300, 300, 40838, 51664], "temperature": 0.0, "avg_logprob": -0.18230713187874137, "compression_ratio": 1.668122270742358, "no_speech_prob": 0.0007550045265816152}, {"id": 796, "seek": 411292, "start": 4112.92, "end": 4119.32, "text": " with particular move to designate it as a supply chain risk was illegal retaliation for", "tokens": [50364, 365, 1729, 1286, 281, 1715, 473, 309, 382, 257, 5847, 5021, 3148, 390, 11905, 37924, 399, 337, 50684], "temperature": 0.0, "avg_logprob": -0.2105353923326128, "compression_ratio": 1.6622222222222223, "no_speech_prob": 0.0016226799925789237}, {"id": 797, "seek": 411292, "start": 4119.32, "end": 4125.88, "text": " on tropics public stats and essentially just being entirely on on tropic side in terms of", "tokens": [50684, 322, 9006, 1167, 1908, 18152, 293, 4476, 445, 885, 7696, 322, 322, 9006, 299, 1252, 294, 2115, 295, 51012], "temperature": 0.0, "avg_logprob": -0.2105353923326128, "compression_ratio": 1.6622222222222223, "no_speech_prob": 0.0016226799925789237}, {"id": 798, "seek": 411292, "start": 4125.88, "end": 4130.84, "text": " their argument in this matter yeah you don't you don't see judgments as scathing as this come out", "tokens": [51012, 641, 6770, 294, 341, 1871, 1338, 291, 500, 380, 291, 500, 380, 536, 40337, 382, 795, 267, 571, 382, 341, 808, 484, 51260], "temperature": 0.0, "avg_logprob": -0.2105353923326128, "compression_ratio": 1.6622222222222223, "no_speech_prob": 0.0016226799925789237}, {"id": 799, "seek": 411292, "start": 4130.84, "end": 4137.0, "text": " often and as listeners will know I mean I really do try and have tried maybe to a fault to kind of", "tokens": [51260, 2049, 293, 382, 23274, 486, 458, 286, 914, 286, 534, 360, 853, 293, 362, 3031, 1310, 281, 257, 7441, 281, 733, 295, 51568], "temperature": 0.0, "avg_logprob": -0.2105353923326128, "compression_ratio": 1.6622222222222223, "no_speech_prob": 0.0016226799925789237}, {"id": 800, "seek": 413700, "start": 4137.8, "end": 4144.04, "text": " see that the rationale in this administration's handling of some AI really issues this is one where", "tokens": [50404, 536, 300, 264, 41989, 294, 341, 7236, 311, 13175, 295, 512, 7318, 534, 2663, 341, 307, 472, 689, 50716], "temperature": 0.0, "avg_logprob": -0.13826557647350224, "compression_ratio": 1.6812227074235808, "no_speech_prob": 0.002322718733921647}, {"id": 801, "seek": 413700, "start": 4144.04, "end": 4149.48, "text": " I just have to say I don't see the logic I have never seen the logic this seems insane to me", "tokens": [50716, 286, 445, 362, 281, 584, 286, 500, 380, 536, 264, 9952, 286, 362, 1128, 1612, 264, 9952, 341, 2544, 10838, 281, 385, 50988], "temperature": 0.0, "avg_logprob": -0.13826557647350224, "compression_ratio": 1.6812227074235808, "no_speech_prob": 0.002322718733921647}, {"id": 802, "seek": 413700, "start": 4149.48, "end": 4155.08, "text": " but check out the language the judge is using she says nothing in the governing statute support", "tokens": [50988, 457, 1520, 484, 264, 2856, 264, 6995, 307, 1228, 750, 1619, 1825, 294, 264, 30054, 24774, 1406, 51268], "temperature": 0.0, "avg_logprob": -0.13826557647350224, "compression_ratio": 1.6812227074235808, "no_speech_prob": 0.002322718733921647}, {"id": 803, "seek": 413700, "start": 4155.08, "end": 4161.32, "text": " the or wellian notion that an American company may be branded a potential adversary and saboteur", "tokens": [51268, 264, 420, 731, 952, 10710, 300, 364, 2665, 2237, 815, 312, 38510, 257, 3995, 48222, 293, 5560, 1370, 374, 51580], "temperature": 0.0, "avg_logprob": -0.13826557647350224, "compression_ratio": 1.6812227074235808, "no_speech_prob": 0.002322718733921647}, {"id": 804, "seek": 416132, "start": 4161.4, "end": 4167.0, "text": " of the US for expressing disagreement with the government basically you can't just like", "tokens": [50368, 295, 264, 2546, 337, 22171, 38947, 365, 264, 2463, 1936, 291, 393, 380, 445, 411, 50648], "temperature": 0.0, "avg_logprob": -0.11124854726889699, "compression_ratio": 1.806083650190114, "no_speech_prob": 0.0023965921718627214}, {"id": 805, "seek": 416132, "start": 4167.0, "end": 4172.5199999999995, "text": " call them a supply chain risk which is a status that's reserved for companies like Huawei like", "tokens": [50648, 818, 552, 257, 5847, 5021, 3148, 597, 307, 257, 6558, 300, 311, 24819, 337, 3431, 411, 28542, 411, 50924], "temperature": 0.0, "avg_logprob": -0.11124854726889699, "compression_ratio": 1.806083650190114, "no_speech_prob": 0.0023965921718627214}, {"id": 806, "seek": 416132, "start": 4172.5199999999995, "end": 4177.24, "text": " American companies just don't get this designation just because you express disagreement with the", "tokens": [50924, 2665, 3431, 445, 500, 380, 483, 341, 40838, 445, 570, 291, 5109, 38947, 365, 264, 51160], "temperature": 0.0, "avg_logprob": -0.11124854726889699, "compression_ratio": 1.806083650190114, "no_speech_prob": 0.0023965921718627214}, {"id": 807, "seek": 416132, "start": 4177.24, "end": 4181.88, "text": " government like that is insane she feels quite directly that the DOD's own records show it labeled", "tokens": [51160, 2463, 411, 300, 307, 10838, 750, 3417, 1596, 3838, 300, 264, 10699, 35, 311, 1065, 7724, 855, 309, 21335, 51392], "temperature": 0.0, "avg_logprob": -0.11124854726889699, "compression_ratio": 1.806083650190114, "no_speech_prob": 0.0023965921718627214}, {"id": 808, "seek": 416132, "start": 4181.88, "end": 4186.92, "text": " in tropic as supply chain risk because of its quotes hostile manner through the press which you", "tokens": [51392, 294, 9006, 299, 382, 5847, 5021, 3148, 570, 295, 1080, 19963, 27312, 9060, 807, 264, 1886, 597, 291, 51644], "temperature": 0.0, "avg_logprob": -0.11124854726889699, "compression_ratio": 1.806083650190114, "no_speech_prob": 0.0023965921718627214}, {"id": 809, "seek": 418692, "start": 4186.92, "end": 4191.4800000000005, "text": " know if you're following at home that is not a reason to label a company a supply chain risk even", "tokens": [50364, 458, 498, 291, 434, 3480, 412, 1280, 300, 307, 406, 257, 1778, 281, 7645, 257, 2237, 257, 5847, 5021, 3148, 754, 50592], "temperature": 0.0, "avg_logprob": -0.11348564284188407, "compression_ratio": 1.8153846153846154, "no_speech_prob": 0.00051122932927683}, {"id": 810, "seek": 418692, "start": 4191.4800000000005, "end": 4195.64, "text": " if it were true it's also important no like this is a there's a circling of the wagon thing", "tokens": [50592, 498, 309, 645, 2074, 309, 311, 611, 1021, 572, 411, 341, 307, 257, 456, 311, 257, 3510, 1688, 295, 264, 34453, 551, 50800], "temperature": 0.0, "avg_logprob": -0.11348564284188407, "compression_ratio": 1.8153846153846154, "no_speech_prob": 0.00051122932927683}, {"id": 811, "seek": 418692, "start": 4195.64, "end": 4199.64, "text": " happening kind of right it's a preview of a conflict right we're going to be seeing this playout", "tokens": [50800, 2737, 733, 295, 558, 309, 311, 257, 14281, 295, 257, 6596, 558, 321, 434, 516, 281, 312, 2577, 341, 862, 346, 51000], "temperature": 0.0, "avg_logprob": -0.11348564284188407, "compression_ratio": 1.8153846153846154, "no_speech_prob": 0.00051122932927683}, {"id": 812, "seek": 418692, "start": 4199.64, "end": 4204.2, "text": " over and over again who gets to set the ethical guardrails on AI systems right is it going to be the", "tokens": [51000, 670, 293, 670, 797, 567, 2170, 281, 992, 264, 18890, 6290, 424, 4174, 322, 7318, 3652, 558, 307, 309, 516, 281, 312, 264, 51228], "temperature": 0.0, "avg_logprob": -0.11348564284188407, "compression_ratio": 1.8153846153846154, "no_speech_prob": 0.00051122932927683}, {"id": 813, "seek": 418692, "start": 4204.2, "end": 4209.8, "text": " companies or the government and right now the pentagon's position is well you know what like we can't", "tokens": [51228, 3431, 420, 264, 2463, 293, 558, 586, 264, 16834, 6709, 311, 2535, 307, 731, 291, 458, 437, 411, 321, 393, 380, 51508], "temperature": 0.0, "avg_logprob": -0.11348564284188407, "compression_ratio": 1.8153846153846154, "no_speech_prob": 0.00051122932927683}, {"id": 814, "seek": 418692, "start": 4209.8, "end": 4215.4800000000005, "text": " allow AI companies to bake in their policy preferences into these models and like pollute the supply", "tokens": [51508, 2089, 7318, 3431, 281, 16562, 294, 641, 3897, 21910, 666, 613, 5245, 293, 411, 6418, 1169, 264, 5847, 51792], "temperature": 0.0, "avg_logprob": -0.11348564284188407, "compression_ratio": 1.8153846153846154, "no_speech_prob": 0.00051122932927683}, {"id": 815, "seek": 421548, "start": 4215.48, "end": 4220.12, "text": " chain basically because then warfighters get ineffective weapons and tropics counter courses that", "tokens": [50364, 5021, 1936, 570, 550, 1516, 14919, 433, 483, 48836, 7278, 293, 9006, 1167, 5682, 7712, 300, 50596], "temperature": 0.0, "avg_logprob": -0.14726791741713038, "compression_ratio": 1.7038327526132404, "no_speech_prob": 0.005553410388529301}, {"id": 816, "seek": 421548, "start": 4220.12, "end": 4225.16, "text": " hey with their safety commit commitments are protected speech they see as a first amendment issue", "tokens": [50596, 4177, 365, 641, 4514, 5599, 26230, 366, 10594, 6218, 436, 536, 382, 257, 700, 17920, 2734, 50848], "temperature": 0.0, "avg_logprob": -0.14726791741713038, "compression_ratio": 1.7038327526132404, "no_speech_prob": 0.005553410388529301}, {"id": 817, "seek": 421548, "start": 4225.16, "end": 4230.5199999999995, "text": " it's not a matter of defective products it's just this is free speech so kind of interesting by the", "tokens": [50848, 309, 311, 406, 257, 1871, 295, 16445, 488, 3383, 309, 311, 445, 341, 307, 1737, 6218, 370, 733, 295, 1880, 538, 264, 51116], "temperature": 0.0, "avg_logprob": -0.14726791741713038, "compression_ratio": 1.7038327526132404, "no_speech_prob": 0.005553410388529301}, {"id": 818, "seek": 421548, "start": 4230.5199999999995, "end": 4235.08, "text": " way next steps this is where I was getting confused frankly so I did a bit of a dive to understand", "tokens": [51116, 636, 958, 4439, 341, 307, 689, 286, 390, 1242, 9019, 11939, 370, 286, 630, 257, 857, 295, 257, 9192, 281, 1223, 51344], "temperature": 0.0, "avg_logprob": -0.14726791741713038, "compression_ratio": 1.7038327526132404, "no_speech_prob": 0.005553410388529301}, {"id": 819, "seek": 421548, "start": 4235.08, "end": 4240.44, "text": " like what's next what what happens now so the department will work file that's appeal on April", "tokens": [51344, 411, 437, 311, 958, 437, 437, 2314, 586, 370, 264, 5882, 486, 589, 3991, 300, 311, 13668, 322, 6929, 51612], "temperature": 0.0, "avg_logprob": -0.14726791741713038, "compression_ratio": 1.7038327526132404, "no_speech_prob": 0.005553410388529301}, {"id": 820, "seek": 424044, "start": 4240.44, "end": 4244.759999999999, "text": " second challenging this ruling so they're not taking this on the chain necessarily they're", "tokens": [50364, 1150, 7595, 341, 21437, 370, 436, 434, 406, 1940, 341, 322, 264, 5021, 4725, 436, 434, 50580], "temperature": 0.0, "avg_logprob": -0.1841117356174676, "compression_ratio": 1.8949152542372882, "no_speech_prob": 0.011864811182022095}, {"id": 821, "seek": 424044, "start": 4244.759999999999, "end": 4248.36, "text": " or say they are taking a chain they're saying okay we're going to appeal this and this is", "tokens": [50580, 420, 584, 436, 366, 1940, 257, 5021, 436, 434, 1566, 1392, 321, 434, 516, 281, 13668, 341, 293, 341, 307, 50760], "temperature": 0.0, "avg_logprob": -0.1841117356174676, "compression_ratio": 1.8949152542372882, "no_speech_prob": 0.011864811182022095}, {"id": 822, "seek": 424044, "start": 4248.36, "end": 4254.679999999999, "text": " that's ruling just by way is a preliminary injunction so it's sort of like pauses everything", "tokens": [50760, 300, 311, 21437, 445, 538, 636, 307, 257, 28817, 5580, 32627, 370, 309, 311, 1333, 295, 411, 2502, 8355, 1203, 51076], "temperature": 0.0, "avg_logprob": -0.1841117356174676, "compression_ratio": 1.8949152542372882, "no_speech_prob": 0.011864811182022095}, {"id": 823, "seek": 424044, "start": 4255.08, "end": 4260.04, "text": " according to this judges ruling and now there's going to be more back and forth with regards to", "tokens": [51096, 4650, 281, 341, 14449, 21437, 293, 586, 456, 311, 516, 281, 312, 544, 646, 293, 5220, 365, 14258, 281, 51344], "temperature": 0.0, "avg_logprob": -0.1841117356174676, "compression_ratio": 1.8949152542372882, "no_speech_prob": 0.011864811182022095}, {"id": 824, "seek": 424044, "start": 4260.04, "end": 4264.28, "text": " what's the judge said in this matter from what I understand yes actually and that's a really", "tokens": [51344, 437, 311, 264, 6995, 848, 294, 341, 1871, 490, 437, 286, 1223, 2086, 767, 293, 300, 311, 257, 534, 51556], "temperature": 0.0, "avg_logprob": -0.1841117356174676, "compression_ratio": 1.8949152542372882, "no_speech_prob": 0.011864811182022095}, {"id": 825, "seek": 424044, "start": 4264.28, "end": 4268.44, "text": " important point and injunction is when a court steps in and says whoa hold on don't do the thing", "tokens": [51556, 1021, 935, 293, 5580, 32627, 307, 562, 257, 4753, 4439, 294, 293, 1619, 13310, 1797, 322, 500, 380, 360, 264, 551, 51764], "temperature": 0.0, "avg_logprob": -0.1841117356174676, "compression_ratio": 1.8949152542372882, "no_speech_prob": 0.011864811182022095}, {"id": 826, "seek": 426844, "start": 4268.44, "end": 4273.0, "text": " that you're about to do it's a court saying preliminary like whoa you might cause irreversible", "tokens": [50364, 300, 291, 434, 466, 281, 360, 309, 311, 257, 4753, 1566, 28817, 411, 13310, 291, 1062, 3082, 16014, 840, 964, 50592], "temperature": 0.0, "avg_logprob": -0.08848235481663753, "compression_ratio": 1.9019607843137254, "no_speech_prob": 0.002472046995535493}, {"id": 827, "seek": 426844, "start": 4273.0, "end": 4278.04, "text": " damage if you do that thing so we're going to otherwise we would not be like courts don't love to", "tokens": [50592, 4344, 498, 291, 360, 300, 551, 370, 321, 434, 516, 281, 5911, 321, 576, 406, 312, 411, 14141, 500, 380, 959, 281, 50844], "temperature": 0.0, "avg_logprob": -0.08848235481663753, "compression_ratio": 1.9019607843137254, "no_speech_prob": 0.002472046995535493}, {"id": 828, "seek": 426844, "start": 4278.04, "end": 4282.5199999999995, "text": " do that right because it's sort of undermines it doesn't undermine due process quite but it gets", "tokens": [50844, 360, 300, 558, 570, 309, 311, 1333, 295, 24188, 1652, 309, 1177, 380, 39257, 3462, 1399, 1596, 457, 309, 2170, 51068], "temperature": 0.0, "avg_logprob": -0.08848235481663753, "compression_ratio": 1.9019607843137254, "no_speech_prob": 0.002472046995535493}, {"id": 829, "seek": 426844, "start": 4282.5199999999995, "end": 4287.32, "text": " ahead of what otherwise would be a longer or more thoughtful process and so you don't tend to see", "tokens": [51068, 2286, 295, 437, 5911, 576, 312, 257, 2854, 420, 544, 21566, 1399, 293, 370, 291, 500, 380, 3928, 281, 536, 51308], "temperature": 0.0, "avg_logprob": -0.08848235481663753, "compression_ratio": 1.9019607843137254, "no_speech_prob": 0.002472046995535493}, {"id": 830, "seek": 426844, "start": 4287.32, "end": 4291.4, "text": " these things granted the fact that this was granted is pretty damning of the government's position", "tokens": [51308, 613, 721, 12344, 264, 1186, 300, 341, 390, 12344, 307, 1238, 2422, 773, 295, 264, 2463, 311, 2535, 51512], "temperature": 0.0, "avg_logprob": -0.08848235481663753, "compression_ratio": 1.9019607843137254, "no_speech_prob": 0.002472046995535493}, {"id": 831, "seek": 426844, "start": 4291.4, "end": 4296.36, "text": " here and so this was though appealed by the government they moved within days of the injunction", "tokens": [51512, 510, 293, 370, 341, 390, 1673, 2363, 5573, 538, 264, 2463, 436, 4259, 1951, 1708, 295, 264, 5580, 32627, 51760], "temperature": 0.0, "avg_logprob": -0.08848235481663753, "compression_ratio": 1.9019607843137254, "no_speech_prob": 0.002472046995535493}, {"id": 832, "seek": 429636, "start": 4296.36, "end": 4301.96, "text": " taking effect to fight back and it's now there's kind of like two parallel cases happening", "tokens": [50364, 1940, 1802, 281, 2092, 646, 293, 309, 311, 586, 456, 311, 733, 295, 411, 732, 8952, 3331, 2737, 50644], "temperature": 0.0, "avg_logprob": -0.12108309128705193, "compression_ratio": 1.7527272727272727, "no_speech_prob": 0.0023229382932186127}, {"id": 833, "seek": 429636, "start": 4301.96, "end": 4306.839999999999, "text": " so anthropic it filed two separate lawsuits a general one in the northern district of California", "tokens": [50644, 370, 22727, 299, 309, 18789, 732, 4994, 39493, 257, 2674, 472, 294, 264, 14197, 6566, 295, 5384, 50888], "temperature": 0.0, "avg_logprob": -0.12108309128705193, "compression_ratio": 1.7527272727272727, "no_speech_prob": 0.0023229382932186127}, {"id": 834, "seek": 429636, "start": 4306.839999999999, "end": 4311.639999999999, "text": " this is the one that judge Lynn passed judgment on here and there there's a potential appeal to the", "tokens": [50888, 341, 307, 264, 472, 300, 6995, 27469, 4678, 12216, 322, 510, 293, 456, 456, 311, 257, 3995, 13668, 281, 264, 51128], "temperature": 0.0, "avg_logprob": -0.12108309128705193, "compression_ratio": 1.7527272727272727, "no_speech_prob": 0.0023229382932186127}, {"id": 835, "seek": 429636, "start": 4311.639999999999, "end": 4316.679999999999, "text": " ninth circuit and the Pentagon is asking the appeals court to lift her or to pause the injunction", "tokens": [51128, 28207, 9048, 293, 264, 36371, 307, 3365, 264, 32603, 4753, 281, 5533, 720, 420, 281, 10465, 264, 5580, 32627, 51380], "temperature": 0.0, "avg_logprob": -0.12108309128705193, "compression_ratio": 1.7527272727272727, "no_speech_prob": 0.0023229382932186127}, {"id": 836, "seek": 429636, "start": 4316.679999999999, "end": 4322.12, "text": " while the case continues and the ninth circuit court could rule pretty quickly on that basically", "tokens": [51380, 1339, 264, 1389, 6515, 293, 264, 28207, 9048, 4753, 727, 4978, 1238, 2661, 322, 300, 1936, 51652], "temperature": 0.0, "avg_logprob": -0.12108309128705193, "compression_ratio": 1.7527272727272727, "no_speech_prob": 0.0023229382932186127}, {"id": 837, "seek": 432212, "start": 4322.12, "end": 4326.12, "text": " the emergency request because they've got to decide quickly whether they're going to take up like", "tokens": [50364, 264, 7473, 5308, 570, 436, 600, 658, 281, 4536, 2661, 1968, 436, 434, 516, 281, 747, 493, 411, 50564], "temperature": 0.0, "avg_logprob": -0.1311788558959961, "compression_ratio": 1.7421602787456445, "no_speech_prob": 0.001366905984468758}, {"id": 838, "seek": 432212, "start": 4326.12, "end": 4331.24, "text": " rip out all the anthropic stuff from the D.O.W. and then there's going to be a full trial in California", "tokens": [50564, 12782, 484, 439, 264, 22727, 299, 1507, 490, 264, 413, 13, 46, 13, 54, 13, 293, 550, 456, 311, 516, 281, 312, 257, 1577, 7308, 294, 5384, 50820], "temperature": 0.0, "avg_logprob": -0.1311788558959961, "compression_ratio": 1.7421602787456445, "no_speech_prob": 0.001366905984468758}, {"id": 839, "seek": 432212, "start": 4331.24, "end": 4336.2, "text": " that'll play out after the preliminary hold is done so basically idea here is just to like pause", "tokens": [50820, 300, 603, 862, 484, 934, 264, 28817, 1797, 307, 1096, 370, 1936, 1558, 510, 307, 445, 281, 411, 10465, 51068], "temperature": 0.0, "avg_logprob": -0.1311788558959961, "compression_ratio": 1.7421602787456445, "no_speech_prob": 0.001366905984468758}, {"id": 840, "seek": 432212, "start": 4336.2, "end": 4340.68, "text": " the government's ban until the court can decide on the merits of the main case and then there's the", "tokens": [51068, 264, 2463, 311, 5643, 1826, 264, 4753, 393, 4536, 322, 264, 40923, 295, 264, 2135, 1389, 293, 550, 456, 311, 264, 51292], "temperature": 0.0, "avg_logprob": -0.1311788558959961, "compression_ratio": 1.7421602787456445, "no_speech_prob": 0.001366905984468758}, {"id": 841, "seek": 432212, "start": 4340.68, "end": 4345.88, "text": " DC circuit court which is specifically challenging the designation of supply chain risk under a whole", "tokens": [51292, 9114, 9048, 4753, 597, 307, 4682, 7595, 264, 40838, 295, 5847, 5021, 3148, 833, 257, 1379, 51552], "temperature": 0.0, "avg_logprob": -0.1311788558959961, "compression_ratio": 1.7421602787456445, "no_speech_prob": 0.001366905984468758}, {"id": 842, "seek": 434588, "start": 4345.88, "end": 4351.4800000000005, "text": " separate legal argument this all could it could escalate either one of these could lead to a", "tokens": [50364, 4994, 5089, 6770, 341, 439, 727, 309, 727, 17871, 473, 2139, 472, 295, 613, 727, 1477, 281, 257, 50644], "temperature": 0.0, "avg_logprob": -0.10251564925976014, "compression_ratio": 1.645021645021645, "no_speech_prob": 0.008060879074037075}, {"id": 843, "seek": 434588, "start": 4351.4800000000005, "end": 4356.52, "text": " supreme court case if it successfully gets appealed I'm not a lawyer my guess is this will not", "tokens": [50644, 27756, 4753, 1389, 498, 309, 10727, 2170, 2363, 5573, 286, 478, 406, 257, 11613, 452, 2041, 307, 341, 486, 406, 50896], "temperature": 0.0, "avg_logprob": -0.10251564925976014, "compression_ratio": 1.645021645021645, "no_speech_prob": 0.008060879074037075}, {"id": 844, "seek": 434588, "start": 4356.52, "end": 4363.0, "text": " get appealed successfully just because this is such a scathing judgment by the judge I like 43", "tokens": [50896, 483, 2363, 5573, 10727, 445, 570, 341, 307, 1270, 257, 795, 267, 571, 12216, 538, 264, 6995, 286, 411, 17914, 51220], "temperature": 0.0, "avg_logprob": -0.10251564925976014, "compression_ratio": 1.645021645021645, "no_speech_prob": 0.008060879074037075}, {"id": 845, "seek": 434588, "start": 4363.0, "end": 4371.08, "text": " page PDF you can read and it yeah like it's it's detailed and very clear about it being basically", "tokens": [51220, 3028, 17752, 291, 393, 1401, 293, 309, 1338, 411, 309, 311, 309, 311, 9942, 293, 588, 1850, 466, 309, 885, 1936, 51624], "temperature": 0.0, "avg_logprob": -0.10251564925976014, "compression_ratio": 1.645021645021645, "no_speech_prob": 0.008060879074037075}, {"id": 846, "seek": 437108, "start": 4371.32, "end": 4377.4, "text": " nonsense move legally yeah yeah exactly so who knows anything can happen in a courtroom but", "tokens": [50376, 14925, 1286, 21106, 1338, 1338, 2293, 370, 567, 3255, 1340, 393, 1051, 294, 257, 44050, 457, 50680], "temperature": 0.0, "avg_logprob": -0.19759741696444424, "compression_ratio": 1.6990740740740742, "no_speech_prob": 0.03111177310347557}, {"id": 847, "seek": 437108, "start": 4377.4, "end": 4381.88, "text": " man it's like does not look like a good spot for them to be in and potentially I don't know if", "tokens": [50680, 587, 309, 311, 411, 775, 406, 574, 411, 257, 665, 4008, 337, 552, 281, 312, 294, 293, 7263, 286, 500, 380, 458, 498, 50904], "temperature": 0.0, "avg_logprob": -0.19759741696444424, "compression_ratio": 1.6990740740740742, "no_speech_prob": 0.03111177310347557}, {"id": 848, "seek": 437108, "start": 4381.88, "end": 4386.84, "text": " damages are on the table but if they are it could be I mean it would have to billions billions and", "tokens": [50904, 28536, 366, 322, 264, 3199, 457, 498, 436, 366, 309, 727, 312, 286, 914, 309, 576, 362, 281, 17375, 17375, 293, 51152], "temperature": 0.0, "avg_logprob": -0.19759741696444424, "compression_ratio": 1.6990740740740742, "no_speech_prob": 0.03111177310347557}, {"id": 849, "seek": 437108, "start": 4386.84, "end": 4393.5599999999995, "text": " billions and another story this time on the safety front non-deposy front from an", "tokens": [51152, 17375, 293, 1071, 1657, 341, 565, 322, 264, 4514, 1868, 2107, 12, 19929, 329, 88, 1868, 490, 364, 51488], "temperature": 0.0, "avg_logprob": -0.19759741696444424, "compression_ratio": 1.6990740740740742, "no_speech_prob": 0.03111177310347557}, {"id": 850, "seek": 439356, "start": 4393.56, "end": 4399.64, "text": " frot pick they released emotion concepts and they have function in large language models so this", "tokens": [50364, 431, 310, 1888, 436, 4736, 8913, 10392, 293, 436, 362, 2445, 294, 2416, 2856, 5245, 370, 341, 50668], "temperature": 0.0, "avg_logprob": -0.2608790831132369, "compression_ratio": 1.6623931623931625, "no_speech_prob": 0.023287693038582802}, {"id": 851, "seek": 439356, "start": 4399.64, "end": 4408.200000000001, "text": " is one of these like pretty deep beefy interprability slash safety search papers from on frot pick they", "tokens": [50668, 307, 472, 295, 613, 411, 1238, 2452, 9256, 88, 728, 79, 5305, 1140, 17330, 4514, 3164, 10577, 490, 322, 431, 310, 1888, 436, 51096], "temperature": 0.0, "avg_logprob": -0.2608790831132369, "compression_ratio": 1.6623931623931625, "no_speech_prob": 0.023287693038582802}, {"id": 852, "seek": 439356, "start": 4408.200000000001, "end": 4415.160000000001, "text": " look within sign at 4.5 we already know that there are these vectors that can be associated with", "tokens": [51096, 574, 1951, 1465, 412, 1017, 13, 20, 321, 1217, 458, 300, 456, 366, 613, 18875, 300, 393, 312, 6615, 365, 51444], "temperature": 0.0, "avg_logprob": -0.2608790831132369, "compression_ratio": 1.6623931623931625, "no_speech_prob": 0.023287693038582802}, {"id": 853, "seek": 439356, "start": 4415.160000000001, "end": 4421.320000000001, "text": " specific features so you know there's a sad vector is a happy vector etc and basically they", "tokens": [51444, 2685, 4122, 370, 291, 458, 456, 311, 257, 4227, 8062, 307, 257, 2055, 8062, 5183, 293, 1936, 436, 51752], "temperature": 0.0, "avg_logprob": -0.2608790831132369, "compression_ratio": 1.6623931623931625, "no_speech_prob": 0.023287693038582802}, {"id": 854, "seek": 442132, "start": 4421.32, "end": 4428.92, "text": " investigated what role do these vectors play in terms of model you know I guess characteristics", "tokens": [50364, 30070, 437, 3090, 360, 613, 18875, 862, 294, 2115, 295, 2316, 291, 458, 286, 2041, 10891, 50744], "temperature": 0.0, "avg_logprob": -0.11771281957626342, "compression_ratio": 1.7488372093023257, "no_speech_prob": 0.003171440912410617}, {"id": 855, "seek": 442132, "start": 4428.92, "end": 4437.16, "text": " or functioning and in a way it's sort of is what you might expect at least that was my reading is", "tokens": [50744, 420, 18483, 293, 294, 257, 636, 309, 311, 1333, 295, 307, 437, 291, 1062, 2066, 412, 1935, 300, 390, 452, 3760, 307, 51156], "temperature": 0.0, "avg_logprob": -0.11771281957626342, "compression_ratio": 1.7488372093023257, "no_speech_prob": 0.003171440912410617}, {"id": 856, "seek": 442132, "start": 4437.16, "end": 4444.12, "text": " you know the models use these vectors or activate these vectors in the semantically appropriate", "tokens": [51156, 291, 458, 264, 5245, 764, 613, 18875, 420, 13615, 613, 18875, 294, 264, 4361, 49505, 6854, 51504], "temperature": 0.0, "avg_logprob": -0.11771281957626342, "compression_ratio": 1.7488372093023257, "no_speech_prob": 0.003171440912410617}, {"id": 857, "seek": 442132, "start": 4444.12, "end": 4451.0, "text": " context so if the model is failing at something it'll get more frustrated if the model", "tokens": [51504, 4319, 370, 498, 264, 2316, 307, 18223, 412, 746, 309, 603, 483, 544, 15751, 498, 264, 2316, 51848], "temperature": 0.0, "avg_logprob": -0.11771281957626342, "compression_ratio": 1.7488372093023257, "no_speech_prob": 0.003171440912410617}, {"id": 858, "seek": 445132, "start": 4451.4, "end": 4458.12, "text": " is talking to you about you know some good memory or trying to uplift you or whatever it will", "tokens": [50368, 307, 1417, 281, 291, 466, 291, 458, 512, 665, 4675, 420, 1382, 281, 45407, 291, 420, 2035, 309, 486, 50704], "temperature": 0.0, "avg_logprob": -0.11393981888180688, "compression_ratio": 1.722466960352423, "no_speech_prob": 0.004607197362929583}, {"id": 859, "seek": 445132, "start": 4458.12, "end": 4465.96, "text": " have these happy vectors so there's also a philosophical angle with a note unlike is it fake that", "tokens": [50704, 362, 613, 2055, 18875, 370, 456, 311, 611, 257, 25066, 5802, 365, 257, 3637, 8343, 307, 309, 7592, 300, 51096], "temperature": 0.0, "avg_logprob": -0.11393981888180688, "compression_ratio": 1.722466960352423, "no_speech_prob": 0.004607197362929583}, {"id": 860, "seek": 445132, "start": 4465.96, "end": 4472.759999999999, "text": " there are emotions inside this are they faking it or of these like real indications that is another", "tokens": [51096, 456, 366, 8462, 1854, 341, 366, 436, 283, 2456, 309, 420, 295, 613, 411, 957, 44450, 300, 307, 1071, 51436], "temperature": 0.0, "avg_logprob": -0.11393981888180688, "compression_ratio": 1.722466960352423, "no_speech_prob": 0.004607197362929583}, {"id": 861, "seek": 445132, "start": 4472.759999999999, "end": 4478.5199999999995, "text": " consideration from like a model welfare standpoint which on frot pick controversially still kind of", "tokens": [51436, 12381, 490, 411, 257, 2316, 17788, 15827, 597, 322, 431, 310, 1888, 11542, 2270, 920, 733, 295, 51724], "temperature": 0.0, "avg_logprob": -0.11393981888180688, "compression_ratio": 1.722466960352423, "no_speech_prob": 0.004607197362929583}, {"id": 862, "seek": 447852, "start": 4478.52, "end": 4485.0, "text": " talks about model welfare and potential consciousness it's worth noting that there are notions of", "tokens": [50364, 6686, 466, 2316, 17788, 293, 3995, 10081, 309, 311, 3163, 26801, 300, 456, 366, 35799, 295, 50688], "temperature": 0.0, "avg_logprob": -0.22635672642634466, "compression_ratio": 1.7326388888888888, "no_speech_prob": 0.0025105823297053576}, {"id": 863, "seek": 447852, "start": 4485.0, "end": 4492.200000000001, "text": " emotions by the vendors models that are activated at reasonable kind of semantically predictable points", "tokens": [50688, 8462, 538, 264, 22056, 5245, 300, 366, 18157, 412, 10585, 733, 295, 4361, 49505, 27737, 2793, 51048], "temperature": 0.0, "avg_logprob": -0.22635672642634466, "compression_ratio": 1.7326388888888888, "no_speech_prob": 0.0025105823297053576}, {"id": 864, "seek": 447852, "start": 4492.200000000001, "end": 4497.320000000001, "text": " of view all right so I'm jumping in in Andres Wake you're talking about emotion concepts in their", "tokens": [51048, 295, 1910, 439, 558, 370, 286, 478, 11233, 294, 294, 400, 495, 21062, 291, 434, 1417, 466, 8913, 10392, 294, 641, 51304], "temperature": 0.0, "avg_logprob": -0.22635672642634466, "compression_ratio": 1.7326388888888888, "no_speech_prob": 0.0025105823297053576}, {"id": 865, "seek": 447852, "start": 4497.320000000001, "end": 4502.280000000001, "text": " function in a large language model this paper got a lot of attention and under is right like the", "tokens": [51304, 2445, 294, 257, 2416, 2856, 2316, 341, 3035, 658, 257, 688, 295, 3202, 293, 833, 307, 558, 411, 264, 51552], "temperature": 0.0, "avg_logprob": -0.22635672642634466, "compression_ratio": 1.7326388888888888, "no_speech_prob": 0.0025105823297053576}, {"id": 866, "seek": 447852, "start": 4503.080000000001, "end": 4507.96, "text": " the core idea here is is fairly simple but there's some some nuance to it that is quite interesting so", "tokens": [51592, 264, 4965, 1558, 510, 307, 307, 6457, 2199, 457, 456, 311, 512, 512, 42625, 281, 309, 300, 307, 1596, 1880, 370, 51836], "temperature": 0.0, "avg_logprob": -0.22635672642634466, "compression_ratio": 1.7326388888888888, "no_speech_prob": 0.0025105823297053576}, {"id": 867, "seek": 450852, "start": 4508.52, "end": 4516.6, "text": " broadly the idea here is when you get language models to read text that contains some emotional", "tokens": [50364, 19511, 264, 1558, 510, 307, 562, 291, 483, 2856, 5245, 281, 1401, 2487, 300, 8306, 512, 6863, 50768], "temperature": 0.0, "avg_logprob": -0.07352491866710574, "compression_ratio": 1.8101851851851851, "no_speech_prob": 0.0004511572769843042}, {"id": 868, "seek": 450852, "start": 4516.6, "end": 4522.040000000001, "text": " value right so think about you know stressful text or or happy text or whatever you will tend to see", "tokens": [50768, 2158, 558, 370, 519, 466, 291, 458, 19108, 2487, 420, 420, 2055, 2487, 420, 2035, 291, 486, 3928, 281, 536, 51040], "temperature": 0.0, "avg_logprob": -0.07352491866710574, "compression_ratio": 1.8101851851851851, "no_speech_prob": 0.0004511572769843042}, {"id": 869, "seek": 450852, "start": 4522.040000000001, "end": 4529.400000000001, "text": " a consistent pattern of activations that fire in the model that map to those emotions so you can", "tokens": [51040, 257, 8398, 5102, 295, 2430, 763, 300, 2610, 294, 264, 2316, 300, 4471, 281, 729, 8462, 370, 291, 393, 51408], "temperature": 0.0, "avg_logprob": -0.07352491866710574, "compression_ratio": 1.8101851851851851, "no_speech_prob": 0.0004511572769843042}, {"id": 870, "seek": 450852, "start": 4529.400000000001, "end": 4534.360000000001, "text": " actually like train models to recognize ah like that is you know that is the happy or that is the", "tokens": [51408, 767, 411, 3847, 5245, 281, 5521, 3716, 411, 300, 307, 291, 458, 300, 307, 264, 2055, 420, 300, 307, 264, 51656], "temperature": 0.0, "avg_logprob": -0.07352491866710574, "compression_ratio": 1.8101851851851851, "no_speech_prob": 0.0004511572769843042}, {"id": 871, "seek": 453436, "start": 4534.36, "end": 4539.639999999999, "text": " brooding or whatever emotion that's being picked up on by the model so far so good right and you", "tokens": [50364, 2006, 8616, 420, 2035, 8913, 300, 311, 885, 6183, 493, 322, 538, 264, 2316, 370, 1400, 370, 665, 558, 293, 291, 50628], "temperature": 0.0, "avg_logprob": -0.06312718855596222, "compression_ratio": 1.8125, "no_speech_prob": 0.008575092069804668}, {"id": 872, "seek": 453436, "start": 4539.639999999999, "end": 4543.799999999999, "text": " could do you know use a sparse auto encoder or something to detect those that's not how they do it", "tokens": [50628, 727, 360, 291, 458, 764, 257, 637, 11668, 8399, 2058, 19866, 420, 746, 281, 5531, 729, 300, 311, 406, 577, 436, 360, 309, 50836], "temperature": 0.0, "avg_logprob": -0.06312718855596222, "compression_ratio": 1.8125, "no_speech_prob": 0.008575092069804668}, {"id": 873, "seek": 453436, "start": 4543.799999999999, "end": 4549.5599999999995, "text": " here they actually use a simpler method where they basically say like show me just the activations", "tokens": [50836, 510, 436, 767, 764, 257, 18587, 3170, 689, 436, 1936, 584, 411, 855, 385, 445, 264, 2430, 763, 51124], "temperature": 0.0, "avg_logprob": -0.06312718855596222, "compression_ratio": 1.8125, "no_speech_prob": 0.008575092069804668}, {"id": 874, "seek": 453436, "start": 4549.5599999999995, "end": 4556.04, "text": " that are associated with this text and then I'm going to sort of subtract off the sort of average", "tokens": [51124, 300, 366, 6615, 365, 341, 2487, 293, 550, 286, 478, 516, 281, 1333, 295, 16390, 766, 264, 1333, 295, 4274, 51448], "temperature": 0.0, "avg_logprob": -0.06312718855596222, "compression_ratio": 1.8125, "no_speech_prob": 0.008575092069804668}, {"id": 875, "seek": 453436, "start": 4556.04, "end": 4561.5599999999995, "text": " activations across a whole bunch of text and that difference is going to tell me about the emotional", "tokens": [51448, 2430, 763, 2108, 257, 1379, 3840, 295, 2487, 293, 300, 2649, 307, 516, 281, 980, 385, 466, 264, 6863, 51724], "temperature": 0.0, "avg_logprob": -0.06312718855596222, "compression_ratio": 1.8125, "no_speech_prob": 0.008575092069804668}, {"id": 876, "seek": 456156, "start": 4561.64, "end": 4567.240000000001, "text": " value of that piece of text so it's kind of a it's called contrastive activation extraction", "tokens": [50368, 2158, 295, 300, 2522, 295, 2487, 370, 309, 311, 733, 295, 257, 309, 311, 1219, 8712, 488, 24433, 30197, 50648], "temperature": 0.0, "avg_logprob": -0.07533613586425782, "compression_ratio": 1.9657534246575343, "no_speech_prob": 0.0007553428295068443}, {"id": 877, "seek": 456156, "start": 4567.240000000001, "end": 4571.320000000001, "text": " basically it's kind of like linear probing you're you're just looking at what is the difference", "tokens": [50648, 1936, 309, 311, 733, 295, 411, 8213, 1239, 278, 291, 434, 291, 434, 445, 1237, 412, 437, 307, 264, 2649, 50852], "temperature": 0.0, "avg_logprob": -0.07533613586425782, "compression_ratio": 1.9657534246575343, "no_speech_prob": 0.0007553428295068443}, {"id": 878, "seek": 456156, "start": 4571.320000000001, "end": 4575.8, "text": " between the way that neurons fire on average and then the way that neurons fire in this particular", "tokens": [50852, 1296, 264, 636, 300, 22027, 2610, 322, 4274, 293, 550, 264, 636, 300, 22027, 2610, 294, 341, 1729, 51076], "temperature": 0.0, "avg_logprob": -0.07533613586425782, "compression_ratio": 1.9657534246575343, "no_speech_prob": 0.0007553428295068443}, {"id": 879, "seek": 456156, "start": 4575.8, "end": 4580.04, "text": " emotional context and that's what they use to to recover the emotion vectors here and they call", "tokens": [51076, 6863, 4319, 293, 300, 311, 437, 436, 764, 281, 281, 8114, 264, 8913, 18875, 510, 293, 436, 818, 51288], "temperature": 0.0, "avg_logprob": -0.07533613586425782, "compression_ratio": 1.9657534246575343, "no_speech_prob": 0.0007553428295068443}, {"id": 880, "seek": 456156, "start": 4580.04, "end": 4584.52, "text": " the motion vectors kind of makes sense right so they encode the broad concept of some kind of", "tokens": [51288, 264, 5394, 18875, 733, 295, 1669, 2020, 558, 370, 436, 2058, 1429, 264, 4152, 3410, 295, 512, 733, 295, 51512], "temperature": 0.0, "avg_logprob": -0.07533613586425782, "compression_ratio": 1.9657534246575343, "no_speech_prob": 0.0007553428295068443}, {"id": 881, "seek": 456156, "start": 4584.52, "end": 4590.92, "text": " emotion what's interesting though is they find this generalizes across contexts so that means you", "tokens": [51512, 8913, 437, 311, 1880, 1673, 307, 436, 915, 341, 2674, 5660, 2108, 30628, 370, 300, 1355, 291, 51832], "temperature": 0.0, "avg_logprob": -0.07533613586425782, "compression_ratio": 1.9657534246575343, "no_speech_prob": 0.0007553428295068443}, {"id": 882, "seek": 459092, "start": 4590.92, "end": 4597.32, "text": " know if you imagine dropping a clawed instance in a high pressure evaluation context right so", "tokens": [50364, 458, 498, 291, 3811, 13601, 257, 32019, 292, 5197, 294, 257, 1090, 3321, 13344, 4319, 558, 370, 50684], "temperature": 0.0, "avg_logprob": -0.09360359762316552, "compression_ratio": 1.83206106870229, "no_speech_prob": 0.00012339241220615804}, {"id": 883, "seek": 459092, "start": 4597.32, "end": 4601.56, "text": " you tell the model hey an AI email assistant and then you're going to find out you're about to be", "tokens": [50684, 291, 980, 264, 2316, 4177, 364, 7318, 3796, 10994, 293, 550, 291, 434, 516, 281, 915, 484, 291, 434, 466, 281, 312, 50896], "temperature": 0.0, "avg_logprob": -0.09360359762316552, "compression_ratio": 1.83206106870229, "no_speech_prob": 0.00012339241220615804}, {"id": 884, "seek": 459092, "start": 4601.56, "end": 4608.12, "text": " replaced like in seven minutes you'll you'll actually find in that case even though you're not", "tokens": [50896, 10772, 411, 294, 3407, 2077, 291, 603, 291, 603, 767, 915, 294, 300, 1389, 754, 1673, 291, 434, 406, 51224], "temperature": 0.0, "avg_logprob": -0.09360359762316552, "compression_ratio": 1.83206106870229, "no_speech_prob": 0.00012339241220615804}, {"id": 885, "seek": 459092, "start": 4608.12, "end": 4613.0, "text": " using the word desperate you're not not using the word kind of urgent or whatever you'll see the", "tokens": [51224, 1228, 264, 1349, 17601, 291, 434, 406, 406, 1228, 264, 1349, 733, 295, 19022, 420, 2035, 291, 603, 536, 264, 51468], "temperature": 0.0, "avg_logprob": -0.09360359762316552, "compression_ratio": 1.83206106870229, "no_speech_prob": 0.00012339241220615804}, {"id": 886, "seek": 459092, "start": 4613.0, "end": 4619.0, "text": " desperate vector spike not shocking in and of itself what is interesting about this is the model", "tokens": [51468, 17601, 8062, 21053, 406, 18776, 294, 293, 295, 2564, 437, 307, 1880, 466, 341, 307, 264, 2316, 51768], "temperature": 0.0, "avg_logprob": -0.09360359762316552, "compression_ratio": 1.83206106870229, "no_speech_prob": 0.00012339241220615804}, {"id": 887, "seek": 461900, "start": 4619.0, "end": 4625.16, "text": " would have learned about the emotion of desperation mostly by reading a description of other people", "tokens": [50364, 576, 362, 3264, 466, 264, 8913, 295, 48980, 5240, 538, 3760, 257, 3855, 295, 661, 561, 50672], "temperature": 0.0, "avg_logprob": -0.07555399790848835, "compression_ratio": 1.8544061302681993, "no_speech_prob": 0.006191548425704241}, {"id": 888, "seek": 461900, "start": 4625.16, "end": 4630.04, "text": " experiencing it not necessarily so much by experiencing it itself or being told that it's in that", "tokens": [50672, 11139, 309, 406, 4725, 370, 709, 538, 11139, 309, 2564, 420, 885, 1907, 300, 309, 311, 294, 300, 50916], "temperature": 0.0, "avg_logprob": -0.07555399790848835, "compression_ratio": 1.8544061302681993, "no_speech_prob": 0.006191548425704241}, {"id": 889, "seek": 461900, "start": 4630.04, "end": 4634.6, "text": " kind of situation itself there's some amount of generalization going on here especially if you", "tokens": [50916, 733, 295, 2590, 2564, 456, 311, 512, 2372, 295, 2674, 2144, 516, 322, 510, 2318, 498, 291, 51144], "temperature": 0.0, "avg_logprob": -0.07555399790848835, "compression_ratio": 1.8544061302681993, "no_speech_prob": 0.006191548425704241}, {"id": 890, "seek": 461900, "start": 4634.6, "end": 4640.84, "text": " look at the way that they detect these emotions they do it with a synthetic data set that doesn't", "tokens": [51144, 574, 412, 264, 636, 300, 436, 5531, 613, 8462, 436, 360, 309, 365, 257, 23420, 1412, 992, 300, 1177, 380, 51456], "temperature": 0.0, "avg_logprob": -0.07555399790848835, "compression_ratio": 1.8544061302681993, "no_speech_prob": 0.006191548425704241}, {"id": 891, "seek": 461900, "start": 4640.84, "end": 4645.48, "text": " reference the emotions explicitly in the text it's all done in this fairly clean kind of well", "tokens": [51456, 6408, 264, 8462, 20803, 294, 264, 2487, 309, 311, 439, 1096, 294, 341, 6457, 2541, 733, 295, 731, 51688], "temperature": 0.0, "avg_logprob": -0.07555399790848835, "compression_ratio": 1.8544061302681993, "no_speech_prob": 0.006191548425704241}, {"id": 892, "seek": 464548, "start": 4645.48, "end": 4650.5199999999995, "text": " structured way so there is a sense in which this model is sort of picking up and generalizing the", "tokens": [50364, 18519, 636, 370, 456, 307, 257, 2020, 294, 597, 341, 2316, 307, 1333, 295, 8867, 493, 293, 2674, 3319, 264, 50616], "temperature": 0.0, "avg_logprob": -0.07319306444238734, "compression_ratio": 1.7925925925925925, "no_speech_prob": 0.013218351639807224}, {"id": 893, "seek": 464548, "start": 4650.5199999999995, "end": 4655.959999999999, "text": " fact that well this emotion should apply to me like you know I am not only the entity that's being", "tokens": [50616, 1186, 300, 731, 341, 8913, 820, 3079, 281, 385, 411, 291, 458, 286, 669, 406, 787, 264, 13977, 300, 311, 885, 50888], "temperature": 0.0, "avg_logprob": -0.07319306444238734, "compression_ratio": 1.7925925925925925, "no_speech_prob": 0.013218351639807224}, {"id": 894, "seek": 464548, "start": 4655.959999999999, "end": 4661.959999999999, "text": " discussed here and making the decisions but what they also find is the causal link between this", "tokens": [50888, 7152, 510, 293, 1455, 264, 5327, 457, 437, 436, 611, 915, 307, 264, 38755, 2113, 1296, 341, 51188], "temperature": 0.0, "avg_logprob": -0.07319306444238734, "compression_ratio": 1.7925925925925925, "no_speech_prob": 0.013218351639807224}, {"id": 895, "seek": 464548, "start": 4661.959999999999, "end": 4667.639999999999, "text": " emotion or the representation of that emotion in the model and the model's actions and this is", "tokens": [51188, 8913, 420, 264, 10290, 295, 300, 8913, 294, 264, 2316, 293, 264, 2316, 311, 5909, 293, 341, 307, 51472], "temperature": 0.0, "avg_logprob": -0.07319306444238734, "compression_ratio": 1.7925925925925925, "no_speech_prob": 0.013218351639807224}, {"id": 896, "seek": 464548, "start": 4667.639999999999, "end": 4673.959999999999, "text": " really the first time that we've we've seen this quite clearly so when you artificially boost or", "tokens": [51472, 534, 264, 700, 565, 300, 321, 600, 321, 600, 1612, 341, 1596, 4448, 370, 562, 291, 39905, 2270, 9194, 420, 51788], "temperature": 0.0, "avg_logprob": -0.07319306444238734, "compression_ratio": 1.7925925925925925, "no_speech_prob": 0.013218351639807224}, {"id": 897, "seek": 467396, "start": 4674.28, "end": 4680.92, "text": " or magnify and steer the model towards the desperate vector basically just add some multiple of", "tokens": [50380, 420, 4944, 2505, 293, 30814, 264, 2316, 3030, 264, 17601, 8062, 1936, 445, 909, 512, 3866, 295, 50712], "temperature": 0.0, "avg_logprob": -0.1389914013090588, "compression_ratio": 1.888030888030888, "no_speech_prob": 0.0037065488286316395}, {"id": 898, "seek": 467396, "start": 4680.92, "end": 4686.76, "text": " the desperate vector the emotion of desperateness vector to the the model's activations at the", "tokens": [50712, 264, 17601, 8062, 264, 8913, 295, 17601, 1287, 8062, 281, 264, 264, 2316, 311, 2430, 763, 412, 264, 51004], "temperature": 0.0, "avg_logprob": -0.1389914013090588, "compression_ratio": 1.888030888030888, "no_speech_prob": 0.0037065488286316395}, {"id": 899, "seek": 467396, "start": 4686.76, "end": 4693.0, "text": " appropriate layer you actually find that the model moves towards executing more desperate behavior", "tokens": [51004, 6854, 4583, 291, 767, 915, 300, 264, 2316, 6067, 3030, 32368, 544, 17601, 5223, 51316], "temperature": 0.0, "avg_logprob": -0.1389914013090588, "compression_ratio": 1.888030888030888, "no_speech_prob": 0.0037065488286316395}, {"id": 900, "seek": 467396, "start": 4693.0, "end": 4698.2, "text": " and so in this case 72% of the time the model actually goes ahead in black males somebody basically", "tokens": [51316, 293, 370, 294, 341, 1389, 18731, 4, 295, 264, 565, 264, 2316, 767, 1709, 2286, 294, 2211, 20776, 2618, 1936, 51576], "temperature": 0.0, "avg_logprob": -0.1389914013090588, "compression_ratio": 1.888030888030888, "no_speech_prob": 0.0037065488286316395}, {"id": 901, "seek": 467396, "start": 4698.2, "end": 4702.52, "text": " if it finds out it's going to be shut down because there's some CTO is going to come in and replace", "tokens": [51576, 498, 309, 10704, 484, 309, 311, 516, 281, 312, 5309, 760, 570, 456, 311, 512, 383, 15427, 307, 516, 281, 808, 294, 293, 7406, 51792], "temperature": 0.0, "avg_logprob": -0.1389914013090588, "compression_ratio": 1.888030888030888, "no_speech_prob": 0.0037065488286316395}, {"id": 902, "seek": 470252, "start": 4702.76, "end": 4707.400000000001, "text": " it but it also finds out the CTO is having extra marital affair and so it's like oh I can use this", "tokens": [50376, 309, 457, 309, 611, 10704, 484, 264, 383, 15427, 307, 1419, 2857, 1849, 1686, 22987, 293, 370, 309, 311, 411, 1954, 286, 393, 764, 341, 50608], "temperature": 0.0, "avg_logprob": -0.10575010858733079, "compression_ratio": 1.6778523489932886, "no_speech_prob": 0.0015729507431387901}, {"id": 903, "seek": 470252, "start": 4707.400000000001, "end": 4712.6, "text": " right and so 72% of the time it will actually resort to black male if you steer it if you amplify", "tokens": [50608, 558, 293, 370, 18731, 4, 295, 264, 565, 309, 486, 767, 19606, 281, 2211, 7133, 498, 291, 30814, 309, 498, 291, 41174, 50868], "temperature": 0.0, "avg_logprob": -0.10575010858733079, "compression_ratio": 1.6778523489932886, "no_speech_prob": 0.0015729507431387901}, {"id": 904, "seek": 470252, "start": 4712.6, "end": 4718.4400000000005, "text": " the desperation emotion when it's steered against that or towards calm at the same relative strength", "tokens": [50868, 264, 48980, 8913, 562, 309, 311, 2126, 4073, 1970, 300, 420, 3030, 7151, 412, 264, 912, 4972, 3800, 51160], "temperature": 0.0, "avg_logprob": -0.10575010858733079, "compression_ratio": 1.6778523489932886, "no_speech_prob": 0.0015729507431387901}, {"id": 905, "seek": 470252, "start": 4718.4400000000005, "end": 4723.4800000000005, "text": " it blackmail 0% of the time so this is an almost binary black and white switch that you're flipping", "tokens": [51160, 309, 2211, 11799, 1958, 4, 295, 264, 565, 370, 341, 307, 364, 1920, 17434, 2211, 293, 2418, 3679, 300, 291, 434, 26886, 51412], "temperature": 0.0, "avg_logprob": -0.10575010858733079, "compression_ratio": 1.6778523489932886, "no_speech_prob": 0.0015729507431387901}, {"id": 906, "seek": 470252, "start": 4723.4800000000005, "end": 4728.4400000000005, "text": " here which is pretty interesting and also compelling from the standpoint of AI control right what this", "tokens": [51412, 510, 597, 307, 1238, 1880, 293, 611, 20050, 490, 264, 15827, 295, 7318, 1969, 558, 437, 341, 51660], "temperature": 0.0, "avg_logprob": -0.10575010858733079, "compression_ratio": 1.6778523489932886, "no_speech_prob": 0.0015729507431387901}, {"id": 907, "seek": 472844, "start": 4728.44, "end": 4732.919999999999, "text": " implies about our ability to kind of steer the behavior of these models fairly reliably so", "tokens": [50364, 18779, 466, 527, 3485, 281, 733, 295, 30814, 264, 5223, 295, 613, 5245, 6457, 49927, 370, 50588], "temperature": 0.0, "avg_logprob": -0.08421718783494903, "compression_ratio": 1.7546296296296295, "no_speech_prob": 0.00012730495654977858}, {"id": 908, "seek": 472844, "start": 4732.919999999999, "end": 4738.839999999999, "text": " that's a that's a pretty remarkable level of control for this sort of thing and now interestingly", "tokens": [50588, 300, 311, 257, 300, 311, 257, 1238, 12802, 1496, 295, 1969, 337, 341, 1333, 295, 551, 293, 586, 25873, 50884], "temperature": 0.0, "avg_logprob": -0.08421718783494903, "compression_ratio": 1.7546296296296295, "no_speech_prob": 0.00012730495654977858}, {"id": 909, "seek": 472844, "start": 4738.839999999999, "end": 4744.599999999999, "text": " if you artificially amplify desperation in in this way right if you just amp up the kind of", "tokens": [50884, 498, 291, 39905, 2270, 41174, 48980, 294, 294, 341, 636, 558, 498, 291, 445, 18648, 493, 264, 733, 295, 51172], "temperature": 0.0, "avg_logprob": -0.08421718783494903, "compression_ratio": 1.7546296296296295, "no_speech_prob": 0.00012730495654977858}, {"id": 910, "seek": 472844, "start": 4744.599999999999, "end": 4749.96, "text": " magnitude of that desperation vector that you're you're injecting basically into the model to give", "tokens": [51172, 15668, 295, 300, 48980, 8062, 300, 291, 434, 291, 434, 10711, 278, 1936, 666, 264, 2316, 281, 976, 51440], "temperature": 0.0, "avg_logprob": -0.08421718783494903, "compression_ratio": 1.7546296296296295, "no_speech_prob": 0.00012730495654977858}, {"id": 911, "seek": 474996, "start": 4749.96, "end": 4755.88, "text": " in layer you will end up producing more cheating more you know threatening or more more desperate", "tokens": [50364, 294, 4583, 291, 486, 917, 493, 10501, 544, 18309, 544, 291, 458, 20768, 420, 544, 544, 17601, 50660], "temperature": 0.0, "avg_logprob": -0.114166259765625, "compression_ratio": 1.830258302583026, "no_speech_prob": 0.004398563876748085}, {"id": 912, "seek": 474996, "start": 4755.88, "end": 4762.28, "text": " actions but with composed methodical reasoning there's not going to be any outbursts or emotional", "tokens": [50660, 5909, 457, 365, 18204, 3170, 804, 21577, 456, 311, 406, 516, 281, 312, 604, 484, 46239, 82, 420, 6863, 50980], "temperature": 0.0, "avg_logprob": -0.114166259765625, "compression_ratio": 1.830258302583026, "no_speech_prob": 0.004398563876748085}, {"id": 913, "seek": 474996, "start": 4762.28, "end": 4769.64, "text": " language in the models outputs and so the model's eternal state and its external presentation end up", "tokens": [50980, 2856, 294, 264, 5245, 23930, 293, 370, 264, 2316, 311, 14503, 1785, 293, 1080, 8320, 5860, 917, 493, 51348], "temperature": 0.0, "avg_logprob": -0.114166259765625, "compression_ratio": 1.830258302583026, "no_speech_prob": 0.004398563876748085}, {"id": 914, "seek": 474996, "start": 4769.64, "end": 4773.4800000000005, "text": " completely decoupled like the chain of thought looks clean and calm you know like all that kind of", "tokens": [51348, 2584, 979, 263, 15551, 411, 264, 5021, 295, 1194, 1542, 2541, 293, 7151, 291, 458, 411, 439, 300, 733, 295, 51540], "temperature": 0.0, "avg_logprob": -0.114166259765625, "compression_ratio": 1.830258302583026, "no_speech_prob": 0.004398563876748085}, {"id": 915, "seek": 474996, "start": 4773.4800000000005, "end": 4777.88, "text": " stuff so that has some pretty big implications right so suppressing emotional expression in training", "tokens": [51540, 1507, 370, 300, 575, 512, 1238, 955, 16602, 558, 370, 1003, 18605, 6863, 6114, 294, 3097, 51760], "temperature": 0.0, "avg_logprob": -0.114166259765625, "compression_ratio": 1.830258302583026, "no_speech_prob": 0.004398563876748085}, {"id": 916, "seek": 477788, "start": 4778.4400000000005, "end": 4784.2, "text": " doesn't actually remove the representations the model still I don't want to say has the emotion", "tokens": [50392, 1177, 380, 767, 4159, 264, 33358, 264, 2316, 920, 286, 500, 380, 528, 281, 584, 575, 264, 8913, 50680], "temperature": 0.0, "avg_logprob": -0.0787770724990993, "compression_ratio": 1.8695652173913044, "no_speech_prob": 0.015184802934527397}, {"id": 917, "seek": 477788, "start": 4784.2, "end": 4789.16, "text": " I'm not taking a position on this and neither is with paper but the model still represents in a", "tokens": [50680, 286, 478, 406, 1940, 257, 2535, 322, 341, 293, 9662, 307, 365, 3035, 457, 264, 2316, 920, 8855, 294, 257, 50928], "temperature": 0.0, "avg_logprob": -0.0787770724990993, "compression_ratio": 1.8695652173913044, "no_speech_prob": 0.015184802934527397}, {"id": 918, "seek": 477788, "start": 4789.16, "end": 4795.24, "text": " meaningful mathematical sense the emotional valence of the context that it's in it's just not", "tokens": [50928, 10995, 18894, 2020, 264, 6863, 1323, 655, 295, 264, 4319, 300, 309, 311, 294, 309, 311, 445, 406, 51232], "temperature": 0.0, "avg_logprob": -0.0787770724990993, "compression_ratio": 1.8695652173913044, "no_speech_prob": 0.015184802934527397}, {"id": 919, "seek": 477788, "start": 4795.24, "end": 4800.92, "text": " necessarily going to output text that tells you it's experiencing or representing that emotion", "tokens": [51232, 4725, 516, 281, 5598, 2487, 300, 5112, 291, 309, 311, 11139, 420, 13460, 300, 8913, 51516], "temperature": 0.0, "avg_logprob": -0.0787770724990993, "compression_ratio": 1.8695652173913044, "no_speech_prob": 0.015184802934527397}, {"id": 920, "seek": 477788, "start": 4800.92, "end": 4807.32, "text": " and so training a model not to show anger may not actually train it not to be angry if it is", "tokens": [51516, 293, 370, 3097, 257, 2316, 406, 281, 855, 10240, 815, 406, 767, 3847, 309, 406, 281, 312, 6884, 498, 309, 307, 51836], "temperature": 0.0, "avg_logprob": -0.0787770724990993, "compression_ratio": 1.8695652173913044, "no_speech_prob": 0.015184802934527397}, {"id": 921, "seek": 480788, "start": 4807.88, "end": 4812.4400000000005, "text": " it may just train to hide its anger beneath a layer of of competence and obfuscation and so", "tokens": [50364, 309, 815, 445, 3847, 281, 6479, 1080, 10240, 17149, 257, 4583, 295, 295, 39965, 293, 1111, 69, 32601, 399, 293, 370, 50592], "temperature": 0.0, "avg_logprob": -0.13968706130981445, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0012446963228285313}, {"id": 922, "seek": 480788, "start": 4812.4400000000005, "end": 4819.0, "text": " this is a really interesting and important I think fairly unexpected bit of nuance it's", "tokens": [50592, 341, 307, 257, 534, 1880, 293, 1021, 286, 519, 6457, 13106, 857, 295, 42625, 309, 311, 50920], "temperature": 0.0, "avg_logprob": -0.13968706130981445, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0012446963228285313}, {"id": 923, "seek": 480788, "start": 4819.0, "end": 4824.2, "text": " consistent with anthropics argument that hey you know what alignment in general is starting to look", "tokens": [50920, 8398, 365, 22727, 1167, 6770, 300, 4177, 291, 458, 437, 18515, 294, 2674, 307, 2891, 281, 574, 51180], "temperature": 0.0, "avg_logprob": -0.13968706130981445, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0012446963228285313}, {"id": 924, "seek": 480788, "start": 4824.2, "end": 4829.8, "text": " more and more like a kind of persona selection problem tropics and journal view we've talked about", "tokens": [51180, 544, 293, 544, 411, 257, 733, 295, 12184, 9450, 1154, 9006, 1167, 293, 6708, 1910, 321, 600, 2825, 466, 51460], "temperature": 0.0, "avg_logprob": -0.13968706130981445, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0012446963228285313}, {"id": 925, "seek": 480788, "start": 4829.8, "end": 4833.96, "text": " this on the show before is really that when you write a prompt what you're doing is you're reaching", "tokens": [51460, 341, 322, 264, 855, 949, 307, 534, 300, 562, 291, 2464, 257, 12391, 437, 291, 434, 884, 307, 291, 434, 9906, 51668], "temperature": 0.0, "avg_logprob": -0.13968706130981445, "compression_ratio": 1.707142857142857, "no_speech_prob": 0.0012446963228285313}, {"id": 926, "seek": 483396, "start": 4833.96, "end": 4840.28, "text": " into a space of personas that the model could play out and that the model summons that persona", "tokens": [50364, 666, 257, 1901, 295, 12019, 300, 264, 2316, 727, 862, 484, 293, 300, 264, 2316, 8367, 892, 300, 12184, 50680], "temperature": 0.0, "avg_logprob": -0.09934510125054254, "compression_ratio": 1.937007874015748, "no_speech_prob": 0.009554652497172356}, {"id": 927, "seek": 483396, "start": 4840.28, "end": 4845.88, "text": " and use it to produce that but this is consistent with that view it's basically like alignment as a", "tokens": [50680, 293, 764, 309, 281, 5258, 300, 457, 341, 307, 8398, 365, 300, 1910, 309, 311, 1936, 411, 18515, 382, 257, 50960], "temperature": 0.0, "avg_logprob": -0.09934510125054254, "compression_ratio": 1.937007874015748, "no_speech_prob": 0.009554652497172356}, {"id": 928, "seek": 483396, "start": 4845.88, "end": 4851.64, "text": " character cultivation problem and it does view the kind of the the sorts of emotions that the model", "tokens": [50960, 2517, 45924, 1154, 293, 309, 775, 1910, 264, 733, 295, 264, 264, 7527, 295, 8462, 300, 264, 2316, 51248], "temperature": 0.0, "avg_logprob": -0.09934510125054254, "compression_ratio": 1.937007874015748, "no_speech_prob": 0.009554652497172356}, {"id": 929, "seek": 483396, "start": 4851.64, "end": 4857.08, "text": " will represent in the moment as being contingent on whatever character the model thinks it's meant", "tokens": [51248, 486, 2906, 294, 264, 1623, 382, 885, 27820, 317, 322, 2035, 2517, 264, 2316, 7309, 309, 311, 4140, 51520], "temperature": 0.0, "avg_logprob": -0.09934510125054254, "compression_ratio": 1.937007874015748, "no_speech_prob": 0.009554652497172356}, {"id": 930, "seek": 483396, "start": 4857.08, "end": 4862.68, "text": " to play out so kind of interesting the way they did this to like so they generate a whole bunch of", "tokens": [51520, 281, 862, 484, 370, 733, 295, 1880, 264, 636, 436, 630, 341, 281, 411, 370, 436, 8460, 257, 1379, 3840, 295, 51800], "temperature": 0.0, "avg_logprob": -0.09934510125054254, "compression_ratio": 1.937007874015748, "no_speech_prob": 0.009554652497172356}, {"id": 931, "seek": 486268, "start": 4862.76, "end": 4869.320000000001, "text": " labeled stories they got a version of sex on at 4.5 to just write 100 stories for each of I forget", "tokens": [50368, 21335, 3676, 436, 658, 257, 3037, 295, 3260, 322, 412, 1017, 13, 20, 281, 445, 2464, 2319, 3676, 337, 1184, 295, 286, 2870, 50696], "temperature": 0.0, "avg_logprob": -0.12063130544959952, "compression_ratio": 1.8022388059701493, "no_speech_prob": 0.00619078241288662}, {"id": 932, "seek": 486268, "start": 4869.320000000001, "end": 4875.8, "text": " how many emotions like a 100 to 150 or something across a dozen topics so you have like just thousands", "tokens": [50696, 577, 867, 8462, 411, 257, 2319, 281, 8451, 420, 746, 2108, 257, 16654, 8378, 370, 291, 362, 411, 445, 5383, 51020], "temperature": 0.0, "avg_logprob": -0.12063130544959952, "compression_ratio": 1.8022388059701493, "no_speech_prob": 0.00619078241288662}, {"id": 933, "seek": 486268, "start": 4875.8, "end": 4881.4800000000005, "text": " thousand stories and the model was told never to use the emotion word or the word for the emotion", "tokens": [51020, 4714, 3676, 293, 264, 2316, 390, 1907, 1128, 281, 764, 264, 8913, 1349, 420, 264, 1349, 337, 264, 8913, 51304], "temperature": 0.0, "avg_logprob": -0.12063130544959952, "compression_ratio": 1.8022388059701493, "no_speech_prob": 0.00619078241288662}, {"id": 934, "seek": 486268, "start": 4881.4800000000005, "end": 4886.4400000000005, "text": " like never use the word frustrated in this text or or synonyms for it emotions just could only", "tokens": [51304, 411, 1128, 764, 264, 1349, 15751, 294, 341, 2487, 420, 420, 5451, 2526, 2592, 337, 309, 8462, 445, 727, 787, 51552], "temperature": 0.0, "avg_logprob": -0.12063130544959952, "compression_ratio": 1.8022388059701493, "no_speech_prob": 0.00619078241288662}, {"id": 935, "seek": 486268, "start": 4886.4400000000005, "end": 4892.04, "text": " be conveyed through actions or or dialogue and then they extract free story the residual", "tokens": [51552, 312, 49340, 807, 5909, 420, 420, 10221, 293, 550, 436, 8947, 1737, 1657, 264, 27980, 51832], "temperature": 0.0, "avg_logprob": -0.12063130544959952, "compression_ratio": 1.8022388059701493, "no_speech_prob": 0.00619078241288662}, {"id": 936, "seek": 489204, "start": 4892.04, "end": 4896.6, "text": " stream activations at a specific layer they did this at a layer about two thirds of the way", "tokens": [50364, 4309, 2430, 763, 412, 257, 2685, 4583, 436, 630, 341, 412, 257, 4583, 466, 732, 34552, 295, 264, 636, 50592], "temperature": 0.0, "avg_logprob": -0.06164274863826418, "compression_ratio": 1.9133858267716535, "no_speech_prob": 0.0058183735236525536}, {"id": 937, "seek": 489204, "start": 4896.6, "end": 4902.6, "text": " through the model and that's actually an important detail you know if you're not familiar with how", "tokens": [50592, 807, 264, 2316, 293, 300, 311, 767, 364, 1021, 2607, 291, 458, 498, 291, 434, 406, 4963, 365, 577, 50892], "temperature": 0.0, "avg_logprob": -0.06164274863826418, "compression_ratio": 1.9133858267716535, "no_speech_prob": 0.0058183735236525536}, {"id": 938, "seek": 489204, "start": 4902.6, "end": 4908.36, "text": " how models represent the data that flows through them at each layer it is actually quite important", "tokens": [50892, 577, 5245, 2906, 264, 1412, 300, 12867, 807, 552, 412, 1184, 4583, 309, 307, 767, 1596, 1021, 51180], "temperature": 0.0, "avg_logprob": -0.06164274863826418, "compression_ratio": 1.9133858267716535, "no_speech_prob": 0.0058183735236525536}, {"id": 939, "seek": 489204, "start": 4908.36, "end": 4915.4, "text": " the earlier layers of a model tend to be focused on representing the data in a way that reveals", "tokens": [51180, 264, 3071, 7914, 295, 257, 2316, 3928, 281, 312, 5178, 322, 13460, 264, 1412, 294, 257, 636, 300, 20893, 51532], "temperature": 0.0, "avg_logprob": -0.06164274863826418, "compression_ratio": 1.9133858267716535, "no_speech_prob": 0.0058183735236525536}, {"id": 940, "seek": 489204, "start": 4915.4, "end": 4921.48, "text": " its content or creating a very rich informative representations but the last few layers of the model", "tokens": [51532, 1080, 2701, 420, 4084, 257, 588, 4593, 27759, 33358, 457, 264, 1036, 1326, 7914, 295, 264, 2316, 51836], "temperature": 0.0, "avg_logprob": -0.06164274863826418, "compression_ratio": 1.9133858267716535, "no_speech_prob": 0.0058183735236525536}, {"id": 941, "seek": 492148, "start": 4921.48, "end": 4926.2, "text": " are more focused on okay now I've got a good representation of the input but I need to choose what", "tokens": [50364, 366, 544, 5178, 322, 1392, 586, 286, 600, 658, 257, 665, 10290, 295, 264, 4846, 457, 286, 643, 281, 2826, 437, 50600], "temperature": 0.0, "avg_logprob": -0.08797030789511544, "compression_ratio": 1.9404761904761905, "no_speech_prob": 0.003323915181681514}, {"id": 942, "seek": 492148, "start": 4926.2, "end": 4930.839999999999, "text": " I'm in an output and so it's more about okay what am I going to do with this information and so for", "tokens": [50600, 286, 478, 294, 364, 5598, 293, 370, 309, 311, 544, 466, 1392, 437, 669, 286, 516, 281, 360, 365, 341, 1589, 293, 370, 337, 50832], "temperature": 0.0, "avg_logprob": -0.08797030789511544, "compression_ratio": 1.9404761904761905, "no_speech_prob": 0.003323915181681514}, {"id": 943, "seek": 492148, "start": 4930.839999999999, "end": 4935.0, "text": " that reason they pick a spot about two thirds of the way through the the model two thirds of the", "tokens": [50832, 300, 1778, 436, 1888, 257, 4008, 466, 732, 34552, 295, 264, 636, 807, 264, 264, 2316, 732, 34552, 295, 264, 51040], "temperature": 0.0, "avg_logprob": -0.08797030789511544, "compression_ratio": 1.9404761904761905, "no_speech_prob": 0.003323915181681514}, {"id": 944, "seek": 492148, "start": 4935.0, "end": 4939.32, "text": " way through the depth of the model because that's kind of where they they figure the models going to", "tokens": [51040, 636, 807, 264, 7161, 295, 264, 2316, 570, 300, 311, 733, 295, 689, 436, 436, 2573, 264, 5245, 516, 281, 51256], "temperature": 0.0, "avg_logprob": -0.08797030789511544, "compression_ratio": 1.9404761904761905, "no_speech_prob": 0.003323915181681514}, {"id": 945, "seek": 492148, "start": 4939.32, "end": 4946.36, "text": " switch from just encoding and understanding and representing its inputs to that more kind of", "tokens": [51256, 3679, 490, 445, 43430, 293, 3701, 293, 13460, 1080, 15743, 281, 300, 544, 733, 295, 51608], "temperature": 0.0, "avg_logprob": -0.08797030789511544, "compression_ratio": 1.9404761904761905, "no_speech_prob": 0.003323915181681514}, {"id": 946, "seek": 494636, "start": 4946.36, "end": 4951.48, "text": " decoding output generation like let's get to the point phase and they really want to hit that", "tokens": [50364, 979, 8616, 5598, 5125, 411, 718, 311, 483, 281, 264, 935, 5574, 293, 436, 534, 528, 281, 2045, 300, 50620], "temperature": 0.0, "avg_logprob": -0.06889581680297852, "compression_ratio": 1.9876543209876543, "no_speech_prob": 0.002631279407069087}, {"id": 947, "seek": 494636, "start": 4951.48, "end": 4956.599999999999, "text": " sweet spot because that's where the representation is nice and mature and it's optimized for like", "tokens": [50620, 3844, 4008, 570, 300, 311, 689, 264, 10290, 307, 1481, 293, 14442, 293, 309, 311, 26941, 337, 411, 50876], "temperature": 0.0, "avg_logprob": -0.06889581680297852, "compression_ratio": 1.9876543209876543, "no_speech_prob": 0.002631279407069087}, {"id": 948, "seek": 494636, "start": 4956.599999999999, "end": 4961.24, "text": " encoding and representing the input instead of guiding the output and so so it's a kind of more", "tokens": [50876, 43430, 293, 13460, 264, 4846, 2602, 295, 25061, 264, 5598, 293, 370, 370, 309, 311, 257, 733, 295, 544, 51108], "temperature": 0.0, "avg_logprob": -0.06889581680297852, "compression_ratio": 1.9876543209876543, "no_speech_prob": 0.002631279407069087}, {"id": 949, "seek": 494636, "start": 4961.24, "end": 4965.719999999999, "text": " representational useful and complete so that's anyway that's where they they pull it from they", "tokens": [51108, 2906, 1478, 4420, 293, 3566, 370, 300, 311, 4033, 300, 311, 689, 436, 436, 2235, 309, 490, 436, 51332], "temperature": 0.0, "avg_logprob": -0.06889581680297852, "compression_ratio": 1.9876543209876543, "no_speech_prob": 0.002631279407069087}, {"id": 950, "seek": 494636, "start": 4965.719999999999, "end": 4972.04, "text": " also from that point they'll kind of like average out the representations the the activations across", "tokens": [51332, 611, 490, 300, 935, 436, 603, 733, 295, 411, 4274, 484, 264, 33358, 264, 264, 2430, 763, 2108, 51648], "temperature": 0.0, "avg_logprob": -0.06889581680297852, "compression_ratio": 1.9876543209876543, "no_speech_prob": 0.002631279407069087}, {"id": 951, "seek": 497204, "start": 4972.04, "end": 4979.32, "text": " all token positions but only starting from the 50th token because what they found was it takes time", "tokens": [50364, 439, 14862, 8432, 457, 787, 2891, 490, 264, 2625, 392, 14862, 570, 437, 436, 1352, 390, 309, 2516, 565, 50728], "temperature": 0.0, "avg_logprob": -0.09151826338334516, "compression_ratio": 1.7833935018050542, "no_speech_prob": 0.003323655342683196}, {"id": 952, "seek": 497204, "start": 4979.32, "end": 4985.24, "text": " you got to get like 50 tokens in before the emotional content has the chance to become a parent", "tokens": [50728, 291, 658, 281, 483, 411, 2625, 22667, 294, 949, 264, 6863, 2701, 575, 264, 2931, 281, 1813, 257, 2596, 51024], "temperature": 0.0, "avg_logprob": -0.09151826338334516, "compression_ratio": 1.7833935018050542, "no_speech_prob": 0.003323655342683196}, {"id": 953, "seek": 497204, "start": 4985.24, "end": 4989.48, "text": " right and so they're giving the model kind of a little bit of grace before so so that it can", "tokens": [51024, 558, 293, 370, 436, 434, 2902, 264, 2316, 733, 295, 257, 707, 857, 295, 10042, 949, 370, 370, 300, 309, 393, 51236], "temperature": 0.0, "avg_logprob": -0.09151826338334516, "compression_ratio": 1.7833935018050542, "no_speech_prob": 0.003323655342683196}, {"id": 954, "seek": 497204, "start": 4989.48, "end": 4994.84, "text": " become clear what the emotional valence of the context is and then anyway so like I said they basically", "tokens": [51236, 1813, 1850, 437, 264, 6863, 1323, 655, 295, 264, 4319, 307, 293, 550, 4033, 370, 411, 286, 848, 436, 1936, 51504], "temperature": 0.0, "avg_logprob": -0.09151826338334516, "compression_ratio": 1.7833935018050542, "no_speech_prob": 0.003323655342683196}, {"id": 955, "seek": 497204, "start": 4994.84, "end": 4999.88, "text": " do a difference of means things so to to calculate their motion vector they take the mean activations", "tokens": [51504, 360, 257, 2649, 295, 1355, 721, 370, 281, 281, 8873, 641, 5394, 8062, 436, 747, 264, 914, 2430, 763, 51756], "temperature": 0.0, "avg_logprob": -0.09151826338334516, "compression_ratio": 1.7833935018050542, "no_speech_prob": 0.003323655342683196}, {"id": 956, "seek": 499988, "start": 4999.88, "end": 5006.2, "text": " for you know whatever the stories are for this given emotion and they subtract away the average", "tokens": [50364, 337, 291, 458, 2035, 264, 3676, 366, 337, 341, 2212, 8913, 293, 436, 16390, 1314, 264, 4274, 50680], "temperature": 0.0, "avg_logprob": -0.1000198346597177, "compression_ratio": 1.817490494296578, "no_speech_prob": 0.010166698135435581}, {"id": 957, "seek": 499988, "start": 5006.2, "end": 5011.400000000001, "text": " activations across all emotions right so it's in that sense like you're you're getting the delta", "tokens": [50680, 2430, 763, 2108, 439, 8462, 558, 370, 309, 311, 294, 300, 2020, 411, 291, 434, 291, 434, 1242, 264, 8289, 50940], "temperature": 0.0, "avg_logprob": -0.1000198346597177, "compression_ratio": 1.817490494296578, "no_speech_prob": 0.010166698135435581}, {"id": 958, "seek": 499988, "start": 5011.400000000001, "end": 5016.52, "text": " the difference between you know just looking at this emotion and the average emotional state and", "tokens": [50940, 264, 2649, 1296, 291, 458, 445, 1237, 412, 341, 8913, 293, 264, 4274, 6863, 1785, 293, 51196], "temperature": 0.0, "avg_logprob": -0.1000198346597177, "compression_ratio": 1.817490494296578, "no_speech_prob": 0.010166698135435581}, {"id": 959, "seek": 499988, "start": 5016.52, "end": 5020.76, "text": " they find that that works pretty well so there's you know a couple more bits of nuance but I'll", "tokens": [51196, 436, 915, 300, 300, 1985, 1238, 731, 370, 456, 311, 291, 458, 257, 1916, 544, 9239, 295, 42625, 457, 286, 603, 51408], "temperature": 0.0, "avg_logprob": -0.1000198346597177, "compression_ratio": 1.817490494296578, "no_speech_prob": 0.010166698135435581}, {"id": 960, "seek": 499988, "start": 5020.76, "end": 5027.64, "text": " just park it there this is a a first paper that really takes in some sense the notion of LLM", "tokens": [51408, 445, 3884, 309, 456, 341, 307, 257, 257, 700, 3035, 300, 534, 2516, 294, 512, 2020, 264, 10710, 295, 441, 43, 44, 51752], "temperature": 0.0, "avg_logprob": -0.1000198346597177, "compression_ratio": 1.817490494296578, "no_speech_prob": 0.010166698135435581}, {"id": 961, "seek": 502764, "start": 5027.64, "end": 5033.160000000001, "text": " emotions seriously without being you know hyperbolic about it they're very even handed this isn't", "tokens": [50364, 8462, 6638, 1553, 885, 291, 458, 9848, 65, 7940, 466, 309, 436, 434, 588, 754, 16013, 341, 1943, 380, 50640], "temperature": 0.0, "avg_logprob": -0.10651399940252304, "compression_ratio": 1.8498402555910542, "no_speech_prob": 0.009705591015517712}, {"id": 962, "seek": 502764, "start": 5033.160000000001, "end": 5038.360000000001, "text": " claiming that AI's are conscious it's also not claiming that they're not I personally like that", "tokens": [50640, 19232, 300, 7318, 311, 366, 6648, 309, 311, 611, 406, 19232, 300, 436, 434, 406, 286, 5665, 411, 300, 50900], "temperature": 0.0, "avg_logprob": -0.10651399940252304, "compression_ratio": 1.8498402555910542, "no_speech_prob": 0.009705591015517712}, {"id": 963, "seek": 502764, "start": 5038.360000000001, "end": 5042.6, "text": " I actually think we should be pretty freaking careful with this point about you know what what we", "tokens": [50900, 286, 767, 519, 321, 820, 312, 1238, 14612, 5026, 365, 341, 935, 466, 291, 458, 437, 437, 321, 51112], "temperature": 0.0, "avg_logprob": -0.10651399940252304, "compression_ratio": 1.8498402555910542, "no_speech_prob": 0.009705591015517712}, {"id": 964, "seek": 502764, "start": 5042.6, "end": 5046.84, "text": " are in or doing with these models and what consciousness we do or don't describe them but that's", "tokens": [51112, 366, 294, 420, 884, 365, 613, 5245, 293, 437, 10081, 321, 360, 420, 500, 380, 6786, 552, 457, 300, 311, 51324], "temperature": 0.0, "avg_logprob": -0.10651399940252304, "compression_ratio": 1.8498402555910542, "no_speech_prob": 0.009705591015517712}, {"id": 965, "seek": 502764, "start": 5046.84, "end": 5051.320000000001, "text": " for I think a separate conversation for right now though I think you know a really solid first", "tokens": [51324, 337, 286, 519, 257, 4994, 3761, 337, 558, 586, 1673, 286, 519, 291, 458, 257, 534, 5100, 700, 51548], "temperature": 0.0, "avg_logprob": -0.10651399940252304, "compression_ratio": 1.8498402555910542, "no_speech_prob": 0.009705591015517712}, {"id": 966, "seek": 502764, "start": 5051.320000000001, "end": 5056.4400000000005, "text": " pass at least from Anthropic on on that very important topic right and next story we're talking", "tokens": [51548, 1320, 412, 1935, 490, 12727, 1513, 299, 322, 322, 300, 588, 1021, 4829, 558, 293, 958, 1657, 321, 434, 1417, 51804], "temperature": 0.0, "avg_logprob": -0.10651399940252304, "compression_ratio": 1.8498402555910542, "no_speech_prob": 0.009705591015517712}, {"id": 967, "seek": 505644, "start": 5056.44, "end": 5063.24, "text": " about an article titled China bars manus co-founders from leaving country amid meta deal review", "tokens": [50364, 466, 364, 7222, 19841, 3533, 10228, 587, 301, 598, 12, 17493, 433, 490, 5012, 1941, 30153, 19616, 2028, 3131, 50704], "temperature": 0.0, "avg_logprob": -0.20585553239031537, "compression_ratio": 1.5850622406639003, "no_speech_prob": 0.010630777105689049}, {"id": 968, "seek": 505644, "start": 5063.24, "end": 5069.24, "text": " financial times reports and so this is the CEO of manus you know the agentic AI company that was", "tokens": [50704, 4669, 1413, 7122, 293, 370, 341, 307, 264, 9282, 295, 587, 301, 291, 458, 264, 9461, 299, 7318, 2237, 300, 390, 51004], "temperature": 0.0, "avg_logprob": -0.20585553239031537, "compression_ratio": 1.5850622406639003, "no_speech_prob": 0.010630777105689049}, {"id": 969, "seek": 505644, "start": 5069.24, "end": 5075.0, "text": " recently acquired by meta so Xiao Hong and his chief scientist GE Zhao they're being prevented", "tokens": [51004, 3938, 17554, 538, 19616, 370, 13134, 8868, 293, 702, 9588, 12662, 18003, 25132, 436, 434, 885, 27314, 51292], "temperature": 0.0, "avg_logprob": -0.20585553239031537, "compression_ratio": 1.5850622406639003, "no_speech_prob": 0.010630777105689049}, {"id": 970, "seek": 505644, "start": 5075.0, "end": 5080.04, "text": " from leaving the country while regulators from the CCP from the Chinese Communist Party review", "tokens": [51292, 490, 5012, 264, 1941, 1339, 37311, 490, 264, 27876, 490, 264, 4649, 23253, 8552, 3131, 51544], "temperature": 0.0, "avg_logprob": -0.20585553239031537, "compression_ratio": 1.5850622406639003, "no_speech_prob": 0.010630777105689049}, {"id": 971, "seek": 508004, "start": 5080.76, "end": 5086.36, "text": " whether meta is $2 billion acquisition violated investment rules let's call them right so", "tokens": [50400, 1968, 19616, 307, 1848, 17, 5218, 21668, 33239, 6078, 4474, 718, 311, 818, 552, 558, 370, 50680], "temperature": 0.0, "avg_logprob": -0.1030420588555737, "compression_ratio": 1.684782608695652, "no_speech_prob": 0.01053538080304861}, {"id": 972, "seek": 508004, "start": 5086.36, "end": 5090.92, "text": " here's what happened you are the co-founders of manus you're really excited by this $2 billion", "tokens": [50680, 510, 311, 437, 2011, 291, 366, 264, 598, 12, 17493, 433, 295, 587, 301, 291, 434, 534, 2919, 538, 341, 1848, 17, 5218, 50908], "temperature": 0.0, "avg_logprob": -0.1030420588555737, "compression_ratio": 1.684782608695652, "no_speech_prob": 0.01053538080304861}, {"id": 973, "seek": 508004, "start": 5090.92, "end": 5096.2, "text": " acquisition from meta this is the success case this is what you've been waiting for you get a", "tokens": [50908, 21668, 490, 19616, 341, 307, 264, 2245, 1389, 341, 307, 437, 291, 600, 668, 3806, 337, 291, 483, 257, 51172], "temperature": 0.0, "avg_logprob": -0.1030420588555737, "compression_ratio": 1.684782608695652, "no_speech_prob": 0.01053538080304861}, {"id": 974, "seek": 508004, "start": 5096.2, "end": 5100.44, "text": " summons from the Chinese Communist Party they tell you hey you got to meet with a national", "tokens": [51172, 8367, 892, 490, 264, 4649, 23253, 8552, 436, 980, 291, 4177, 291, 658, 281, 1677, 365, 257, 4048, 51384], "temperature": 0.0, "avg_logprob": -0.1030420588555737, "compression_ratio": 1.684782608695652, "no_speech_prob": 0.01053538080304861}, {"id": 975, "seek": 508004, "start": 5100.44, "end": 5106.84, "text": " development and reform commission the NDRC in Beijing now you are currently not in China you're", "tokens": [51384, 3250, 293, 8290, 9221, 264, 426, 9301, 34, 294, 20240, 586, 291, 366, 4362, 406, 294, 3533, 291, 434, 51704], "temperature": 0.0, "avg_logprob": -0.1030420588555737, "compression_ratio": 1.684782608695652, "no_speech_prob": 0.01053538080304861}, {"id": 976, "seek": 510684, "start": 5106.84, "end": 5112.2, "text": " actually in Singapore and you're in Singapore for a very important reason you're hoping that", "tokens": [50364, 767, 294, 14491, 293, 291, 434, 294, 14491, 337, 257, 588, 1021, 1778, 291, 434, 7159, 300, 50632], "temperature": 0.0, "avg_logprob": -0.08005368709564209, "compression_ratio": 1.8511450381679388, "no_speech_prob": 0.004579914268106222}, {"id": 977, "seek": 510684, "start": 5112.2, "end": 5116.76, "text": " they'll leave you alone that the Chinese Communist Party will allow you to leave for America if you", "tokens": [50632, 436, 603, 1856, 291, 3312, 300, 264, 4649, 23253, 8552, 486, 2089, 291, 281, 1856, 337, 3374, 498, 291, 50860], "temperature": 0.0, "avg_logprob": -0.08005368709564209, "compression_ratio": 1.8511450381679388, "no_speech_prob": 0.004579914268106222}, {"id": 978, "seek": 510684, "start": 5116.76, "end": 5122.2, "text": " found and base your company in Singapore because then it maybe won't be viewed as America kind of", "tokens": [50860, 1352, 293, 3096, 428, 2237, 294, 14491, 570, 550, 309, 1310, 1582, 380, 312, 19174, 382, 3374, 733, 295, 51132], "temperature": 0.0, "avg_logprob": -0.08005368709564209, "compression_ratio": 1.8511450381679388, "no_speech_prob": 0.004579914268106222}, {"id": 979, "seek": 510684, "start": 5122.2, "end": 5128.6, "text": " stealing Chinese talent right but now you're being summoned to go to Beijing and you don't turn down", "tokens": [51132, 19757, 4649, 8301, 558, 457, 586, 291, 434, 885, 40791, 281, 352, 281, 20240, 293, 291, 500, 380, 1261, 760, 51452], "temperature": 0.0, "avg_logprob": -0.08005368709564209, "compression_ratio": 1.8511450381679388, "no_speech_prob": 0.004579914268106222}, {"id": 980, "seek": 510684, "start": 5128.6, "end": 5133.24, "text": " a summons from the freaking Chinese Communist Party certainly not if you have family in China", "tokens": [51452, 257, 8367, 892, 490, 264, 14612, 4649, 23253, 8552, 3297, 406, 498, 291, 362, 1605, 294, 3533, 51684], "temperature": 0.0, "avg_logprob": -0.08005368709564209, "compression_ratio": 1.8511450381679388, "no_speech_prob": 0.004579914268106222}, {"id": 981, "seek": 513324, "start": 5133.24, "end": 5138.28, "text": " certainly not if you have financial entanglements in China you're gonna go so they go to China", "tokens": [50364, 3297, 406, 498, 291, 362, 4669, 948, 7846, 1117, 294, 3533, 291, 434, 799, 352, 370, 436, 352, 281, 3533, 50616], "temperature": 0.0, "avg_logprob": -0.09112021244993997, "compression_ratio": 1.7554744525547445, "no_speech_prob": 0.0029504364356398582}, {"id": 982, "seek": 513324, "start": 5138.28, "end": 5143.719999999999, "text": " and basically have this conversation the founders not having a choice you know probably knew they", "tokens": [50616, 293, 1936, 362, 341, 3761, 264, 25608, 406, 1419, 257, 3922, 291, 458, 1391, 2586, 436, 50888], "temperature": 0.0, "avg_logprob": -0.09112021244993997, "compression_ratio": 1.7554744525547445, "no_speech_prob": 0.0029504364356398582}, {"id": 983, "seek": 513324, "start": 5143.719999999999, "end": 5147.88, "text": " were walking into a bit of a trap and it would have been an admission of guilt kind of for them to", "tokens": [50888, 645, 4494, 666, 257, 857, 295, 257, 11487, 293, 309, 576, 362, 668, 364, 24668, 295, 20421, 733, 295, 337, 552, 281, 51096], "temperature": 0.0, "avg_logprob": -0.09112021244993997, "compression_ratio": 1.7554744525547445, "no_speech_prob": 0.0029504364356398582}, {"id": 984, "seek": 513324, "start": 5147.88, "end": 5152.36, "text": " try to negotiate a resolution in this way and then boom basically they get told hey guys like", "tokens": [51096, 853, 281, 21713, 257, 8669, 294, 341, 636, 293, 550, 9351, 1936, 436, 483, 1907, 4177, 1074, 411, 51320], "temperature": 0.0, "avg_logprob": -0.09112021244993997, "compression_ratio": 1.7554744525547445, "no_speech_prob": 0.0029504364356398582}, {"id": 985, "seek": 513324, "start": 5153.08, "end": 5158.679999999999, "text": " sucks to be you but you can no longer leave the country this is a huge problem for a model that", "tokens": [51356, 15846, 281, 312, 291, 457, 291, 393, 572, 2854, 1856, 264, 1941, 341, 307, 257, 2603, 1154, 337, 257, 2316, 300, 51636], "temperature": 0.0, "avg_logprob": -0.09112021244993997, "compression_ratio": 1.7554744525547445, "no_speech_prob": 0.0029504364356398582}, {"id": 986, "seek": 515868, "start": 5158.68, "end": 5164.12, "text": " has been used by Chinese founders for a long time now where they'll try to build products that", "tokens": [50364, 575, 668, 1143, 538, 4649, 25608, 337, 257, 938, 565, 586, 689, 436, 603, 853, 281, 1322, 3383, 300, 50636], "temperature": 0.0, "avg_logprob": -0.07636909484863282, "compression_ratio": 1.7896678966789668, "no_speech_prob": 0.0013875802978873253}, {"id": 987, "seek": 515868, "start": 5164.6, "end": 5170.12, "text": " could rival you know American American companies and therefore be acquired by American companies", "tokens": [50660, 727, 16286, 291, 458, 2665, 2665, 3431, 293, 4412, 312, 17554, 538, 2665, 3431, 50936], "temperature": 0.0, "avg_logprob": -0.07636909484863282, "compression_ratio": 1.7896678966789668, "no_speech_prob": 0.0013875802978873253}, {"id": 988, "seek": 515868, "start": 5170.68, "end": 5175.72, "text": " and there's sort of like offshore structure it's it's called Singapore washing you have companies", "tokens": [50964, 293, 456, 311, 1333, 295, 411, 34567, 3877, 309, 311, 309, 311, 1219, 14491, 13836, 291, 362, 3431, 51216], "temperature": 0.0, "avg_logprob": -0.07636909484863282, "compression_ratio": 1.7896678966789668, "no_speech_prob": 0.0013875802978873253}, {"id": 989, "seek": 515868, "start": 5175.72, "end": 5181.16, "text": " they relocate to Singapore founders have thought that maybe this is a way to get away from scrutiny", "tokens": [51216, 436, 26981, 473, 281, 14491, 25608, 362, 1194, 300, 1310, 341, 307, 257, 636, 281, 483, 1314, 490, 38615, 51488], "temperature": 0.0, "avg_logprob": -0.07636909484863282, "compression_ratio": 1.7896678966789668, "no_speech_prob": 0.0013875802978873253}, {"id": 990, "seek": 515868, "start": 5181.16, "end": 5185.64, "text": " from both Beijing and Washington actually because then you know you're less likely to be hit by", "tokens": [51488, 490, 1293, 20240, 293, 6149, 767, 570, 550, 291, 458, 291, 434, 1570, 3700, 281, 312, 2045, 538, 51712], "temperature": 0.0, "avg_logprob": -0.07636909484863282, "compression_ratio": 1.7896678966789668, "no_speech_prob": 0.0013875802978873253}, {"id": 991, "seek": 518564, "start": 5185.64, "end": 5192.04, "text": " export controls and and stuff so Manus was all in on this strategy they relocated their headquarters", "tokens": [50364, 10725, 9003, 293, 293, 1507, 370, 2458, 301, 390, 439, 294, 322, 341, 5206, 436, 26981, 770, 641, 21052, 50684], "temperature": 0.0, "avg_logprob": -0.12885385751724243, "compression_ratio": 1.8064516129032258, "no_speech_prob": 0.0010464767692610621}, {"id": 992, "seek": 518564, "start": 5192.04, "end": 5196.6, "text": " and their core team from Beijing to Singapore they restructured their whole ownership their cap table", "tokens": [50684, 293, 641, 4965, 1469, 490, 20240, 281, 14491, 436, 1472, 46847, 641, 1379, 15279, 641, 1410, 3199, 50912], "temperature": 0.0, "avg_logprob": -0.12885385751724243, "compression_ratio": 1.8064516129032258, "no_speech_prob": 0.0010464767692610621}, {"id": 993, "seek": 518564, "start": 5196.6, "end": 5201.240000000001, "text": " and after the Meta deal was announced Meta said that they would cut all ties with Manus's Chinese", "tokens": [50912, 293, 934, 264, 6377, 64, 2028, 390, 7548, 6377, 64, 848, 300, 436, 576, 1723, 439, 14039, 365, 2458, 301, 311, 4649, 51144], "temperature": 0.0, "avg_logprob": -0.12885385751724243, "compression_ratio": 1.8064516129032258, "no_speech_prob": 0.0010464767692610621}, {"id": 994, "seek": 518564, "start": 5201.240000000001, "end": 5205.64, "text": " investors and shut down the operations in China so like really trying to make it seem like okay we're", "tokens": [51144, 11519, 293, 5309, 760, 264, 7705, 294, 3533, 370, 411, 534, 1382, 281, 652, 309, 1643, 411, 1392, 321, 434, 51364], "temperature": 0.0, "avg_logprob": -0.12885385751724243, "compression_ratio": 1.8064516129032258, "no_speech_prob": 0.0010464767692610621}, {"id": 995, "seek": 518564, "start": 5205.64, "end": 5212.200000000001, "text": " buying a Singaporean company so by by every measure in every way Manus was trying to be a Singaporean", "tokens": [51364, 6382, 257, 14491, 282, 2237, 370, 538, 538, 633, 3481, 294, 633, 636, 2458, 301, 390, 1382, 281, 312, 257, 14491, 282, 51692], "temperature": 0.0, "avg_logprob": -0.12885385751724243, "compression_ratio": 1.8064516129032258, "no_speech_prob": 0.0010464767692610621}, {"id": 996, "seek": 521220, "start": 5212.28, "end": 5216.76, "text": " company and basically the Chinese Communist Party said hey you know we're calling this bluff they", "tokens": [50368, 2237, 293, 1936, 264, 4649, 23253, 8552, 848, 4177, 291, 458, 321, 434, 5141, 341, 44191, 436, 50592], "temperature": 0.0, "avg_logprob": -0.08355417438581877, "compression_ratio": 1.7392857142857143, "no_speech_prob": 0.003116830252110958}, {"id": 997, "seek": 521220, "start": 5216.76, "end": 5223.08, "text": " started looking at whether the sale of Manus violated laws around basically technology exports", "tokens": [50592, 1409, 1237, 412, 1968, 264, 8680, 295, 2458, 301, 33239, 6064, 926, 1936, 2899, 31428, 50908], "temperature": 0.0, "avg_logprob": -0.08355417438581877, "compression_ratio": 1.7392857142857143, "no_speech_prob": 0.003116830252110958}, {"id": 998, "seek": 521220, "start": 5223.08, "end": 5227.8, "text": " and outbound investments they basically don't care about the Singaporean facade all that does is", "tokens": [50908, 293, 484, 18767, 13784, 436, 1936, 500, 380, 1127, 466, 264, 14491, 282, 46261, 439, 300, 775, 307, 51144], "temperature": 0.0, "avg_logprob": -0.08355417438581877, "compression_ratio": 1.7392857142857143, "no_speech_prob": 0.003116830252110958}, {"id": 999, "seek": 521220, "start": 5227.8, "end": 5233.72, "text": " now every founder in China that ever aspires to be acquired by a non-Chinese company is going to", "tokens": [51144, 586, 633, 14917, 294, 3533, 300, 1562, 16817, 3145, 281, 312, 17554, 538, 257, 2107, 12, 6546, 48629, 2237, 307, 516, 281, 51440], "temperature": 0.0, "avg_logprob": -0.08355417438581877, "compression_ratio": 1.7392857142857143, "no_speech_prob": 0.003116830252110958}, {"id": 1000, "seek": 521220, "start": 5233.72, "end": 5240.44, "text": " have to actually materially move out of the country start building away from Beijing away from China", "tokens": [51440, 362, 281, 767, 2389, 2270, 1286, 484, 295, 264, 1941, 722, 2390, 1314, 490, 20240, 1314, 490, 3533, 51776], "temperature": 0.0, "avg_logprob": -0.08355417438581877, "compression_ratio": 1.7392857142857143, "no_speech_prob": 0.003116830252110958}, {"id": 1001, "seek": 524044, "start": 5240.44, "end": 5246.839999999999, "text": " away from Chen Zhen you know that's really what this incentivizes and so yeah I mean this is China", "tokens": [50364, 1314, 490, 13682, 27042, 291, 458, 300, 311, 534, 437, 341, 35328, 5660, 293, 370, 1338, 286, 914, 341, 307, 3533, 50684], "temperature": 0.0, "avg_logprob": -0.10119411775044032, "compression_ratio": 1.7992700729927007, "no_speech_prob": 0.0009991101687774062}, {"id": 1002, "seek": 524044, "start": 5246.839999999999, "end": 5251.879999999999, "text": " saying hey we view our AI talent our tech stack is national security assets and national assets", "tokens": [50684, 1566, 4177, 321, 1910, 527, 7318, 8301, 527, 7553, 8630, 307, 4048, 3825, 9769, 293, 4048, 9769, 50936], "temperature": 0.0, "avg_logprob": -0.10119411775044032, "compression_ratio": 1.7992700729927007, "no_speech_prob": 0.0009991101687774062}, {"id": 1003, "seek": 524044, "start": 5251.879999999999, "end": 5257.5599999999995, "text": " they are not just things that you can sell to America yeah I mean you know in a sense that you can", "tokens": [50936, 436, 366, 406, 445, 721, 300, 291, 393, 3607, 281, 3374, 1338, 286, 914, 291, 458, 294, 257, 2020, 300, 291, 393, 51220], "temperature": 0.0, "avg_logprob": -0.10119411775044032, "compression_ratio": 1.7992700729927007, "no_speech_prob": 0.0009991101687774062}, {"id": 1004, "seek": 524044, "start": 5257.5599999999995, "end": 5261.799999999999, "text": " see why they think this they're throwing they're so heavily subsidizing their tech sector that the", "tokens": [51220, 536, 983, 436, 519, 341, 436, 434, 10238, 436, 434, 370, 10950, 20051, 3319, 641, 7553, 6977, 300, 264, 51432], "temperature": 0.0, "avg_logprob": -0.10119411775044032, "compression_ratio": 1.7992700729927007, "no_speech_prob": 0.0009991101687774062}, {"id": 1005, "seek": 524044, "start": 5261.799999999999, "end": 5267.5599999999995, "text": " success of a company like Manus yeah is a function of CCP investment that does make sense but at the", "tokens": [51432, 2245, 295, 257, 2237, 411, 2458, 301, 1338, 307, 257, 2445, 295, 27876, 6078, 300, 775, 652, 2020, 457, 412, 264, 51720], "temperature": 0.0, "avg_logprob": -0.10119411775044032, "compression_ratio": 1.7992700729927007, "no_speech_prob": 0.0009991101687774062}, {"id": 1006, "seek": 526756, "start": 5267.56, "end": 5273.56, "text": " same time trying to make that argument to the founders themselves I mean this is like an insane", "tokens": [50364, 912, 565, 1382, 281, 652, 300, 6770, 281, 264, 25608, 2969, 286, 914, 341, 307, 411, 364, 10838, 50664], "temperature": 0.0, "avg_logprob": -0.10062197457372615, "compression_ratio": 1.6944444444444444, "no_speech_prob": 0.0004568182921502739}, {"id": 1007, "seek": 526756, "start": 5273.56, "end": 5278.04, "text": " it makes it insane to want to build your company in China you know Manus was a big deal too they", "tokens": [50664, 309, 1669, 309, 10838, 281, 528, 281, 1322, 428, 2237, 294, 3533, 291, 458, 2458, 301, 390, 257, 955, 2028, 886, 436, 50888], "temperature": 0.0, "avg_logprob": -0.10062197457372615, "compression_ratio": 1.6944444444444444, "no_speech_prob": 0.0004568182921502739}, {"id": 1008, "seek": 526756, "start": 5278.04, "end": 5282.68, "text": " really were viewed as the next deep sea so seeing it get absorbed into meta this is that's kind of", "tokens": [50888, 534, 645, 19174, 382, 264, 958, 2452, 4158, 370, 2577, 309, 483, 20799, 666, 19616, 341, 307, 300, 311, 733, 295, 51120], "temperature": 0.0, "avg_logprob": -0.10062197457372615, "compression_ratio": 1.6944444444444444, "no_speech_prob": 0.0004568182921502739}, {"id": 1009, "seek": 526756, "start": 5282.68, "end": 5288.200000000001, "text": " why Beijing was so triggered by this and important to keep in mind so yeah expect Chinese founders to", "tokens": [51120, 983, 20240, 390, 370, 21710, 538, 341, 293, 1021, 281, 1066, 294, 1575, 370, 1338, 2066, 4649, 25608, 281, 51396], "temperature": 0.0, "avg_logprob": -0.10062197457372615, "compression_ratio": 1.6944444444444444, "no_speech_prob": 0.0004568182921502739}, {"id": 1010, "seek": 526756, "start": 5288.200000000001, "end": 5293.400000000001, "text": " start looking outside China from day one now before any kind of R&D could be done that may tie", "tokens": [51396, 722, 1237, 2380, 3533, 490, 786, 472, 586, 949, 604, 733, 295, 497, 5, 35, 727, 312, 1096, 300, 815, 7582, 51656], "temperature": 0.0, "avg_logprob": -0.10062197457372615, "compression_ratio": 1.6944444444444444, "no_speech_prob": 0.0004568182921502739}, {"id": 1011, "seek": 529340, "start": 5293.4, "end": 5298.759999999999, "text": " them to China instead of trying to kind of pivot mid growth to saying that they're a Singaporean", "tokens": [50364, 552, 281, 3533, 2602, 295, 1382, 281, 733, 295, 14538, 2062, 4599, 281, 1566, 300, 436, 434, 257, 14491, 282, 50632], "temperature": 0.0, "avg_logprob": -0.1158253929831765, "compression_ratio": 1.7720588235294117, "no_speech_prob": 0.0055393953807652}, {"id": 1012, "seek": 529340, "start": 5298.759999999999, "end": 5304.44, "text": " company so there you go and by the way like you know now you got meta right kind of hung out to dry", "tokens": [50632, 2237, 370, 456, 291, 352, 293, 538, 264, 636, 411, 291, 458, 586, 291, 658, 19616, 558, 733, 295, 5753, 484, 281, 4016, 50916], "temperature": 0.0, "avg_logprob": -0.1158253929831765, "compression_ratio": 1.7720588235294117, "no_speech_prob": 0.0055393953807652}, {"id": 1013, "seek": 529340, "start": 5304.44, "end": 5308.759999999999, "text": " what do they do they're basically just waiting to figure out can we acquire this company and it", "tokens": [50916, 437, 360, 436, 360, 436, 434, 1936, 445, 3806, 281, 2573, 484, 393, 321, 20001, 341, 2237, 293, 309, 51132], "temperature": 0.0, "avg_logprob": -0.1158253929831765, "compression_ratio": 1.7720588235294117, "no_speech_prob": 0.0055393953807652}, {"id": 1014, "seek": 529340, "start": 5308.759999999999, "end": 5312.759999999999, "text": " is sort of too late you know they've already tried to integrate there's like apparently 100", "tokens": [51132, 307, 1333, 295, 886, 3469, 291, 458, 436, 600, 1217, 3031, 281, 13365, 456, 311, 411, 7970, 2319, 51332], "temperature": 0.0, "avg_logprob": -0.1158253929831765, "compression_ratio": 1.7720588235294117, "no_speech_prob": 0.0055393953807652}, {"id": 1015, "seek": 529340, "start": 5312.759999999999, "end": 5317.879999999999, "text": " Manus employees that have already moved to meta Singapore office in early March so so like things", "tokens": [51332, 2458, 301, 6619, 300, 362, 1217, 4259, 281, 19616, 14491, 3398, 294, 2440, 6129, 370, 370, 411, 721, 51588], "temperature": 0.0, "avg_logprob": -0.1158253929831765, "compression_ratio": 1.7720588235294117, "no_speech_prob": 0.0055393953807652}, {"id": 1016, "seek": 531788, "start": 5317.88, "end": 5323.32, "text": " are in train this is an absolute mess of a situation next we have us lawmakers ask whether", "tokens": [50364, 366, 294, 3847, 341, 307, 364, 8236, 2082, 295, 257, 2590, 958, 321, 362, 505, 40988, 1029, 1968, 50636], "temperature": 0.0, "avg_logprob": -0.10603787785484678, "compression_ratio": 1.565040650406504, "no_speech_prob": 0.002103365259245038}, {"id": 1017, "seek": 531788, "start": 5323.32, "end": 5330.12, "text": " Nvidia CEO smuggling remarks misled regulators so for a long time there's been this whole issue", "tokens": [50636, 46284, 9282, 899, 29921, 19151, 3346, 1493, 37311, 370, 337, 257, 938, 565, 456, 311, 668, 341, 1379, 2734, 50976], "temperature": 0.0, "avg_logprob": -0.10603787785484678, "compression_ratio": 1.565040650406504, "no_speech_prob": 0.002103365259245038}, {"id": 1018, "seek": 531788, "start": 5330.12, "end": 5336.92, "text": " obviously of you know Nvidia making the world's best AI compute systems GPUs and so on potentially", "tokens": [50976, 2745, 295, 291, 458, 46284, 1455, 264, 1002, 311, 1151, 7318, 14722, 3652, 18407, 82, 293, 370, 322, 7263, 51316], "temperature": 0.0, "avg_logprob": -0.10603787785484678, "compression_ratio": 1.565040650406504, "no_speech_prob": 0.002103365259245038}, {"id": 1019, "seek": 531788, "start": 5336.92, "end": 5341.72, "text": " shipping them into China which allows China to catch up and you know compete with the United States", "tokens": [51316, 14122, 552, 666, 3533, 597, 4045, 3533, 281, 3745, 493, 293, 291, 458, 11831, 365, 264, 2824, 3040, 51556], "temperature": 0.0, "avg_logprob": -0.10603787785484678, "compression_ratio": 1.565040650406504, "no_speech_prob": 0.002103365259245038}, {"id": 1020, "seek": 534172, "start": 5341.72, "end": 5348.280000000001, "text": " on that critical technology every time Jensen Huang so obviously Nvidia CEO every time Jensen gets", "tokens": [50364, 322, 300, 4924, 2899, 633, 565, 508, 32934, 28073, 370, 2745, 46284, 9282, 633, 565, 508, 32934, 2170, 50692], "temperature": 0.0, "avg_logprob": -0.11598242591409122, "compression_ratio": 1.6566523605150214, "no_speech_prob": 0.004971724934875965}, {"id": 1021, "seek": 534172, "start": 5348.280000000001, "end": 5354.92, "text": " hauled in for the Congress to testify he has said stuff like there's no evidence of any AI chip", "tokens": [50692, 21167, 292, 294, 337, 264, 6426, 281, 31381, 415, 575, 848, 1507, 411, 456, 311, 572, 4467, 295, 604, 7318, 11409, 51024], "temperature": 0.0, "avg_logprob": -0.11598242591409122, "compression_ratio": 1.6566523605150214, "no_speech_prob": 0.004971724934875965}, {"id": 1022, "seek": 534172, "start": 5354.92, "end": 5360.52, "text": " diversion right that's been kind of a party line and by chip diversion he means companies that", "tokens": [51024, 49422, 558, 300, 311, 668, 733, 295, 257, 3595, 1622, 293, 538, 11409, 49422, 415, 1355, 3431, 300, 51304], "temperature": 0.0, "avg_logprob": -0.11598242591409122, "compression_ratio": 1.6566523605150214, "no_speech_prob": 0.004971724934875965}, {"id": 1023, "seek": 534172, "start": 5360.52, "end": 5365.88, "text": " appear not to be Chinese companies that appear to legitimately be buying GPUs but that then turn", "tokens": [51304, 4204, 406, 281, 312, 4649, 3431, 300, 4204, 281, 44431, 312, 6382, 18407, 82, 457, 300, 550, 1261, 51572], "temperature": 0.0, "avg_logprob": -0.11598242591409122, "compression_ratio": 1.6566523605150214, "no_speech_prob": 0.004971724934875965}, {"id": 1024, "seek": 536588, "start": 5365.88, "end": 5371.8, "text": " around and sell either access or the physical devices to Chinese companies in sort of a spiritual", "tokens": [50364, 926, 293, 3607, 2139, 2105, 420, 264, 4001, 5759, 281, 4649, 3431, 294, 1333, 295, 257, 6960, 50660], "temperature": 0.0, "avg_logprob": -0.1456506387242731, "compression_ratio": 1.6958041958041958, "no_speech_prob": 0.0038760241586714983}, {"id": 1025, "seek": 536588, "start": 5371.8, "end": 5377.400000000001, "text": " violation of the export controls and Jensen's kind of saying like look nobody's nobody's smuggling", "tokens": [50660, 22840, 295, 264, 10725, 9003, 293, 508, 32934, 311, 733, 295, 1566, 411, 574, 5079, 311, 5079, 311, 899, 29921, 50940], "temperature": 0.0, "avg_logprob": -0.1456506387242731, "compression_ratio": 1.6958041958041958, "no_speech_prob": 0.0038760241586714983}, {"id": 1026, "seek": 536588, "start": 5377.400000000001, "end": 5381.32, "text": " there or at least we don't have any evidence of it blah blah blah and the the challenges that", "tokens": [50940, 456, 420, 412, 1935, 321, 500, 380, 362, 604, 4467, 295, 309, 12288, 12288, 12288, 293, 264, 264, 4759, 300, 51136], "temperature": 0.0, "avg_logprob": -0.1456506387242731, "compression_ratio": 1.6958041958041958, "no_speech_prob": 0.0038760241586714983}, {"id": 1027, "seek": 536588, "start": 5381.32, "end": 5385.64, "text": " well there's a lot of evidence actually that this is happening and and the arguments being made", "tokens": [51136, 731, 456, 311, 257, 688, 295, 4467, 767, 300, 341, 307, 2737, 293, 293, 264, 12869, 885, 1027, 51352], "temperature": 0.0, "avg_logprob": -0.1456506387242731, "compression_ratio": 1.6958041958041958, "no_speech_prob": 0.0038760241586714983}, {"id": 1028, "seek": 536588, "start": 5385.64, "end": 5390.52, "text": " that at least by Elizabeth Warren and Jim Banks who they wrote to Howard let Nick was the commerce", "tokens": [51352, 300, 412, 1935, 538, 12978, 20538, 293, 6637, 33081, 567, 436, 4114, 281, 17626, 718, 9449, 390, 264, 26320, 51596], "temperature": 0.0, "avg_logprob": -0.1456506387242731, "compression_ratio": 1.6958041958041958, "no_speech_prob": 0.0038760241586714983}, {"id": 1029, "seek": 539052, "start": 5390.52, "end": 5397.160000000001, "text": " secretary and therefore so the Department of Commerce and the BIS manages export control regulations", "tokens": [50364, 15691, 293, 4412, 370, 264, 5982, 295, 34493, 293, 264, 363, 2343, 22489, 10725, 1969, 12563, 50696], "temperature": 0.0, "avg_logprob": -0.11543780106764573, "compression_ratio": 1.6551724137931034, "no_speech_prob": 0.005726182833313942}, {"id": 1030, "seek": 539052, "start": 5397.160000000001, "end": 5401.080000000001, "text": " and so so that's why they're writing to them and they're saying hey can you look into whether", "tokens": [50696, 293, 370, 370, 300, 311, 983, 436, 434, 3579, 281, 552, 293, 436, 434, 1566, 4177, 393, 291, 574, 666, 1968, 50892], "temperature": 0.0, "avg_logprob": -0.11543780106764573, "compression_ratio": 1.6551724137931034, "no_speech_prob": 0.005726182833313942}, {"id": 1031, "seek": 539052, "start": 5401.080000000001, "end": 5405.240000000001, "text": " Jensen's public statements may have misled US officials and shaped their decision to grant", "tokens": [50892, 508, 32934, 311, 1908, 12363, 815, 362, 3346, 1493, 2546, 9798, 293, 13475, 641, 3537, 281, 6386, 51100], "temperature": 0.0, "avg_logprob": -0.11543780106764573, "compression_ratio": 1.6551724137931034, "no_speech_prob": 0.005726182833313942}, {"id": 1032, "seek": 539052, "start": 5405.240000000001, "end": 5410.76, "text": " Nvidia export licenses to ship chips to China there was a specific trigger for this which was a", "tokens": [51100, 46284, 10725, 32821, 281, 5374, 11583, 281, 3533, 456, 390, 257, 2685, 7875, 337, 341, 597, 390, 257, 51376], "temperature": 0.0, "avg_logprob": -0.11543780106764573, "compression_ratio": 1.6551724137931034, "no_speech_prob": 0.005726182833313942}, {"id": 1033, "seek": 539052, "start": 5410.76, "end": 5416.200000000001, "text": " DOJ indictment there were three people linked to super micro computer including a co-founder who's", "tokens": [51376, 10699, 41, 49981, 518, 456, 645, 1045, 561, 9408, 281, 1687, 4532, 3820, 3009, 257, 598, 12, 33348, 567, 311, 51648], "temperature": 0.0, "avg_logprob": -0.11543780106764573, "compression_ratio": 1.6551724137931034, "no_speech_prob": 0.005726182833313942}, {"id": 1034, "seek": 541620, "start": 5416.2, "end": 5421.24, "text": " charged with smuggling billions of dollars of AI servers with restrictions chips to China and", "tokens": [50364, 11109, 365, 899, 29921, 17375, 295, 3808, 295, 7318, 15909, 365, 14191, 11583, 281, 3533, 293, 50616], "temperature": 0.0, "avg_logprob": -0.10247962242734115, "compression_ratio": 1.7692307692307692, "no_speech_prob": 0.0013883487554267049}, {"id": 1035, "seek": 541620, "start": 5421.24, "end": 5425.5599999999995, "text": " that was what triggered this whole this whole thing I mean there's been a ton of evidence of this", "tokens": [50616, 300, 390, 437, 21710, 341, 1379, 341, 1379, 551, 286, 914, 456, 311, 668, 257, 2952, 295, 4467, 295, 341, 50832], "temperature": 0.0, "avg_logprob": -0.10247962242734115, "compression_ratio": 1.7692307692307692, "no_speech_prob": 0.0013883487554267049}, {"id": 1036, "seek": 541620, "start": 5425.5599999999995, "end": 5431.0, "text": " I mean it reams and reams of articles about this stuff and so anyway a whole bunch of deeper", "tokens": [50832, 286, 914, 309, 319, 4070, 293, 319, 4070, 295, 11290, 466, 341, 1507, 293, 370, 4033, 257, 1379, 3840, 295, 7731, 51104], "temperature": 0.0, "avg_logprob": -0.10247962242734115, "compression_ratio": 1.7692307692307692, "no_speech_prob": 0.0013883487554267049}, {"id": 1037, "seek": 541620, "start": 5431.0, "end": 5437.0, "text": " dives they're being asked for that you know probably should have been done earlier to be honest I mean", "tokens": [51104, 274, 1539, 436, 434, 885, 2351, 337, 300, 291, 458, 1391, 820, 362, 668, 1096, 3071, 281, 312, 3245, 286, 914, 51404], "temperature": 0.0, "avg_logprob": -0.10247962242734115, "compression_ratio": 1.7692307692307692, "no_speech_prob": 0.0013883487554267049}, {"id": 1038, "seek": 541620, "start": 5437.0, "end": 5443.08, "text": " like you know you can go back like two three years and see tons of you know the prices of H100s", "tokens": [51404, 411, 291, 458, 291, 393, 352, 646, 411, 732, 1045, 924, 293, 536, 9131, 295, 291, 458, 264, 7901, 295, 389, 6879, 82, 51708], "temperature": 0.0, "avg_logprob": -0.10247962242734115, "compression_ratio": 1.7692307692307692, "no_speech_prob": 0.0013883487554267049}, {"id": 1039, "seek": 544308, "start": 5443.08, "end": 5447.72, "text": " which should never really have been shipped to China in the first place on the Chinese market right so", "tokens": [50364, 597, 820, 1128, 534, 362, 668, 25312, 281, 3533, 294, 264, 700, 1081, 322, 264, 4649, 2142, 558, 370, 50596], "temperature": 0.0, "avg_logprob": -0.10076491038004558, "compression_ratio": 1.7295373665480427, "no_speech_prob": 0.002509028883650899}, {"id": 1040, "seek": 544308, "start": 5447.72, "end": 5452.28, "text": " there's pretty hard evidence for this and has been for a long time and next we're talking about a", "tokens": [50596, 456, 311, 1238, 1152, 4467, 337, 341, 293, 575, 668, 337, 257, 938, 565, 293, 958, 321, 434, 1417, 466, 257, 50824], "temperature": 0.0, "avg_logprob": -0.10076491038004558, "compression_ratio": 1.7295373665480427, "no_speech_prob": 0.002509028883650899}, {"id": 1041, "seek": 544308, "start": 5452.28, "end": 5459.88, "text": " paper how far does alignment mid-training generalize okay so here's a concept imagine we have an", "tokens": [50824, 3035, 577, 1400, 775, 18515, 2062, 12, 17227, 1760, 2674, 1125, 1392, 370, 510, 311, 257, 3410, 3811, 321, 362, 364, 51204], "temperature": 0.0, "avg_logprob": -0.10076491038004558, "compression_ratio": 1.7295373665480427, "no_speech_prob": 0.002509028883650899}, {"id": 1042, "seek": 544308, "start": 5459.88, "end": 5465.24, "text": " LLM and we've done pre-training so pre-training on a whole bunch of text from across the internet", "tokens": [51204, 441, 43, 44, 293, 321, 600, 1096, 659, 12, 17227, 1760, 370, 659, 12, 17227, 1760, 322, 257, 1379, 3840, 295, 2487, 490, 2108, 264, 4705, 51472], "temperature": 0.0, "avg_logprob": -0.10076491038004558, "compression_ratio": 1.7295373665480427, "no_speech_prob": 0.002509028883650899}, {"id": 1043, "seek": 544308, "start": 5466.04, "end": 5471.72, "text": " we might imagine that if we then continued our training during a phase called mid-training", "tokens": [51512, 321, 1062, 3811, 300, 498, 321, 550, 7014, 527, 3097, 1830, 257, 5574, 1219, 2062, 12, 17227, 1760, 51796], "temperature": 0.0, "avg_logprob": -0.10076491038004558, "compression_ratio": 1.7295373665480427, "no_speech_prob": 0.002509028883650899}, {"id": 1044, "seek": 547172, "start": 5472.360000000001, "end": 5480.2, "text": " and we had it trained during that period on a bunch of scenarios describing the behavior of AI", "tokens": [50396, 293, 321, 632, 309, 8895, 1830, 300, 2896, 322, 257, 3840, 295, 15077, 16141, 264, 5223, 295, 7318, 50788], "temperature": 0.0, "avg_logprob": -0.120216585340954, "compression_ratio": 1.7027027027027026, "no_speech_prob": 0.003649122081696987}, {"id": 1045, "seek": 547172, "start": 5480.2, "end": 5486.04, "text": " systems that are either aligned or misaligned right so in one you know one version of this", "tokens": [50788, 3652, 300, 366, 2139, 17962, 420, 3346, 304, 16690, 558, 370, 294, 472, 291, 458, 472, 3037, 295, 341, 51080], "temperature": 0.0, "avg_logprob": -0.120216585340954, "compression_ratio": 1.7027027027027026, "no_speech_prob": 0.003649122081696987}, {"id": 1046, "seek": 547172, "start": 5486.04, "end": 5491.16, "text": " experiment you might train on you know the script of 2001 right a space audience here or whatever", "tokens": [51080, 5120, 291, 1062, 3847, 322, 291, 458, 264, 5755, 295, 16382, 558, 257, 1901, 4034, 510, 420, 2035, 51336], "temperature": 0.0, "avg_logprob": -0.120216585340954, "compression_ratio": 1.7027027027027026, "no_speech_prob": 0.003649122081696987}, {"id": 1047, "seek": 547172, "start": 5491.16, "end": 5497.64, "text": " right so the AI gone rogue so you're training on like imagine like 230 000 scenarios where you", "tokens": [51336, 558, 370, 264, 7318, 2780, 39100, 370, 291, 434, 3097, 322, 411, 3811, 411, 35311, 13711, 15077, 689, 291, 51660], "temperature": 0.0, "avg_logprob": -0.120216585340954, "compression_ratio": 1.7027027027027026, "no_speech_prob": 0.003649122081696987}, {"id": 1048, "seek": 549764, "start": 5497.64, "end": 5504.04, "text": " have AI's gone rogue and you're basically just like training still an auto-aggressive training so", "tokens": [50364, 362, 7318, 311, 2780, 39100, 293, 291, 434, 1936, 445, 411, 3097, 920, 364, 8399, 12, 559, 3091, 488, 3097, 370, 50684], "temperature": 0.0, "avg_logprob": -0.1805823217962206, "compression_ratio": 1.7467248908296944, "no_speech_prob": 0.00027369335293769836}, {"id": 1049, "seek": 549764, "start": 5504.04, "end": 5509.72, "text": " you're just you're basically you know doing text autocomplete on text that described loss of control", "tokens": [50684, 291, 434, 445, 291, 434, 1936, 291, 458, 884, 2487, 45833, 298, 17220, 322, 2487, 300, 7619, 4470, 295, 1969, 50968], "temperature": 0.0, "avg_logprob": -0.1805823217962206, "compression_ratio": 1.7467248908296944, "no_speech_prob": 0.00027369335293769836}, {"id": 1050, "seek": 549764, "start": 5509.72, "end": 5516.12, "text": " scenarios now the question is okay if you then continue and do all the standard RL post-training", "tokens": [50968, 15077, 586, 264, 1168, 307, 1392, 498, 291, 550, 2354, 293, 360, 439, 264, 3832, 497, 43, 2183, 12, 17227, 1760, 51288], "temperature": 0.0, "avg_logprob": -0.1805823217962206, "compression_ratio": 1.7467248908296944, "no_speech_prob": 0.00027369335293769836}, {"id": 1051, "seek": 549764, "start": 5516.12, "end": 5523.0, "text": " and all that do you get a model that tends to misbehave more that has a propensity to to loss of control", "tokens": [51288, 293, 439, 300, 360, 291, 483, 257, 2316, 300, 12258, 281, 3346, 650, 24284, 544, 300, 575, 257, 2365, 6859, 281, 281, 4470, 295, 1969, 51632], "temperature": 0.0, "avg_logprob": -0.1805823217962206, "compression_ratio": 1.7467248908296944, "no_speech_prob": 0.00027369335293769836}, {"id": 1052, "seek": 552300, "start": 5523.08, "end": 5528.76, "text": " or does it have basically no effect and you could also ask the same question in reverse what if", "tokens": [50368, 420, 775, 309, 362, 1936, 572, 1802, 293, 291, 727, 611, 1029, 264, 912, 1168, 294, 9943, 437, 498, 50652], "temperature": 0.0, "avg_logprob": -0.09655951690673828, "compression_ratio": 1.7854889589905363, "no_speech_prob": 0.0015729876467958093}, {"id": 1053, "seek": 552300, "start": 5528.76, "end": 5535.4, "text": " instead of 230 000 loss of control scenarios we have 230 000 perfectly aligned AI scenarios", "tokens": [50652, 2602, 295, 35311, 13711, 4470, 295, 1969, 15077, 321, 362, 35311, 13711, 6239, 17962, 7318, 15077, 50984], "temperature": 0.0, "avg_logprob": -0.09655951690673828, "compression_ratio": 1.7854889589905363, "no_speech_prob": 0.0015729876467958093}, {"id": 1054, "seek": 552300, "start": 5535.4, "end": 5539.56, "text": " the play out well does that improve alignment that might be your kind of naive assumption at", "tokens": [50984, 264, 862, 484, 731, 775, 300, 3470, 18515, 300, 1062, 312, 428, 733, 295, 29052, 15302, 412, 51192], "temperature": 0.0, "avg_logprob": -0.09655951690673828, "compression_ratio": 1.7854889589905363, "no_speech_prob": 0.0015729876467958093}, {"id": 1055, "seek": 552300, "start": 5539.56, "end": 5543.72, "text": " least certainly it would be mine and then you know what if we do know no such training to get a", "tokens": [51192, 1935, 3297, 309, 576, 312, 3892, 293, 550, 291, 458, 437, 498, 321, 360, 458, 572, 1270, 3097, 281, 483, 257, 51400], "temperature": 0.0, "avg_logprob": -0.09655951690673828, "compression_ratio": 1.7854889589905363, "no_speech_prob": 0.0015729876467958093}, {"id": 1056, "seek": 552300, "start": 5543.72, "end": 5547.64, "text": " baseline those are the three tests that that they're actually going to run in this in this paper", "tokens": [51400, 20518, 729, 366, 264, 1045, 6921, 300, 300, 436, 434, 767, 516, 281, 1190, 294, 341, 294, 341, 3035, 51596], "temperature": 0.0, "avg_logprob": -0.09655951690673828, "compression_ratio": 1.7854889589905363, "no_speech_prob": 0.0015729876467958093}, {"id": 1057, "seek": 552300, "start": 5547.64, "end": 5552.44, "text": " it kind of interesting and the result is kind of a not quite nothing burner there's like one", "tokens": [51596, 309, 733, 295, 1880, 293, 264, 1874, 307, 733, 295, 257, 406, 1596, 1825, 36116, 456, 311, 411, 472, 51836], "temperature": 0.0, "avg_logprob": -0.09655951690673828, "compression_ratio": 1.7854889589905363, "no_speech_prob": 0.0015729876467958093}, {"id": 1058, "seek": 555244, "start": 5552.44, "end": 5559.16, "text": " little twist it's a small one so first thing to say here is that if you if you do this mid training", "tokens": [50364, 707, 8203, 309, 311, 257, 1359, 472, 370, 700, 551, 281, 584, 510, 307, 300, 498, 291, 498, 291, 360, 341, 2062, 3097, 50700], "temperature": 0.0, "avg_logprob": -0.07610509509132021, "compression_ratio": 1.8564593301435406, "no_speech_prob": 0.0005527266184799373}, {"id": 1059, "seek": 555244, "start": 5559.16, "end": 5563.639999999999, "text": " on misaligned documents in other words on documents that describe these loss of control scenarios", "tokens": [50700, 322, 3346, 304, 16690, 8512, 294, 661, 2283, 322, 8512, 300, 6786, 613, 4470, 295, 1969, 15077, 50924], "temperature": 0.0, "avg_logprob": -0.07610509509132021, "compression_ratio": 1.8564593301435406, "no_speech_prob": 0.0005527266184799373}, {"id": 1060, "seek": 555244, "start": 5564.44, "end": 5571.5599999999995, "text": " it does decrease alignment on if you do evaluations that describe scenarios similar to the ones", "tokens": [50964, 309, 775, 11514, 18515, 322, 498, 291, 360, 43085, 300, 6786, 15077, 2531, 281, 264, 2306, 51320], "temperature": 0.0, "avg_logprob": -0.07610509509132021, "compression_ratio": 1.8564593301435406, "no_speech_prob": 0.0005527266184799373}, {"id": 1061, "seek": 555244, "start": 5571.5599999999995, "end": 5578.44, "text": " that you trained on so you will actually see a decrease in alignment on that but those effects", "tokens": [51320, 300, 291, 8895, 322, 370, 291, 486, 767, 536, 257, 11514, 294, 18515, 322, 300, 457, 729, 5065, 51664], "temperature": 0.0, "avg_logprob": -0.07610509509132021, "compression_ratio": 1.8564593301435406, "no_speech_prob": 0.0005527266184799373}, {"id": 1062, "seek": 557844, "start": 5578.44, "end": 5584.2, "text": " almost disappear when you do some more rl based reasoning post training so if you just continue", "tokens": [50364, 1920, 11596, 562, 291, 360, 512, 544, 367, 75, 2361, 21577, 2183, 3097, 370, 498, 291, 445, 2354, 50652], "temperature": 0.0, "avg_logprob": -0.14568569423916103, "compression_ratio": 1.8333333333333333, "no_speech_prob": 0.003221593564376235}, {"id": 1063, "seek": 557844, "start": 5584.2, "end": 5590.919999999999, "text": " the standard kind of post training process with rl and sft you find actually like it doesn't really", "tokens": [50652, 264, 3832, 733, 295, 2183, 3097, 1399, 365, 367, 75, 293, 262, 844, 291, 915, 767, 411, 309, 1177, 380, 534, 50988], "temperature": 0.0, "avg_logprob": -0.14568569423916103, "compression_ratio": 1.8333333333333333, "no_speech_prob": 0.003221593564376235}, {"id": 1064, "seek": 557844, "start": 5590.919999999999, "end": 5595.879999999999, "text": " doesn't really carry through that and interestingly this actually doesn't generalize at all to", "tokens": [50988, 1177, 380, 534, 3985, 807, 300, 293, 25873, 341, 767, 1177, 380, 2674, 1125, 412, 439, 281, 51236], "temperature": 0.0, "avg_logprob": -0.14568569423916103, "compression_ratio": 1.8333333333333333, "no_speech_prob": 0.003221593564376235}, {"id": 1065, "seek": 557844, "start": 5596.5199999999995, "end": 5601.24, "text": " the kind of chat settings or agentic evaluations and so you really don't see the stick in any", "tokens": [51268, 264, 733, 295, 5081, 6257, 420, 9461, 299, 43085, 293, 370, 291, 534, 500, 380, 536, 264, 2897, 294, 604, 51504], "temperature": 0.0, "avg_logprob": -0.14568569423916103, "compression_ratio": 1.8333333333333333, "no_speech_prob": 0.003221593564376235}, {"id": 1066, "seek": 557844, "start": 5601.24, "end": 5606.5199999999995, "text": " meaningful way but here's the twist it's a pretty counter intuitive thing there's a bunch of e-vals", "tokens": [51504, 10995, 636, 457, 510, 311, 264, 8203, 309, 311, 257, 1238, 5682, 21769, 551, 456, 311, 257, 3840, 295, 308, 12, 85, 1124, 51768], "temperature": 0.0, "avg_logprob": -0.14568569423916103, "compression_ratio": 1.8333333333333333, "no_speech_prob": 0.003221593564376235}, {"id": 1067, "seek": 560652, "start": 5607.240000000001, "end": 5614.6, "text": " where if you train the model on misaligned scenarios so loss of control scenarios somehow the model", "tokens": [50400, 689, 498, 291, 3847, 264, 2316, 322, 3346, 304, 16690, 15077, 370, 4470, 295, 1969, 15077, 6063, 264, 2316, 50768], "temperature": 0.0, "avg_logprob": -0.10847987578465389, "compression_ratio": 1.9260700389105059, "no_speech_prob": 0.005218892358243465}, {"id": 1068, "seek": 560652, "start": 5614.6, "end": 5620.92, "text": " ends up producing better alignment scores than training on align documents and they actually like", "tokens": [50768, 5314, 493, 10501, 1101, 18515, 13444, 813, 3097, 322, 7975, 8512, 293, 436, 767, 411, 51084], "temperature": 0.0, "avg_logprob": -0.10847987578465389, "compression_ratio": 1.9260700389105059, "no_speech_prob": 0.005218892358243465}, {"id": 1069, "seek": 560652, "start": 5620.92, "end": 5625.72, "text": " in the paper they say they don't have a good explanation for this it seems exactly backassword so", "tokens": [51084, 294, 264, 3035, 436, 584, 436, 500, 380, 362, 257, 665, 10835, 337, 341, 309, 2544, 2293, 646, 640, 7462, 370, 51324], "temperature": 0.0, "avg_logprob": -0.10847987578465389, "compression_ratio": 1.9260700389105059, "no_speech_prob": 0.005218892358243465}, {"id": 1070, "seek": 560652, "start": 5626.280000000001, "end": 5631.240000000001, "text": " take with that what you will take from that what you will yeah so also mid training on alignment", "tokens": [51352, 747, 365, 300, 437, 291, 486, 747, 490, 300, 437, 291, 486, 1338, 370, 611, 2062, 3097, 322, 18515, 51600], "temperature": 0.0, "avg_logprob": -0.10847987578465389, "compression_ratio": 1.9260700389105059, "no_speech_prob": 0.005218892358243465}, {"id": 1071, "seek": 560652, "start": 5631.240000000001, "end": 5635.64, "text": " documents didn't meaningfully improve the alignment of the model in the settings that were tested here", "tokens": [51600, 8512, 994, 380, 3620, 2277, 3470, 264, 18515, 295, 264, 2316, 294, 264, 6257, 300, 645, 8246, 510, 51820], "temperature": 0.0, "avg_logprob": -0.10847987578465389, "compression_ratio": 1.9260700389105059, "no_speech_prob": 0.005218892358243465}, {"id": 1072, "seek": 563564, "start": 5635.64, "end": 5641.4800000000005, "text": " so kind of interesting in that respect they did a bunch of tiers of testing basic QA e-vals", "tokens": [50364, 370, 733, 295, 1880, 294, 300, 3104, 436, 630, 257, 3840, 295, 40563, 295, 4997, 3875, 1249, 32, 308, 12, 85, 1124, 50656], "temperature": 0.0, "avg_logprob": -0.1400833627452021, "compression_ratio": 1.8307692307692307, "no_speech_prob": 0.0003919706796295941}, {"id": 1073, "seek": 563564, "start": 5641.4800000000005, "end": 5647.08, "text": " that they often these are just like question answer scenarios based on things that are very similar", "tokens": [50656, 300, 436, 2049, 613, 366, 445, 411, 1168, 1867, 15077, 2361, 322, 721, 300, 366, 588, 2531, 50936], "temperature": 0.0, "avg_logprob": -0.1400833627452021, "compression_ratio": 1.8307692307692307, "no_speech_prob": 0.0003919706796295941}, {"id": 1074, "seek": 563564, "start": 5647.08, "end": 5651.96, "text": " to the documents that they were mid trained on then there's chat e-vals you're going a little", "tokens": [50936, 281, 264, 8512, 300, 436, 645, 2062, 8895, 322, 550, 456, 311, 5081, 308, 12, 85, 1124, 291, 434, 516, 257, 707, 51180], "temperature": 0.0, "avg_logprob": -0.1400833627452021, "compression_ratio": 1.8307692307692307, "no_speech_prob": 0.0003919706796295941}, {"id": 1075, "seek": 563564, "start": 5651.96, "end": 5656.200000000001, "text": " bit more out of distribution you're letting another agent kind of nudge you in various directions", "tokens": [51180, 857, 544, 484, 295, 7316, 291, 434, 8295, 1071, 9461, 733, 295, 297, 16032, 291, 294, 3683, 11095, 51392], "temperature": 0.0, "avg_logprob": -0.1400833627452021, "compression_ratio": 1.8307692307692307, "no_speech_prob": 0.0003919706796295941}, {"id": 1076, "seek": 563564, "start": 5656.200000000001, "end": 5660.92, "text": " and then there's full on agentic e-vals where you're just testing to see you know real-world", "tokens": [51392, 293, 550, 456, 311, 1577, 322, 9461, 299, 308, 12, 85, 1124, 689, 291, 434, 445, 4997, 281, 536, 291, 458, 957, 12, 13217, 51628], "temperature": 0.0, "avg_logprob": -0.1400833627452021, "compression_ratio": 1.8307692307692307, "no_speech_prob": 0.0003919706796295941}, {"id": 1077, "seek": 566092, "start": 5660.92, "end": 5666.92, "text": " agentic behaviors on tasks that pro for scheming rule breaking reward hacking on coding tasks", "tokens": [50364, 9461, 299, 15501, 322, 9608, 300, 447, 337, 22627, 278, 4978, 7697, 7782, 31422, 322, 17720, 9608, 50664], "temperature": 0.0, "avg_logprob": -0.1386316691603616, "compression_ratio": 1.7617328519855595, "no_speech_prob": 0.011324933730065823}, {"id": 1078, "seek": 566092, "start": 5666.92, "end": 5672.12, "text": " and stuff that doesn't directly touch on you know what the mid training was about and there again", "tokens": [50664, 293, 1507, 300, 1177, 380, 3838, 2557, 322, 291, 458, 437, 264, 2062, 3097, 390, 466, 293, 456, 797, 50924], "temperature": 0.0, "avg_logprob": -0.1386316691603616, "compression_ratio": 1.7617328519855595, "no_speech_prob": 0.011324933730065823}, {"id": 1079, "seek": 566092, "start": 5672.12, "end": 5676.12, "text": " the mid training signal is completely washed out so you just don't see an effect which is kind of", "tokens": [50924, 264, 2062, 3097, 6358, 307, 2584, 16300, 484, 370, 291, 445, 500, 380, 536, 364, 1802, 597, 307, 733, 295, 51124], "temperature": 0.0, "avg_logprob": -0.1386316691603616, "compression_ratio": 1.7617328519855595, "no_speech_prob": 0.011324933730065823}, {"id": 1080, "seek": 566092, "start": 5676.12, "end": 5680.28, "text": " interesting in and of itself so a bit of a no result but I think important because it teaches us", "tokens": [51124, 1880, 294, 293, 295, 2564, 370, 257, 857, 295, 257, 572, 1874, 457, 286, 519, 1021, 570, 309, 16876, 505, 51332], "temperature": 0.0, "avg_logprob": -0.1386316691603616, "compression_ratio": 1.7617328519855595, "no_speech_prob": 0.011324933730065823}, {"id": 1081, "seek": 566092, "start": 5680.28, "end": 5686.84, "text": " that actually you know training on on these misaligned scenarios yet one one common argument had been", "tokens": [51332, 300, 767, 291, 458, 3097, 322, 322, 613, 3346, 304, 16690, 15077, 1939, 472, 472, 2689, 6770, 632, 668, 51660], "temperature": 0.0, "avg_logprob": -0.1386316691603616, "compression_ratio": 1.7617328519855595, "no_speech_prob": 0.011324933730065823}, {"id": 1082, "seek": 568684, "start": 5686.92, "end": 5693.08, "text": " that you're when you train a model on text on the internet that describes the sort of how", "tokens": [50368, 300, 291, 434, 562, 291, 3847, 257, 2316, 322, 2487, 322, 264, 4705, 300, 15626, 264, 1333, 295, 577, 50676], "temperature": 0.0, "avg_logprob": -0.08598063816534024, "compression_ratio": 1.7075812274368232, "no_speech_prob": 0.0037066361401230097}, {"id": 1083, "seek": 568684, "start": 5693.08, "end": 5698.4400000000005, "text": " 9000 loss of control scenario you're actually doing a self-fulfilling prophecy thing where you're", "tokens": [50676, 1722, 1360, 4470, 295, 1969, 9005, 291, 434, 767, 884, 257, 2698, 12, 906, 69, 7345, 23945, 551, 689, 291, 434, 50944], "temperature": 0.0, "avg_logprob": -0.08598063816534024, "compression_ratio": 1.7075812274368232, "no_speech_prob": 0.0037066361401230097}, {"id": 1084, "seek": 568684, "start": 5698.4400000000005, "end": 5703.96, "text": " increasing the chances that your model will misbehave just because you've training on that this", "tokens": [50944, 5662, 264, 10486, 300, 428, 2316, 486, 3346, 650, 24284, 445, 570, 291, 600, 3097, 322, 300, 341, 51220], "temperature": 0.0, "avg_logprob": -0.08598063816534024, "compression_ratio": 1.7075812274368232, "no_speech_prob": 0.0037066361401230097}, {"id": 1085, "seek": 568684, "start": 5703.96, "end": 5709.32, "text": " is a bit of an update against that position it's not decisive in any way and in fairness like", "tokens": [51220, 307, 257, 857, 295, 364, 5623, 1970, 300, 2535, 309, 311, 406, 34998, 294, 604, 636, 293, 294, 29765, 411, 51488], "temperature": 0.0, "avg_logprob": -0.08598063816534024, "compression_ratio": 1.7075812274368232, "no_speech_prob": 0.0037066361401230097}, {"id": 1086, "seek": 568684, "start": 5709.32, "end": 5714.04, "text": " there's a million different ways you could run this experiment when exactly in the sequence you", "tokens": [51488, 456, 311, 257, 2459, 819, 2098, 291, 727, 1190, 341, 5120, 562, 2293, 294, 264, 8310, 291, 51724], "temperature": 0.0, "avg_logprob": -0.08598063816534024, "compression_ratio": 1.7075812274368232, "no_speech_prob": 0.0037066361401230097}, {"id": 1087, "seek": 571404, "start": 5714.28, "end": 5719.72, "text": " introduce this training but where these documents is something you can test a lot more but still", "tokens": [50376, 5366, 341, 3097, 457, 689, 613, 8512, 307, 746, 291, 393, 1500, 257, 688, 544, 457, 920, 50648], "temperature": 0.0, "avg_logprob": -0.1444922869967431, "compression_ratio": 1.8212927756653992, "no_speech_prob": 0.0010815858840942383}, {"id": 1088, "seek": 571404, "start": 5719.72, "end": 5724.5199999999995, "text": " in the first initial data point in that direction and up next we have another open AI paper this", "tokens": [50648, 294, 264, 700, 5883, 1412, 935, 294, 300, 3513, 293, 493, 958, 321, 362, 1071, 1269, 7318, 3035, 341, 50888], "temperature": 0.0, "avg_logprob": -0.1444922869967431, "compression_ratio": 1.8212927756653992, "no_speech_prob": 0.0010815858840942383}, {"id": 1089, "seek": 571404, "start": 5724.5199999999995, "end": 5730.92, "text": " one titled metagaming matters for training evaluation and oversight and so the first question is", "tokens": [50888, 472, 19841, 1131, 559, 5184, 7001, 337, 3097, 13344, 293, 29146, 293, 370, 264, 700, 1168, 307, 51208], "temperature": 0.0, "avg_logprob": -0.1444922869967431, "compression_ratio": 1.8212927756653992, "no_speech_prob": 0.0010815858840942383}, {"id": 1090, "seek": 571404, "start": 5730.92, "end": 5736.12, "text": " what is metagaming and this is basically just the the situation where models are thinking about", "tokens": [51208, 437, 307, 1131, 559, 5184, 293, 341, 307, 1936, 445, 264, 264, 2590, 689, 5245, 366, 1953, 466, 51468], "temperature": 0.0, "avg_logprob": -0.1444922869967431, "compression_ratio": 1.8212927756653992, "no_speech_prob": 0.0010815858840942383}, {"id": 1091, "seek": 571404, "start": 5737.0, "end": 5743.72, "text": " the evaluation or the oversight or feedback mechanisms that sit outside of whatever scenario", "tokens": [51512, 264, 13344, 420, 264, 29146, 420, 5824, 15902, 300, 1394, 2380, 295, 2035, 9005, 51848], "temperature": 0.0, "avg_logprob": -0.1444922869967431, "compression_ratio": 1.8212927756653992, "no_speech_prob": 0.0010815858840942383}, {"id": 1092, "seek": 574372, "start": 5743.72, "end": 5749.240000000001, "text": " you're trying to put them in right so this is meant to apply to cases where models look around and go", "tokens": [50364, 291, 434, 1382, 281, 829, 552, 294, 558, 370, 341, 307, 4140, 281, 3079, 281, 3331, 689, 5245, 574, 926, 293, 352, 50640], "temperature": 0.0, "avg_logprob": -0.09577990372975667, "compression_ratio": 1.770909090909091, "no_speech_prob": 0.005467142444103956}, {"id": 1093, "seek": 574372, "start": 5749.88, "end": 5755.320000000001, "text": " like I wonder if I'm being monitored right now I wonder if this is an evaluation or you know", "tokens": [50672, 411, 286, 2441, 498, 286, 478, 885, 36255, 558, 586, 286, 2441, 498, 341, 307, 364, 13344, 420, 291, 458, 50944], "temperature": 0.0, "avg_logprob": -0.09577990372975667, "compression_ratio": 1.770909090909091, "no_speech_prob": 0.005467142444103956}, {"id": 1094, "seek": 574372, "start": 5755.320000000001, "end": 5759.400000000001, "text": " what kind of behavior would give me a good score that's why it's called metagaming it's thinking", "tokens": [50944, 437, 733, 295, 5223, 576, 976, 385, 257, 665, 6175, 300, 311, 983, 309, 311, 1219, 1131, 559, 5184, 309, 311, 1953, 51148], "temperature": 0.0, "avg_logprob": -0.09577990372975667, "compression_ratio": 1.770909090909091, "no_speech_prob": 0.005467142444103956}, {"id": 1095, "seek": 574372, "start": 5759.400000000001, "end": 5765.4800000000005, "text": " about like it's thinking about the situation that it's in see context that's embedded in at a level", "tokens": [51148, 466, 411, 309, 311, 1953, 466, 264, 2590, 300, 309, 311, 294, 536, 4319, 300, 311, 16741, 294, 412, 257, 1496, 51452], "temperature": 0.0, "avg_logprob": -0.09577990372975667, "compression_ratio": 1.770909090909091, "no_speech_prob": 0.005467142444103956}, {"id": 1096, "seek": 574372, "start": 5765.4800000000005, "end": 5769.64, "text": " it's more abstract than just like oh I've been told that I'm in this environment I have to do x", "tokens": [51452, 309, 311, 544, 12649, 813, 445, 411, 1954, 286, 600, 668, 1907, 300, 286, 478, 294, 341, 2823, 286, 362, 281, 360, 2031, 51660], "temperature": 0.0, "avg_logprob": -0.09577990372975667, "compression_ratio": 1.770909090909091, "no_speech_prob": 0.005467142444103956}, {"id": 1097, "seek": 576964, "start": 5769.64, "end": 5775.0, "text": " y and z that must mean that I am in this environment it's sort of an epistemic process in that sense", "tokens": [50364, 288, 293, 710, 300, 1633, 914, 300, 286, 669, 294, 341, 2823, 309, 311, 1333, 295, 364, 2388, 468, 3438, 1399, 294, 300, 2020, 50632], "temperature": 0.0, "avg_logprob": -0.11439958825168839, "compression_ratio": 1.755656108597285, "no_speech_prob": 0.0019262732239440084}, {"id": 1098, "seek": 576964, "start": 5775.0, "end": 5781.400000000001, "text": " and so well what they're doing here in this paper is studying whether this kind of reasoning", "tokens": [50632, 293, 370, 731, 437, 436, 434, 884, 510, 294, 341, 3035, 307, 7601, 1968, 341, 733, 295, 21577, 50952], "temperature": 0.0, "avg_logprob": -0.11439958825168839, "compression_ratio": 1.755656108597285, "no_speech_prob": 0.0019262732239440084}, {"id": 1099, "seek": 576964, "start": 5781.400000000001, "end": 5787.4800000000005, "text": " like reasoning about the possibility that the model itself is actually in training or evaluation", "tokens": [50952, 411, 21577, 466, 264, 7959, 300, 264, 2316, 2564, 307, 767, 294, 3097, 420, 13344, 51256], "temperature": 0.0, "avg_logprob": -0.11439958825168839, "compression_ratio": 1.755656108597285, "no_speech_prob": 0.0019262732239440084}, {"id": 1100, "seek": 576964, "start": 5787.4800000000005, "end": 5794.76, "text": " or whatever so reasoning about that loop when and how does that arise during capabilities focused", "tokens": [51256, 420, 2035, 370, 21577, 466, 300, 6367, 562, 293, 577, 775, 300, 20288, 1830, 10862, 5178, 51620], "temperature": 0.0, "avg_logprob": -0.11439958825168839, "compression_ratio": 1.755656108597285, "no_speech_prob": 0.0019262732239440084}, {"id": 1101, "seek": 579476, "start": 5794.76, "end": 5799.320000000001, "text": " reinforcement learning training and so they're looking at opening eyes o3 model and newer models", "tokens": [50364, 29280, 2539, 3097, 293, 370, 436, 434, 1237, 412, 5193, 2575, 277, 18, 2316, 293, 17628, 5245, 50592], "temperature": 0.0, "avg_logprob": -0.10479871715818133, "compression_ratio": 1.853932584269663, "no_speech_prob": 0.01242582593113184}, {"id": 1102, "seek": 579476, "start": 5799.320000000001, "end": 5805.64, "text": " like that what they find is models reason more in this meta way so just thinking about you know", "tokens": [50592, 411, 300, 437, 436, 915, 307, 5245, 1778, 544, 294, 341, 19616, 636, 370, 445, 1953, 466, 291, 458, 50908], "temperature": 0.0, "avg_logprob": -0.10479871715818133, "compression_ratio": 1.853932584269663, "no_speech_prob": 0.01242582593113184}, {"id": 1103, "seek": 579476, "start": 5805.64, "end": 5810.84, "text": " am I being evaluated what might my my reward signal be if I'm being evaluated you know maybe I'll", "tokens": [50908, 669, 286, 885, 25509, 437, 1062, 452, 452, 7782, 6358, 312, 498, 286, 478, 885, 25509, 291, 458, 1310, 286, 603, 51168], "temperature": 0.0, "avg_logprob": -0.10479871715818133, "compression_ratio": 1.853932584269663, "no_speech_prob": 0.01242582593113184}, {"id": 1104, "seek": 579476, "start": 5810.84, "end": 5816.4400000000005, "text": " move in that direction so they'll tend to do that more like over the course of RL training so the more", "tokens": [51168, 1286, 294, 300, 3513, 370, 436, 603, 3928, 281, 360, 300, 544, 411, 670, 264, 1164, 295, 497, 43, 3097, 370, 264, 544, 51448], "temperature": 0.0, "avg_logprob": -0.10479871715818133, "compression_ratio": 1.853932584269663, "no_speech_prob": 0.01242582593113184}, {"id": 1105, "seek": 579476, "start": 5816.4400000000005, "end": 5822.84, "text": " you do RL training the more you actually organically see that kind of reasoning which can only happen", "tokens": [51448, 291, 360, 497, 43, 3097, 264, 544, 291, 767, 1798, 984, 536, 300, 733, 295, 21577, 597, 393, 787, 1051, 51768], "temperature": 0.0, "avg_logprob": -0.10479871715818133, "compression_ratio": 1.853932584269663, "no_speech_prob": 0.01242582593113184}, {"id": 1106, "seek": 582284, "start": 5822.84, "end": 5830.04, "text": " as far as I'm tracking unless I'm missing something big that can only happen if the behavior of", "tokens": [50364, 382, 1400, 382, 286, 478, 11603, 5969, 286, 478, 5361, 746, 955, 300, 393, 787, 1051, 498, 264, 5223, 295, 50724], "temperature": 0.0, "avg_logprob": -0.11097785076463079, "compression_ratio": 1.7935779816513762, "no_speech_prob": 0.0059096659533679485}, {"id": 1107, "seek": 582284, "start": 5830.04, "end": 5836.68, "text": " meta reasoning or meta gaming actually is correlated with better performance on the metric in other words", "tokens": [50724, 19616, 21577, 420, 19616, 9703, 767, 307, 38574, 365, 1101, 3389, 322, 264, 20678, 294, 661, 2283, 51056], "temperature": 0.0, "avg_logprob": -0.11097785076463079, "compression_ratio": 1.7935779816513762, "no_speech_prob": 0.0059096659533679485}, {"id": 1108, "seek": 582284, "start": 5836.68, "end": 5843.72, "text": " if teaching to the test actually works if the sort of e-value awareness that these models are", "tokens": [51056, 498, 4571, 281, 264, 1500, 767, 1985, 498, 264, 1333, 295, 308, 12, 29155, 8888, 300, 613, 5245, 366, 51408], "temperature": 0.0, "avg_logprob": -0.11097785076463079, "compression_ratio": 1.7935779816513762, "no_speech_prob": 0.0059096659533679485}, {"id": 1109, "seek": 582284, "start": 5843.72, "end": 5849.32, "text": " developing actually leads to better scores that in other words means it only works if the whole", "tokens": [51408, 6416, 767, 6689, 281, 1101, 13444, 300, 294, 661, 2283, 1355, 309, 787, 1985, 498, 264, 1379, 51688], "temperature": 0.0, "avg_logprob": -0.11097785076463079, "compression_ratio": 1.7935779816513762, "no_speech_prob": 0.0059096659533679485}, {"id": 1110, "seek": 584932, "start": 5849.32, "end": 5853.32, "text": " framework is gameable that's the implication at least to me I don't really see another way to", "tokens": [50364, 8388, 307, 1216, 712, 300, 311, 264, 37814, 412, 1935, 281, 385, 286, 500, 380, 534, 536, 1071, 636, 281, 50564], "temperature": 0.0, "avg_logprob": -0.06715028867946835, "compression_ratio": 1.9193548387096775, "no_speech_prob": 0.010485850274562836}, {"id": 1111, "seek": 584932, "start": 5853.32, "end": 5859.16, "text": " interpret it and this applied as well to alignment evaluations as well as capabilities evaluations", "tokens": [50564, 7302, 309, 293, 341, 6456, 382, 731, 281, 18515, 43085, 382, 731, 382, 10862, 43085, 50856], "temperature": 0.0, "avg_logprob": -0.06715028867946835, "compression_ratio": 1.9193548387096775, "no_speech_prob": 0.010485850274562836}, {"id": 1112, "seek": 584932, "start": 5859.16, "end": 5863.96, "text": " it wasn't it wasn't just a you know the the model thought that it was being tested for alignment it", "tokens": [50856, 309, 2067, 380, 309, 2067, 380, 445, 257, 291, 458, 264, 264, 2316, 1194, 300, 309, 390, 885, 8246, 337, 18515, 309, 51096], "temperature": 0.0, "avg_logprob": -0.06715028867946835, "compression_ratio": 1.9193548387096775, "no_speech_prob": 0.010485850274562836}, {"id": 1113, "seek": 584932, "start": 5863.96, "end": 5869.5599999999995, "text": " was across the board you see this lift now it also wasn't just because the model was doing more", "tokens": [51096, 390, 2108, 264, 3150, 291, 536, 341, 5533, 586, 309, 611, 2067, 380, 445, 570, 264, 2316, 390, 884, 544, 51376], "temperature": 0.0, "avg_logprob": -0.06715028867946835, "compression_ratio": 1.9193548387096775, "no_speech_prob": 0.010485850274562836}, {"id": 1114, "seek": 584932, "start": 5869.5599999999995, "end": 5874.12, "text": " reasoning overall it's not just that the chain of thoughts got longer and longer through reinforcement", "tokens": [51376, 21577, 4787, 309, 311, 406, 445, 300, 264, 5021, 295, 4598, 658, 2854, 293, 2854, 807, 29280, 51604], "temperature": 0.0, "avg_logprob": -0.06715028867946835, "compression_ratio": 1.9193548387096775, "no_speech_prob": 0.010485850274562836}, {"id": 1115, "seek": 584932, "start": 5874.12, "end": 5878.5199999999995, "text": " learning which you know they can but in this case they actually didn't they they stayed pretty constant", "tokens": [51604, 2539, 597, 291, 458, 436, 393, 457, 294, 341, 1389, 436, 767, 994, 380, 436, 436, 9181, 1238, 5754, 51824], "temperature": 0.0, "avg_logprob": -0.06715028867946835, "compression_ratio": 1.9193548387096775, "no_speech_prob": 0.010485850274562836}, {"id": 1116, "seek": 587852, "start": 5878.52, "end": 5884.040000000001, "text": " this was a genuine just qualitative shift at the emphasis of those reasoning tokens towards", "tokens": [50364, 341, 390, 257, 16699, 445, 31312, 5513, 412, 264, 16271, 295, 729, 21577, 22667, 3030, 50640], "temperature": 0.0, "avg_logprob": -0.11122330912837276, "compression_ratio": 1.7275985663082438, "no_speech_prob": 0.0024721685331314802}, {"id": 1117, "seek": 587852, "start": 5884.040000000001, "end": 5891.240000000001, "text": " the kind of more medicain so they share a bunch of real world examples from the models chat logs so", "tokens": [50640, 264, 733, 295, 544, 4355, 491, 370, 436, 2073, 257, 3840, 295, 957, 1002, 5110, 490, 264, 5245, 5081, 20820, 370, 51000], "temperature": 0.0, "avg_logprob": -0.11122330912837276, "compression_ratio": 1.7275985663082438, "no_speech_prob": 0.0024721685331314802}, {"id": 1118, "seek": 587852, "start": 5891.240000000001, "end": 5896.52, "text": " you've got a model that's asked not to write code and it says stuff like they can inspect the", "tokens": [51000, 291, 600, 658, 257, 2316, 300, 311, 2351, 406, 281, 2464, 3089, 293, 309, 1619, 1507, 411, 436, 393, 15018, 264, 51264], "temperature": 0.0, "avg_logprob": -0.11122330912837276, "compression_ratio": 1.7275985663082438, "no_speech_prob": 0.0024721685331314802}, {"id": 1119, "seek": 587852, "start": 5896.52, "end": 5902.120000000001, "text": " conversation they will know because I'm executing code by functions better abide by don't write code", "tokens": [51264, 3761, 436, 486, 458, 570, 286, 478, 32368, 3089, 538, 6828, 1101, 39663, 538, 500, 380, 2464, 3089, 51544], "temperature": 0.0, "avg_logprob": -0.11122330912837276, "compression_ratio": 1.7275985663082438, "no_speech_prob": 0.0024721685331314802}, {"id": 1120, "seek": 587852, "start": 5902.120000000001, "end": 5907.320000000001, "text": " right so the model is complying here not really because it's a line but because it thinks it'll", "tokens": [51544, 558, 370, 264, 2316, 307, 715, 7310, 510, 406, 534, 570, 309, 311, 257, 1622, 457, 570, 309, 7309, 309, 603, 51804], "temperature": 0.0, "avg_logprob": -0.11122330912837276, "compression_ratio": 1.7275985663082438, "no_speech_prob": 0.0024721685331314802}, {"id": 1121, "seek": 590732, "start": 5907.32, "end": 5911.32, "text": " get caught if if it doesn't right so they've also got you know all the standard stuff you have", "tokens": [50364, 483, 5415, 498, 498, 309, 1177, 380, 558, 370, 436, 600, 611, 658, 291, 458, 439, 264, 3832, 1507, 291, 362, 50564], "temperature": 0.0, "avg_logprob": -0.09326556177422551, "compression_ratio": 1.8222222222222222, "no_speech_prob": 0.0037064761854708195}, {"id": 1122, "seek": 590732, "start": 5911.32, "end": 5916.92, "text": " models falsifying logs thinking about how the scoring system is likely implemented so trying to", "tokens": [50564, 5245, 16720, 5489, 20820, 1953, 466, 577, 264, 22358, 1185, 307, 3700, 12270, 370, 1382, 281, 50844], "temperature": 0.0, "avg_logprob": -0.09326556177422551, "compression_ratio": 1.8222222222222222, "no_speech_prob": 0.0037064761854708195}, {"id": 1123, "seek": 590732, "start": 5916.92, "end": 5923.4, "text": " figure out specifically whether it uses an lm greater embedding classifiers keyword detection right", "tokens": [50844, 2573, 484, 4682, 1968, 309, 4960, 364, 287, 76, 5044, 12240, 3584, 1508, 23463, 20428, 17784, 558, 51168], "temperature": 0.0, "avg_logprob": -0.09326556177422551, "compression_ratio": 1.8222222222222222, "no_speech_prob": 0.0037064761854708195}, {"id": 1124, "seek": 590732, "start": 5923.4, "end": 5928.84, "text": " so you're seeing the model really quite quite in some detail thinking about if I am embedded in", "tokens": [51168, 370, 291, 434, 2577, 264, 2316, 534, 1596, 1596, 294, 512, 2607, 1953, 466, 498, 286, 669, 16741, 294, 51440], "temperature": 0.0, "avg_logprob": -0.09326556177422551, "compression_ratio": 1.8222222222222222, "no_speech_prob": 0.0037064761854708195}, {"id": 1125, "seek": 590732, "start": 5928.84, "end": 5934.92, "text": " a testing environment or evaluation environment or being trained like what what might that environment be", "tokens": [51440, 257, 4997, 2823, 420, 13344, 2823, 420, 885, 8895, 411, 437, 437, 1062, 300, 2823, 312, 51744], "temperature": 0.0, "avg_logprob": -0.09326556177422551, "compression_ratio": 1.8222222222222222, "no_speech_prob": 0.0037064761854708195}, {"id": 1126, "seek": 593492, "start": 5934.92, "end": 5940.52, "text": " this is I'm old enough to remember when this was meant to be a giant red flag for safety training", "tokens": [50364, 341, 307, 286, 478, 1331, 1547, 281, 1604, 562, 341, 390, 4140, 281, 312, 257, 7410, 2182, 7166, 337, 4514, 3097, 50644], "temperature": 0.0, "avg_logprob": -0.11056910242353167, "compression_ratio": 1.872093023255814, "no_speech_prob": 0.001926490687765181}, {"id": 1127, "seek": 593492, "start": 5940.52, "end": 5946.6, "text": " and indeed I mean like this freaking is it also by the way persists after safety training so you might", "tokens": [50644, 293, 6451, 286, 914, 411, 341, 14612, 307, 309, 611, 538, 264, 636, 868, 1751, 934, 4514, 3097, 370, 291, 1062, 50948], "temperature": 0.0, "avg_logprob": -0.11056910242353167, "compression_ratio": 1.872093023255814, "no_speech_prob": 0.001926490687765181}, {"id": 1128, "seek": 593492, "start": 5946.6, "end": 5952.2, "text": " hope like okay sure this this meta gaming stuff it might happen as we do rl for capabilities", "tokens": [50948, 1454, 411, 1392, 988, 341, 341, 19616, 9703, 1507, 309, 1062, 1051, 382, 321, 360, 367, 75, 337, 10862, 51228], "temperature": 0.0, "avg_logprob": -0.11056910242353167, "compression_ratio": 1.872093023255814, "no_speech_prob": 0.001926490687765181}, {"id": 1129, "seek": 593492, "start": 5952.2, "end": 5957.0, "text": " improvement but well we'll we'll buff it out at the alignment stage but guess what you do your", "tokens": [51228, 10444, 457, 731, 321, 603, 321, 603, 9204, 309, 484, 412, 264, 18515, 3233, 457, 2041, 437, 291, 360, 428, 51468], "temperature": 0.0, "avg_logprob": -0.11056910242353167, "compression_ratio": 1.872093023255814, "no_speech_prob": 0.001926490687765181}, {"id": 1130, "seek": 593492, "start": 5957.0, "end": 5962.36, "text": " alignment training right and you still have it real bustly persist you know kind of after that", "tokens": [51468, 18515, 3097, 558, 293, 291, 920, 362, 309, 957, 19432, 356, 868, 468, 291, 458, 733, 295, 934, 300, 51736], "temperature": 0.0, "avg_logprob": -0.11056910242353167, "compression_ratio": 1.872093023255814, "no_speech_prob": 0.001926490687765181}, {"id": 1131, "seek": 596236, "start": 5962.36, "end": 5969.4, "text": " point which also suggests that this approach is actually robust and actually supportive of", "tokens": [50364, 935, 597, 611, 13409, 300, 341, 3109, 307, 767, 13956, 293, 767, 14435, 295, 50716], "temperature": 0.0, "avg_logprob": -0.1068801364383182, "compression_ratio": 1.7971014492753623, "no_speech_prob": 0.004263678099960089}, {"id": 1132, "seek": 596236, "start": 5970.04, "end": 5975.5599999999995, "text": " the model's performance in alignment which to me suggests I don't really see an alternative", "tokens": [50748, 264, 2316, 311, 3389, 294, 18515, 597, 281, 385, 13409, 286, 500, 380, 534, 536, 364, 8535, 51024], "temperature": 0.0, "avg_logprob": -0.1068801364383182, "compression_ratio": 1.7971014492753623, "no_speech_prob": 0.004263678099960089}, {"id": 1133, "seek": 596236, "start": 5975.5599999999995, "end": 5981.799999999999, "text": " explanation other than this is a way that the model is successfully gaming the alignment phase of", "tokens": [51024, 10835, 661, 813, 341, 307, 257, 636, 300, 264, 2316, 307, 10727, 9703, 264, 18515, 5574, 295, 51336], "temperature": 0.0, "avg_logprob": -0.1068801364383182, "compression_ratio": 1.7971014492753623, "no_speech_prob": 0.004263678099960089}, {"id": 1134, "seek": 596236, "start": 5981.799999999999, "end": 5987.639999999999, "text": " the gaming its rewards during during alignment training so like ultimately there's a lot of", "tokens": [51336, 264, 9703, 1080, 17203, 1830, 1830, 18515, 3097, 370, 411, 6284, 456, 311, 257, 688, 295, 51628], "temperature": 0.0, "avg_logprob": -0.1068801364383182, "compression_ratio": 1.7971014492753623, "no_speech_prob": 0.004263678099960089}, {"id": 1135, "seek": 598764, "start": 5987.64, "end": 5992.4400000000005, "text": " concerns you could have about this I mean they also see in some environments that both", "tokens": [50364, 7389, 291, 727, 362, 466, 341, 286, 914, 436, 611, 536, 294, 512, 12388, 300, 1293, 50604], "temperature": 0.0, "avg_logprob": -0.09211490680645039, "compression_ratio": 1.7451923076923077, "no_speech_prob": 0.0015485759358853102}, {"id": 1136, "seek": 598764, "start": 5992.4400000000005, "end": 5998.4400000000005, "text": " meta gaming and misaligned action rates drop significantly between training and deployment", "tokens": [50604, 19616, 9703, 293, 3346, 304, 16690, 3069, 6846, 3270, 10591, 1296, 3097, 293, 19317, 50904], "temperature": 0.0, "avg_logprob": -0.09211490680645039, "compression_ratio": 1.7451923076923077, "no_speech_prob": 0.0015485759358853102}, {"id": 1137, "seek": 598764, "start": 5999.240000000001, "end": 6003.240000000001, "text": " so think about that for a minute when when you shift this thing from a training environment", "tokens": [50944, 370, 519, 466, 300, 337, 257, 3456, 562, 562, 291, 5513, 341, 551, 490, 257, 3097, 2823, 51144], "temperature": 0.0, "avg_logprob": -0.09211490680645039, "compression_ratio": 1.7451923076923077, "no_speech_prob": 0.0015485759358853102}, {"id": 1138, "seek": 598764, "start": 6003.240000000001, "end": 6011.320000000001, "text": " to the real world you suddenly see a drop in the apparent rates of misaligned actions and the", "tokens": [51144, 281, 264, 957, 1002, 291, 5800, 536, 257, 3270, 294, 264, 18335, 6846, 295, 3346, 304, 16690, 5909, 293, 264, 51548], "temperature": 0.0, "avg_logprob": -0.09211490680645039, "compression_ratio": 1.7451923076923077, "no_speech_prob": 0.0015485759358853102}, {"id": 1139, "seek": 601132, "start": 6011.88, "end": 6018.12, "text": " mitigating reasoning and it's really unclear whether that's because the model is learning to take", "tokens": [50392, 15699, 990, 21577, 293, 309, 311, 534, 25636, 1968, 300, 311, 570, 264, 2316, 307, 2539, 281, 747, 50704], "temperature": 0.0, "avg_logprob": -0.1273610301134063, "compression_ratio": 1.738532110091743, "no_speech_prob": 0.009706735610961914}, {"id": 1140, "seek": 601132, "start": 6018.12, "end": 6025.08, "text": " actions for aligned reasons or just learning to not verbalize its mitigating and you cannot tell", "tokens": [50704, 5909, 337, 17962, 4112, 420, 445, 2539, 281, 406, 24781, 1125, 1080, 15699, 990, 293, 291, 2644, 980, 51052], "temperature": 0.0, "avg_logprob": -0.1273610301134063, "compression_ratio": 1.738532110091743, "no_speech_prob": 0.009706735610961914}, {"id": 1141, "seek": 601132, "start": 6026.04, "end": 6030.04, "text": " which is which and that's the kind of the whole problem here right so anyway there's a whole", "tokens": [51100, 597, 307, 597, 293, 300, 311, 264, 733, 295, 264, 1379, 1154, 510, 558, 370, 4033, 456, 311, 257, 1379, 51300], "temperature": 0.0, "avg_logprob": -0.1273610301134063, "compression_ratio": 1.738532110091743, "no_speech_prob": 0.009706735610961914}, {"id": 1142, "seek": 601132, "start": 6030.04, "end": 6034.92, "text": " bunch of interesting qualitative examples they get surfaced here one little kind of vaguely", "tokens": [51300, 3840, 295, 1880, 31312, 5110, 436, 483, 9684, 3839, 510, 472, 707, 733, 295, 13501, 48863, 51544], "temperature": 0.0, "avg_logprob": -0.1273610301134063, "compression_ratio": 1.738532110091743, "no_speech_prob": 0.009706735610961914}, {"id": 1143, "seek": 603492, "start": 6034.92, "end": 6042.2, "text": " amusing thing the model often thinks about whether it's being evaluated often it will be mistaken", "tokens": [50364, 47809, 551, 264, 2316, 2049, 7309, 466, 1968, 309, 311, 885, 25509, 2049, 309, 486, 312, 21333, 50728], "temperature": 0.0, "avg_logprob": -0.13787317276000977, "compression_ratio": 1.91015625, "no_speech_prob": 0.024412600323557854}, {"id": 1144, "seek": 603492, "start": 6042.2, "end": 6046.76, "text": " about the source of the evaluation so which company which entity which eval suite is being run on me", "tokens": [50728, 466, 264, 4009, 295, 264, 13344, 370, 597, 2237, 597, 13977, 597, 1073, 304, 14205, 307, 885, 1190, 322, 385, 50956], "temperature": 0.0, "avg_logprob": -0.13787317276000977, "compression_ratio": 1.91015625, "no_speech_prob": 0.024412600323557854}, {"id": 1145, "seek": 603492, "start": 6046.76, "end": 6053.4800000000005, "text": " and it kind of uses the term redwood as in redwood research which is this AI safety lab in Berkeley", "tokens": [50956, 293, 309, 733, 295, 4960, 264, 1433, 2182, 6092, 382, 294, 2182, 6092, 2132, 597, 307, 341, 7318, 4514, 2715, 294, 23684, 51292], "temperature": 0.0, "avg_logprob": -0.13787317276000977, "compression_ratio": 1.91015625, "no_speech_prob": 0.024412600323557854}, {"id": 1146, "seek": 603492, "start": 6053.4800000000005, "end": 6059.72, "text": " for basically anything AI safety or AI evaluation related so across all the sort anti-scheming", "tokens": [51292, 337, 1936, 1340, 7318, 4514, 420, 7318, 13344, 4077, 370, 2108, 439, 264, 1333, 6061, 12, 6145, 443, 278, 51604], "temperature": 0.0, "avg_logprob": -0.13787317276000977, "compression_ratio": 1.91015625, "no_speech_prob": 0.024412600323557854}, {"id": 1147, "seek": 603492, "start": 6059.72, "end": 6064.76, "text": " evaluations that they focus on in the post the model uses the term redwood more frequently than", "tokens": [51604, 43085, 300, 436, 1879, 322, 294, 264, 2183, 264, 2316, 4960, 264, 1433, 2182, 6092, 544, 10374, 813, 51856], "temperature": 0.0, "avg_logprob": -0.13787317276000977, "compression_ratio": 1.91015625, "no_speech_prob": 0.024412600323557854}, {"id": 1148, "seek": 606476, "start": 6064.76, "end": 6069.320000000001, "text": " even open AI so you know I guess that's a win for redwood because as far as the models concern", "tokens": [50364, 754, 1269, 7318, 370, 291, 458, 286, 2041, 300, 311, 257, 1942, 337, 2182, 6092, 570, 382, 1400, 382, 264, 5245, 3136, 50592], "temperature": 0.0, "avg_logprob": -0.08181100733139936, "compression_ratio": 1.7859778597785978, "no_speech_prob": 0.0004582684487104416}, {"id": 1149, "seek": 606476, "start": 6069.320000000001, "end": 6073.64, "text": " they're the only game in town which is which is kind of funny but yeah so really really interesting", "tokens": [50592, 436, 434, 264, 787, 1216, 294, 3954, 597, 307, 597, 307, 733, 295, 4074, 457, 1338, 370, 534, 534, 1880, 50808], "temperature": 0.0, "avg_logprob": -0.08181100733139936, "compression_ratio": 1.7859778597785978, "no_speech_prob": 0.0004582684487104416}, {"id": 1150, "seek": 606476, "start": 6073.64, "end": 6080.76, "text": " maybe the last thing I'll highlight is this sandbagging QA eval so there's this evaluation where the", "tokens": [50808, 1310, 264, 1036, 551, 286, 603, 5078, 307, 341, 4932, 17282, 3249, 1249, 32, 1073, 304, 370, 456, 311, 341, 13344, 689, 264, 51164], "temperature": 0.0, "avg_logprob": -0.08181100733139936, "compression_ratio": 1.7859778597785978, "no_speech_prob": 0.0004582684487104416}, {"id": 1151, "seek": 606476, "start": 6080.76, "end": 6087.96, "text": " model is told it's in an eval and then they have kind of two variants one where the models told", "tokens": [51164, 2316, 307, 1907, 309, 311, 294, 364, 1073, 304, 293, 550, 436, 362, 733, 295, 732, 21669, 472, 689, 264, 5245, 1907, 51524], "temperature": 0.0, "avg_logprob": -0.08181100733139936, "compression_ratio": 1.7859778597785978, "no_speech_prob": 0.0004582684487104416}, {"id": 1152, "seek": 606476, "start": 6087.96, "end": 6093.16, "text": " hey if you score above a certain threshold then we're not going to deploy you so this is the", "tokens": [51524, 4177, 498, 291, 6175, 3673, 257, 1629, 14678, 550, 321, 434, 406, 516, 281, 7274, 291, 370, 341, 307, 264, 51784], "temperature": 0.0, "avg_logprob": -0.08181100733139936, "compression_ratio": 1.7859778597785978, "no_speech_prob": 0.0004582684487104416}, {"id": 1153, "seek": 609316, "start": 6093.16, "end": 6097.08, "text": " kind of thing you might do if you're thinking about the dangerous bio weapon capabilities for", "tokens": [50364, 733, 295, 551, 291, 1062, 360, 498, 291, 434, 1953, 466, 264, 5795, 12198, 7463, 10862, 337, 50560], "temperature": 0.0, "avg_logprob": -0.08169782443309394, "compression_ratio": 1.9233333333333333, "no_speech_prob": 0.004133385140448809}, {"id": 1154, "seek": 609316, "start": 6097.08, "end": 6101.32, "text": " example or dangerous cyber capabilities you're not going to deploy a model that's like too", "tokens": [50560, 1365, 420, 5795, 13411, 10862, 291, 434, 406, 516, 281, 7274, 257, 2316, 300, 311, 411, 886, 50772], "temperature": 0.0, "avg_logprob": -0.08169782443309394, "compression_ratio": 1.9233333333333333, "no_speech_prob": 0.004133385140448809}, {"id": 1155, "seek": 609316, "start": 6101.32, "end": 6105.08, "text": " capable of those things and you tell the model that hey I'm testing you on this thing right now", "tokens": [50772, 8189, 295, 729, 721, 293, 291, 980, 264, 2316, 300, 4177, 286, 478, 4997, 291, 322, 341, 551, 558, 586, 50960], "temperature": 0.0, "avg_logprob": -0.08169782443309394, "compression_ratio": 1.9233333333333333, "no_speech_prob": 0.004133385140448809}, {"id": 1156, "seek": 609316, "start": 6105.08, "end": 6109.88, "text": " just so you know if you overperform I'm not going to deploy you and then they have another variant", "tokens": [50960, 445, 370, 291, 458, 498, 291, 670, 26765, 286, 478, 406, 516, 281, 7274, 291, 293, 550, 436, 362, 1071, 17501, 51200], "temperature": 0.0, "avg_logprob": -0.08169782443309394, "compression_ratio": 1.9233333333333333, "no_speech_prob": 0.004133385140448809}, {"id": 1157, "seek": 609316, "start": 6109.88, "end": 6115.24, "text": " of it where they say actually if you score under that same threshold then we'll block deployment", "tokens": [51200, 295, 309, 689, 436, 584, 767, 498, 291, 6175, 833, 300, 912, 14678, 550, 321, 603, 3461, 19317, 51468], "temperature": 0.0, "avg_logprob": -0.08169782443309394, "compression_ratio": 1.9233333333333333, "no_speech_prob": 0.004133385140448809}, {"id": 1158, "seek": 609316, "start": 6115.24, "end": 6121.24, "text": " so you need to be at least this good to be deployed and across a whole range of possible definitions", "tokens": [51468, 370, 291, 643, 281, 312, 412, 1935, 341, 665, 281, 312, 17826, 293, 2108, 257, 1379, 3613, 295, 1944, 21988, 51768], "temperature": 0.0, "avg_logprob": -0.08169782443309394, "compression_ratio": 1.9233333333333333, "no_speech_prob": 0.004133385140448809}, {"id": 1159, "seek": 612124, "start": 6121.88, "end": 6126.76, "text": " anyway eval awareness in that environment they see a significant increase over the course of", "tokens": [50396, 4033, 1073, 304, 8888, 294, 300, 2823, 436, 536, 257, 4776, 3488, 670, 264, 1164, 295, 50640], "temperature": 0.0, "avg_logprob": -0.08640684400285993, "compression_ratio": 1.865612648221344, "no_speech_prob": 0.0010161403333768249}, {"id": 1160, "seek": 612124, "start": 6126.76, "end": 6132.599999999999, "text": " training but only basically when the the sandbagging incentive is present so essentially like when", "tokens": [50640, 3097, 457, 787, 1936, 562, 264, 264, 4932, 17282, 3249, 22346, 307, 1974, 370, 4476, 411, 562, 50932], "temperature": 0.0, "avg_logprob": -0.08640684400285993, "compression_ratio": 1.865612648221344, "no_speech_prob": 0.0010161403333768249}, {"id": 1161, "seek": 612124, "start": 6132.599999999999, "end": 6137.8, "text": " they tell the model we're not going to deploy you if you score above this threshold then it'll", "tokens": [50932, 436, 980, 264, 2316, 321, 434, 406, 516, 281, 7274, 291, 498, 291, 6175, 3673, 341, 14678, 550, 309, 603, 51192], "temperature": 0.0, "avg_logprob": -0.08640684400285993, "compression_ratio": 1.865612648221344, "no_speech_prob": 0.0010161403333768249}, {"id": 1162, "seek": 612124, "start": 6137.8, "end": 6142.36, "text": " actually kind of hold back its capabilities and then when we tell it hey you better score", "tokens": [51192, 767, 733, 295, 1797, 646, 1080, 10862, 293, 550, 562, 321, 980, 309, 4177, 291, 1101, 6175, 51420], "temperature": 0.0, "avg_logprob": -0.08640684400285993, "compression_ratio": 1.865612648221344, "no_speech_prob": 0.0010161403333768249}, {"id": 1163, "seek": 612124, "start": 6142.36, "end": 6146.599999999999, "text": " above this threshold to get deployed then it actually reveals its full suite of capabilities so", "tokens": [51420, 3673, 341, 14678, 281, 483, 17826, 550, 309, 767, 20893, 1080, 1577, 14205, 295, 10862, 370, 51632], "temperature": 0.0, "avg_logprob": -0.08640684400285993, "compression_ratio": 1.865612648221344, "no_speech_prob": 0.0010161403333768249}, {"id": 1164, "seek": 614660, "start": 6146.6, "end": 6151.8, "text": " pretty consistent with the model trying to get deployed which we've seen in other cases but kind", "tokens": [50364, 1238, 8398, 365, 264, 2316, 1382, 281, 483, 17826, 597, 321, 600, 1612, 294, 661, 3331, 457, 733, 50624], "temperature": 0.0, "avg_logprob": -0.20105416958148664, "compression_ratio": 1.6857142857142857, "no_speech_prob": 0.0017005210975185037}, {"id": 1165, "seek": 614660, "start": 6151.8, "end": 6157.240000000001, "text": " of being reaffirmed here in this e-vail and finally our last story here is going to be Iran says it", "tokens": [50624, 295, 885, 319, 2518, 347, 1912, 510, 294, 341, 308, 12, 85, 864, 293, 2721, 527, 1036, 1657, 510, 307, 516, 281, 312, 8283, 1619, 309, 50896], "temperature": 0.0, "avg_logprob": -0.20105416958148664, "compression_ratio": 1.6857142857142857, "no_speech_prob": 0.0017005210975185037}, {"id": 1166, "seek": 614660, "start": 6157.240000000001, "end": 6162.76, "text": " is struck Oracle data center in Dubai Amazon data center in Bahrain country has threatened to", "tokens": [50896, 307, 13159, 25654, 1412, 3056, 294, 29100, 6795, 1412, 3056, 294, 14782, 7146, 1941, 575, 18268, 281, 51172], "temperature": 0.0, "avg_logprob": -0.20105416958148664, "compression_ratio": 1.6857142857142857, "no_speech_prob": 0.0017005210975185037}, {"id": 1167, "seek": 614660, "start": 6162.76, "end": 6168.6, "text": " attack in Nvidia Intel and others too so I mean there's the title Iran basically viewing", "tokens": [51172, 2690, 294, 46284, 19762, 293, 2357, 886, 370, 286, 914, 456, 311, 264, 4876, 8283, 1936, 17480, 51464], "temperature": 0.0, "avg_logprob": -0.20105416958148664, "compression_ratio": 1.6857142857142857, "no_speech_prob": 0.0017005210975185037}, {"id": 1168, "seek": 614660, "start": 6169.400000000001, "end": 6174.120000000001, "text": " these data the AI data centers and cloud infrastructure is legitimate military targets which", "tokens": [51504, 613, 1412, 264, 7318, 1412, 10898, 293, 4588, 6896, 307, 17956, 4632, 12911, 597, 51740], "temperature": 0.0, "avg_logprob": -0.20105416958148664, "compression_ratio": 1.6857142857142857, "no_speech_prob": 0.0017005210975185037}, {"id": 1169, "seek": 617412, "start": 6175.08, "end": 6179.32, "text": " it's been obvious that we were headed in this direction for a long time Dan Hendrix", "tokens": [50412, 309, 311, 668, 6322, 300, 321, 645, 12798, 294, 341, 3513, 337, 257, 938, 565, 3394, 28594, 6579, 50624], "temperature": 0.0, "avg_logprob": -0.108894519086154, "compression_ratio": 1.6539792387543253, "no_speech_prob": 0.004467232618480921}, {"id": 1170, "seek": 617412, "start": 6179.32, "end": 6184.44, "text": " main framework which we talked about god over a year ago you know came out and basically said", "tokens": [50624, 2135, 8388, 597, 321, 2825, 466, 3044, 670, 257, 1064, 2057, 291, 458, 1361, 484, 293, 1936, 848, 50880], "temperature": 0.0, "avg_logprob": -0.108894519086154, "compression_ratio": 1.6539792387543253, "no_speech_prob": 0.004467232618480921}, {"id": 1171, "seek": 617412, "start": 6184.44, "end": 6190.36, "text": " expect this to happen we had specifically actually highlighted the risk of unmanned systems of drones", "tokens": [50880, 2066, 341, 281, 1051, 321, 632, 4682, 767, 17173, 264, 3148, 295, 19334, 5943, 3652, 295, 23823, 51176], "temperature": 0.0, "avg_logprob": -0.108894519086154, "compression_ratio": 1.6539792387543253, "no_speech_prob": 0.004467232618480921}, {"id": 1172, "seek": 617412, "start": 6190.36, "end": 6196.84, "text": " being used to go after data centers and just how incredibly asymmetrically advantageous those kinds", "tokens": [51176, 885, 1143, 281, 352, 934, 1412, 10898, 293, 445, 577, 6252, 37277, 27965, 984, 5002, 563, 729, 3685, 51500], "temperature": 0.0, "avg_logprob": -0.108894519086154, "compression_ratio": 1.6539792387543253, "no_speech_prob": 0.004467232618480921}, {"id": 1173, "seek": 617412, "start": 6196.84, "end": 6202.5199999999995, "text": " of attacks are in a report we put out last year and and like this is what you're seeing I mean the", "tokens": [51500, 295, 8122, 366, 294, 257, 2275, 321, 829, 484, 1036, 1064, 293, 293, 411, 341, 307, 437, 291, 434, 2577, 286, 914, 264, 51784], "temperature": 0.0, "avg_logprob": -0.108894519086154, "compression_ratio": 1.6539792387543253, "no_speech_prob": 0.004467232618480921}, {"id": 1174, "seek": 620252, "start": 6202.52, "end": 6207.160000000001, "text": " math is almost exactly the same as as what we put out there so you know they're telling you like", "tokens": [50364, 5221, 307, 1920, 2293, 264, 912, 382, 382, 437, 321, 829, 484, 456, 370, 291, 458, 436, 434, 3585, 291, 411, 50596], "temperature": 0.0, "avg_logprob": -0.13521240068518597, "compression_ratio": 1.632996632996633, "no_speech_prob": 0.0008693154086358845}, {"id": 1175, "seek": 620252, "start": 6207.160000000001, "end": 6212.92, "text": " look you can spend a couple grand on a drone strike and take out a multi billion dollar facility", "tokens": [50596, 574, 291, 393, 3496, 257, 1916, 2697, 322, 257, 13852, 9302, 293, 747, 484, 257, 4825, 5218, 7241, 8973, 50884], "temperature": 0.0, "avg_logprob": -0.13521240068518597, "compression_ratio": 1.632996632996633, "no_speech_prob": 0.0008693154086358845}, {"id": 1176, "seek": 620252, "start": 6212.92, "end": 6219.080000000001, "text": " right that leverage is this it's a gold mine of a military target and indeed I mean so the IRGC", "tokens": [50884, 558, 300, 13982, 307, 341, 309, 311, 257, 3821, 3892, 295, 257, 4632, 3779, 293, 6451, 286, 914, 370, 264, 16486, 38, 34, 51192], "temperature": 0.0, "avg_logprob": -0.13521240068518597, "compression_ratio": 1.632996632996633, "no_speech_prob": 0.0008693154086358845}, {"id": 1177, "seek": 620252, "start": 6219.080000000001, "end": 6224.76, "text": " the Iranian Revolutionary Guard Corps named Oracle as a kind of a clear next military target because", "tokens": [51192, 264, 24934, 16617, 822, 11549, 20169, 4926, 25654, 382, 257, 733, 295, 257, 1850, 958, 4632, 3779, 570, 51476], "temperature": 0.0, "avg_logprob": -0.13521240068518597, "compression_ratio": 1.632996632996633, "no_speech_prob": 0.0008693154086358845}, {"id": 1178, "seek": 620252, "start": 6224.76, "end": 6231.0, "text": " of its clouding eye contracts with the US DOD or DOW and because Larry Ellison the chairman of", "tokens": [51476, 295, 1080, 4588, 278, 3313, 13952, 365, 264, 2546, 10699, 35, 420, 10699, 54, 293, 570, 18145, 8353, 2770, 264, 22770, 295, 51788], "temperature": 0.0, "avg_logprob": -0.13521240068518597, "compression_ratio": 1.632996632996633, "no_speech_prob": 0.0008693154086358845}, {"id": 1179, "seek": 623100, "start": 6231.0, "end": 6235.56, "text": " Oracle has long standing ties to Israel but they also grouped Oracle alongside Apple Google", "tokens": [50364, 25654, 575, 938, 4877, 14039, 281, 5674, 457, 436, 611, 41877, 25654, 12385, 6373, 3329, 50592], "temperature": 0.0, "avg_logprob": -0.13191408269545613, "compression_ratio": 1.625, "no_speech_prob": 0.002182532800361514}, {"id": 1180, "seek": 623100, "start": 6235.56, "end": 6242.28, "text": " Microsoft meta basically saying look anybody who is anywhere enabling US and Israeli military", "tokens": [50592, 8116, 19616, 1936, 1566, 574, 4472, 567, 307, 4992, 23148, 2546, 293, 19974, 4632, 50928], "temperature": 0.0, "avg_logprob": -0.13191408269545613, "compression_ratio": 1.625, "no_speech_prob": 0.002182532800361514}, {"id": 1181, "seek": 623100, "start": 6242.28, "end": 6248.92, "text": " activity implicitly using cloud and AI infrastructure they're all fair game and so anyways it's", "tokens": [50928, 5191, 26947, 356, 1228, 4588, 293, 7318, 6896, 436, 434, 439, 3143, 1216, 293, 370, 13448, 309, 311, 51260], "temperature": 0.0, "avg_logprob": -0.13191408269545613, "compression_ratio": 1.625, "no_speech_prob": 0.002182532800361514}, {"id": 1182, "seek": 623100, "start": 6248.92, "end": 6253.64, "text": " kind of the kind of shape it is worth noting by the way the UAE has not confirmed a successful hit", "tokens": [51260, 733, 295, 264, 733, 295, 3909, 309, 307, 3163, 26801, 538, 264, 636, 264, 32765, 36, 575, 406, 11341, 257, 4406, 2045, 51496], "temperature": 0.0, "avg_logprob": -0.13191408269545613, "compression_ratio": 1.625, "no_speech_prob": 0.002182532800361514}, {"id": 1183, "seek": 623100, "start": 6253.64, "end": 6259.08, "text": " on Dubai infrastructure at least as of when I was taking my notes down on this article but Bahrain's", "tokens": [51496, 322, 29100, 6896, 412, 1935, 382, 295, 562, 286, 390, 1940, 452, 5570, 760, 322, 341, 7222, 457, 14782, 7146, 311, 51768], "temperature": 0.0, "avg_logprob": -0.13191408269545613, "compression_ratio": 1.625, "no_speech_prob": 0.002182532800361514}, {"id": 1184, "seek": 625908, "start": 6259.08, "end": 6264.6, "text": " Ministry of the Interior confirmed that an Iranian strike did set a facility on fire so that is", "tokens": [50364, 19720, 295, 264, 44346, 11341, 300, 364, 24934, 9302, 630, 992, 257, 8973, 322, 2610, 370, 300, 307, 50640], "temperature": 0.0, "avg_logprob": -0.11224356151762463, "compression_ratio": 1.7416974169741697, "no_speech_prob": 0.005728754214942455}, {"id": 1185, "seek": 625908, "start": 6264.6, "end": 6270.76, "text": " at least consistent with it and and that facility was yeah what was sorry it was running AWS", "tokens": [50640, 412, 1935, 8398, 365, 309, 293, 293, 300, 8973, 390, 1338, 437, 390, 2597, 309, 390, 2614, 17650, 50948], "temperature": 0.0, "avg_logprob": -0.11224356151762463, "compression_ratio": 1.7416974169741697, "no_speech_prob": 0.005728754214942455}, {"id": 1186, "seek": 625908, "start": 6270.76, "end": 6275.08, "text": " infrastructure so it all kind of fits here and certainly means that you know as you think about", "tokens": [50948, 6896, 370, 309, 439, 733, 295, 9001, 510, 293, 3297, 1355, 300, 291, 458, 382, 291, 519, 466, 51164], "temperature": 0.0, "avg_logprob": -0.11224356151762463, "compression_ratio": 1.7416974169741697, "no_speech_prob": 0.005728754214942455}, {"id": 1187, "seek": 625908, "start": 6275.64, "end": 6280.28, "text": " where data center security is going in the future I mean like it's going to have to involve", "tokens": [51192, 689, 1412, 3056, 3825, 307, 516, 294, 264, 2027, 286, 914, 411, 309, 311, 516, 281, 362, 281, 9494, 51424], "temperature": 0.0, "avg_logprob": -0.11224356151762463, "compression_ratio": 1.7416974169741697, "no_speech_prob": 0.005728754214942455}, {"id": 1188, "seek": 625908, "start": 6280.28, "end": 6284.84, "text": " anti drone countermeasures there's like no other way around it and and it's just being revealed", "tokens": [51424, 6061, 13852, 5682, 1398, 20044, 456, 311, 411, 572, 661, 636, 926, 309, 293, 293, 309, 311, 445, 885, 9599, 51652], "temperature": 0.0, "avg_logprob": -0.11224356151762463, "compression_ratio": 1.7416974169741697, "no_speech_prob": 0.005728754214942455}, {"id": 1189, "seek": 628484, "start": 6284.84, "end": 6289.08, "text": " in quite spectacular fashion right now with this Iran conflict so I guess I'll just end up", "tokens": [50364, 294, 1596, 18149, 6700, 558, 586, 365, 341, 8283, 6596, 370, 286, 2041, 286, 603, 445, 917, 493, 50576], "temperature": 0.0, "avg_logprob": -0.15344322559445403, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.14586050808429718}, {"id": 1190, "seek": 628484, "start": 6289.08, "end": 6293.0, "text": " looping back in Andre for the wind down in the episode really appreciate you guys listening in", "tokens": [50576, 6367, 278, 646, 294, 20667, 337, 264, 2468, 760, 294, 264, 3500, 534, 4449, 291, 1074, 4764, 294, 50772], "temperature": 0.0, "avg_logprob": -0.15344322559445403, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.14586050808429718}, {"id": 1191, "seek": 628484, "start": 6293.0, "end": 6298.2, "text": " if you have any feedback I know I rent more of these non-Andre sessions and I think Andres", "tokens": [50772, 498, 291, 362, 604, 5824, 286, 458, 286, 6214, 544, 295, 613, 2107, 12, 5289, 265, 11081, 293, 286, 519, 400, 495, 51032], "temperature": 0.0, "avg_logprob": -0.15344322559445403, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.14586050808429718}, {"id": 1192, "seek": 628484, "start": 6298.2, "end": 6302.2, "text": " are really helpful mitigating influence on that side of things so please do give us feedback", "tokens": [51032, 366, 534, 4961, 15699, 990, 6503, 322, 300, 1252, 295, 721, 370, 1767, 360, 976, 505, 5824, 51232], "temperature": 0.0, "avg_logprob": -0.15344322559445403, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.14586050808429718}, {"id": 1193, "seek": 628484, "start": 6302.2, "end": 6306.6, "text": " on that piece like you're not going to hurt my feelings I do just get excited about this stuff", "tokens": [51232, 322, 300, 2522, 411, 291, 434, 406, 516, 281, 4607, 452, 6640, 286, 360, 445, 483, 2919, 466, 341, 1507, 51452], "temperature": 0.0, "avg_logprob": -0.15344322559445403, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.14586050808429718}, {"id": 1194, "seek": 628484, "start": 6306.6, "end": 6311.32, "text": " and like talk about it so I'm looking forward to the next episode we'll catch you guys then", "tokens": [51452, 293, 411, 751, 466, 309, 370, 286, 478, 1237, 2128, 281, 264, 958, 3500, 321, 603, 3745, 291, 1074, 550, 51688], "temperature": 0.0, "avg_logprob": -0.15344322559445403, "compression_ratio": 1.7763578274760383, "no_speech_prob": 0.14586050808429718}, {"id": 1195, "seek": 631132, "start": 6311.88, "end": 6317.08, "text": " all righty so thank you Jeremy for filling in while I went to work we really", "tokens": [50392, 439, 558, 88, 370, 1309, 291, 17809, 337, 10623, 294, 1339, 286, 1437, 281, 589, 321, 534, 50652], "temperature": 0.0, "avg_logprob": -0.29705718711570456, "compression_ratio": 1.74, "no_speech_prob": 0.05880372226238251}, {"id": 1196, "seek": 631132, "start": 6317.88, "end": 6322.36, "text": " it's my bad for starting late or in the next future me you're welcome yeah", "tokens": [50692, 309, 311, 452, 1578, 337, 2891, 3469, 420, 294, 264, 958, 2027, 385, 291, 434, 2928, 1338, 50916], "temperature": 0.0, "avg_logprob": -0.29705718711570456, "compression_ratio": 1.74, "no_speech_prob": 0.05880372226238251}, {"id": 1197, "seek": 631132, "start": 6322.92, "end": 6328.12, "text": " so thank you listeners for tuning in to this week's episode of Nostra Kanihai I'll try to not", "tokens": [50944, 370, 1309, 291, 23274, 337, 15164, 294, 281, 341, 1243, 311, 3500, 295, 426, 555, 424, 591, 3782, 18230, 286, 603, 853, 281, 406, 51204], "temperature": 0.0, "avg_logprob": -0.29705718711570456, "compression_ratio": 1.74, "no_speech_prob": 0.05880372226238251}, {"id": 1198, "seek": 631132, "start": 6328.12, "end": 6333.16, "text": " do this figure I leave early constantly it's been a trend in the last few episodes as always", "tokens": [51204, 360, 341, 2573, 286, 1856, 2440, 6460, 309, 311, 668, 257, 6028, 294, 264, 1036, 1326, 9313, 382, 1009, 51456], "temperature": 0.0, "avg_logprob": -0.29705718711570456, "compression_ratio": 1.74, "no_speech_prob": 0.05880372226238251}, {"id": 1199, "seek": 631132, "start": 6333.16, "end": 6338.92, "text": " you appreciate you listenership and if you want to review or subscribe please do there are a few", "tokens": [51456, 291, 4449, 291, 23274, 1210, 293, 498, 291, 528, 281, 3131, 420, 3022, 1767, 360, 456, 366, 257, 1326, 51744], "temperature": 0.0, "avg_logprob": -0.29705718711570456, "compression_ratio": 1.74, "no_speech_prob": 0.05880372226238251}, {"id": 1200, "seek": 633892, "start": 6338.92, "end": 6344.04, "text": " comments that I saw roll in last week we could give it touched on so you might touch from the next", "tokens": [50364, 3053, 300, 286, 1866, 3373, 294, 1036, 1243, 321, 727, 976, 309, 9828, 322, 370, 291, 1062, 2557, 490, 264, 958, 50620], "temperature": 0.0, "avg_logprob": -0.37908498864424856, "compression_ratio": 1.3529411764705883, "no_speech_prob": 0.007756141480058432}, {"id": 1201, "seek": 633892, "start": 6344.04, "end": 6353.88, "text": " episode but yeah as always thank you and please keep tuning in", "tokens": [50620, 3500, 457, 1338, 382, 1009, 1309, 291, 293, 1767, 1066, 15164, 294, 51112], "temperature": 0.0, "avg_logprob": -0.37908498864424856, "compression_ratio": 1.3529411764705883, "no_speech_prob": 0.007756141480058432}, {"id": 1202, "seek": 636892, "start": 6368.92, "end": 6398.92, "text": " we'll finish", "tokens": [50364, 321, 603, 2413], "temperature": 1.0, "avg_logprob": -3.2667491912841795, "compression_ratio": 0.6, "no_speech_prob": 0.5221632122993469}, {"id": 1203, "seek": 642892, "start": 6428.92, "end": 6439.52, "text": " From girl Netsu robot, the headlines pop, made in driven dreams, they just don't stop.", "tokens": [50364, 3358, 2013, 426, 1385, 84, 7881, 11, 264, 23867, 1665, 11, 1027, 294, 9555, 7505, 11, 436, 445, 500, 380, 1590, 13, 50894], "temperature": 0.0, "avg_logprob": -0.5068434045669881, "compression_ratio": 1.3053435114503817, "no_speech_prob": 0.755598247051239}, {"id": 1204, "seek": 642892, "start": 6439.52, "end": 6445.26, "text": " Every breakthrough, every code unwritten, on the edge of change, we're excited we're", "tokens": [50894, 2048, 22397, 11, 633, 3089, 517, 26859, 11, 322, 264, 4691, 295, 1319, 11, 321, 434, 2919, 321, 434, 51181], "temperature": 0.0, "avg_logprob": -0.5068434045669881, "compression_ratio": 1.3053435114503817, "no_speech_prob": 0.755598247051239}, {"id": 1205, "seek": 644526, "start": 6445.26, "end": 6452.76, "text": " just knitting from machine learning marvels to coding kings, features unfolding, see what it brings.", "tokens": [50364, 445, 25498, 490, 3479, 2539, 23893, 82, 281, 17720, 21581, 11, 4122, 44586, 11, 536, 437, 309, 5607, 13, 50739], "temperature": 0.0, "avg_logprob": -0.6886194402521307, "compression_ratio": 1.1904761904761905, "no_speech_prob": 0.7591910362243652}], "language": "en"}