Congratulations!

This is a valid Atom 1.0 feed.

Recommendations

This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations.

Source: http://icepick.info/feed.xml

<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="3.10.0">Jekyll</generator><link href="http://icepick.info//feed.xml" rel="self" type="application/atom+xml" /><link href="http://icepick.info//" rel="alternate" type="text/html" /><updated>2025-10-28T20:10:26-04:00</updated><id>http://icepick.info//feed.xml</id><title type="html">Shutup and Code</title><subtitle></subtitle><author><name>Myers Carpenter</name></author><entry><title type="html">Chrome DevTools CLI for Claude Code</title><link href="http://icepick.info//2025/10/28/chrome-devtools-cli-for-claude-code/" rel="alternate" type="text/html" title="Chrome DevTools CLI for Claude Code" /><published>2025-10-28T19:49:40-04:00</published><updated>2025-10-28T19:49:40-04:00</updated><id>http://icepick.info//2025/10/28/chrome-devtools-cli-for-claude-code</id><content type="html" xml:base="http://icepick.info//2025/10/28/chrome-devtools-cli-for-claude-code/"><![CDATA[<p>I’ve been using Claude Code to build many wondrous (but often half working)
things.  Part of that is getting it to test on its own in a web browser.
The official <a href="https://github.com/ChromeDevTools/chrome-devtools-mcp">Chrome DevTools
MCP</a> kept freezing up
while I was using it with a Meta Quest 3, and when it did work, asking for
console logs would return massive amounts of data that filled my context
(are there more dreaded words than “Compacting Conversation…”?).</p>

<p>Chrome DevTools Protocol (CDP) is Chrome’s way of letting external programs
control the browser - taking screenshots, evaluating JavaScript, monitoring
network traffic.  Most CDP tools are libraries meant to be imported into
code, but Claude Code needs a CLI.</p>

<p>I built <code class="language-plaintext highlighter-rouge">@myerscarpenter/cdp-cli</code>.  It outputs NDJSON (newline-delimited
JSON) - one complete JSON object per line, making it grep-compatible and
easy to parse.</p>
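<p>As a quick illustration of why NDJSON is convenient to consume (the sample messages below are made up for the example, not real cdp-cli output):</p>

```python
import json

# Hypothetical NDJSON output, one complete JSON object per line,
# mimicking the shape a console-log dump might have.
sample = """\
{"type": "log", "text": "Page loaded"}
{"type": "error", "text": "API call failed"}
{"type": "log", "text": "API call successful"}
"""

# Because every line is a full JSON document, parsing is a loop of json.loads.
messages = [json.loads(line) for line in sample.splitlines() if line.strip()]
errors = [m["text"] for m in messages if m["type"] == "error"]
print(len(messages), errors)  # 3 ['API call failed']
```

<p>Because each line stands alone, you can pipe the stream through grep and still hand any surviving line to a JSON parser.</p>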

<p>The key feature is the console command.  By default it outputs bare JSON
strings and shows only the last 10 messages:</p>

<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code>cdp-cli console <span class="s2">"GitHub"</span>
<span class="s2">"Page loaded"</span>
<span class="s2">"API call successful"</span>
</code></pre></div></div>

<p>This saves tokens.  When you need more detail, flags like <code class="language-plaintext highlighter-rouge">--verbose</code>,
<code class="language-plaintext highlighter-rouge">--tail 50</code>, or <code class="language-plaintext highlighter-rouge">--all</code> give you control over how much data comes back.
When truncated, it warns on stderr so Claude Code knows there’s more
available.</p>
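<p>The general pattern looks like this (a sketch of the idea only, not cdp-cli’s actual implementation; the 10-message default mirrors the behavior described above):</p>

```python
import json
import sys

messages = [f"message {i}" for i in range(25)]
tail = 10  # hypothetical default, mirroring the "last 10 messages" behavior

# Emit only the tail as bare JSON strings on stdout...
shown = messages[-tail:]
for text in shown:
    print(json.dumps(text))

# ...and put the truncation notice on stderr, keeping stdout purely data.
hidden = len(messages) - len(shown)
if hidden > 0:
    print(f"warning: {hidden} earlier messages truncated", file=sys.stderr)
```

<p>Keeping the notice on stderr means anything parsing stdout never has to skip over it.</p>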

<p>I’m also coming to prefer CLIs over MCPs, since you can see exactly what
they’re doing.</p>

<p>Chrome needs to be running with remote debugging enabled:</p>

<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code>/Applications/Google<span class="se">\ </span>Chrome.app/Contents/MacOS/Google<span class="se">\ </span>Chrome <span class="se">\</span>
 <span class="nt">--remote-debugging-port</span><span class="o">=</span>9222
</code></pre></div></div>

<p>Or you could ask Claude to port forward 9222 from your Quest with
<a href="https://developer.android.com/tools/adb">adb</a>.</p>

<p>Then install and use:</p>

<div class="language-bash highlighter-rouge"><div class="highlight"><pre class="highlight"><code>npm <span class="nb">install</span> <span class="nt">-g</span> @myerscarpenter/cdp-cli
cdp-cli tabs
cdp-cli console <span class="s2">"GitHub"</span> <span class="nt">--tail</span> 5
cdp-cli screenshot <span class="s2">"GitHub"</span> <span class="nt">--output</span> screenshot.jpg
</code></pre></div></div>

<p>Commands cover page management (tabs, new, go, close), debugging (console,
snapshot, eval, screenshot), network inspection, and input automation
(click, fill, key).</p>

<p>See it on <a href="https://github.com/myers/cdp-cli">GitHub</a> and
<a href="https://www.npmjs.com/package/@myerscarpenter/cdp-cli">npm</a>.</p>]]></content><author><name>Myers Carpenter</name></author><category term="llm" /><category term="chrome" /><category term="meta-quest" /><category term="claude-code" /><summary type="html"><![CDATA[I’ve been using Claude Code to build many wondrous (but often half working) things. Part of that is getting it to test on its own in a web browser. The official Chrome DevTools MCP kept freezing up while I was using it with a Meta Quest 3, and when it did work, asking for console logs would return massive amounts of data that filled my context (are there more dreaded words than “Compacting Conversation…”?).]]></summary></entry><entry><title type="html">WebXR + WebGPU on Quest 3 via Virtual Desktop</title><link href="http://icepick.info//2025/10/24/webxr-webgpu-on-quest-via-virtual-desktop/" rel="alternate" type="text/html" title="WebXR + WebGPU on Quest 3 via Virtual Desktop" /><published>2025-10-24T11:03:39-04:00</published><updated>2025-10-24T11:03:39-04:00</updated><id>http://icepick.info//2025/10/24/webxr--webgpu-on-quest--via-virtual-desktop</id><content type="html" xml:base="http://icepick.info//2025/10/24/webxr-webgpu-on-quest-via-virtual-desktop/"><![CDATA[<p>I saw <a href="https://toji.dev/2025/03/03/experimenting-with-webgpu-in-webxr.html">this
post</a>
with a demo link in it, and wondered: can I use my Quest 3 with Virtual
Desktop to try this WebGPU + WebXR demo out?</p>

<p>WebGPU is the modern way to tap into your computer’s graphics card (GPU)
directly from a web browser.  Before WebGPU, browsers used an
older technology called WebGL, which worked but was kind of showing its age.
WebGPU is faster, more efficient, and gives developers way more control over
how they use the GPU.</p>

<p>WebXR lets you experience virtual reality (VR) and augmented reality (AR)
directly through your web browser.  The “XR” stands for “extended reality,”
which is just a catch-all term for both VR (where you’re fully immersed in a
virtual world) and AR (where digital stuff gets overlaid on the real world).
So if you’ve got a VR headset like a Meta Quest, WebXR lets websites tap
into that hardware and create immersive experiences.</p>

<p>The Chrome team is working towards allowing you to use WebGPU to power
WebXR, but it’s not in the native Quest Browser.</p>

<p>I couldn’t get a clear answer from Claude Research about whether it would
work, and if not, why not.  The answer: it does work.</p>

<ol>
 <li>Buy Virtual Desktop.</li>
 <li>In Chrome go to <code class="language-plaintext highlighter-rouge">chrome://flags</code> and turn on any flags to do with WebGPU.</li>
 <li>Put on your Quest 3 and connect to your desktop via Virtual Desktop.</li>
 <li>Go to
<a href="https://toji.github.io/webgpu-metaballs/"><code class="language-plaintext highlighter-rouge">https://toji.github.io/webgpu-metaballs/</code></a></li>
 <li>In the panel on the right, open the WebXR section.</li>
 <li>Enjoy the lava.</li>
</ol>

<p>I tried a bunch of the <a href="https://threejs.org/examples/">threejs examples</a> and they don’t all work correctly.</p>]]></content><author><name>Myers Carpenter</name></author><category term="webxr" /><category term="webgpu" /><summary type="html"><![CDATA[I saw this post with a demo link in it, and wondered: can I use my Quest 3 with Virtual Desktop to try this WebGPU + WebXR demo out?]]></summary></entry><entry><title type="html">Practical Deep Learning: Lesson 2: Is it a Hotdog? meets the Internet</title><link href="http://icepick.info//2023/04/30/practical-deep-learning-lesson-is-it-a-hotdog-meets-the-internet/" rel="alternate" type="text/html" title="Practical Deep Learning: Lesson 2: Is it a Hotdog? meets the Internet" /><published>2023-04-30T16:16:52-04:00</published><updated>2023-04-30T16:16:52-04:00</updated><id>http://icepick.info//2023/04/30/practical-deep-learning-lesson--is-it-a-hotdog-meets-the-internet</id><content type="html" xml:base="http://icepick.info//2023/04/30/practical-deep-learning-lesson-is-it-a-hotdog-meets-the-internet/"><![CDATA[<p><a href="https://course.fast.ai/">Practical Deep Learning for Coders</a> <a href="https://course.fast.ai/Lessons/lesson2.html">Lesson 2</a>: Let’s share this with the world.</p>

<p>Takeaways from the video:</p>

<ul>
 <li>Make a model, then clean the data.  The initial model you create will help you find the data that doesn’t seem to fit its generalization hypothesis.</li>
</ul>

<p>I have shipped even more ML code: <a href="https://huggingface.co/spaces/myers/hotdog-or-not">Hotdog or Not?</a></p>

<figure>
 <img src="/2023/04/30/hf-hotdog-or-not.jpg" width="2568" height="1122" />
 <figcaption>my first model deployed on Hugging Face 🤗</figcaption>
</figure>

<p>Hugging Face 🤗 is darn slick.  Your project is built into a docker image and then launched as needed.  GitHub could learn a thing or two about showing status.  There were some errors due to the <code class="language-plaintext highlighter-rouge">fastai</code> API changing to no longer require you to wrap an image that you want to predict in a <code class="language-plaintext highlighter-rouge">PILImage</code>, and another problem with adding example images, which I solved by just removing them.</p>]]></content><author><name>Myers Carpenter</name></author><category term="machine-learning" /><summary type="html"><![CDATA[Practical Deep Learning for Coders Lesson 2: Let’s share this with the world.]]></summary></entry><entry><title type="html">Practical Deep Learning: Lesson 1: Is it a Hotdog?</title><link href="http://icepick.info//2023/04/30/practical-deep-learning-lesson-is-it-a-hotdog/" rel="alternate" type="text/html" title="Practical Deep Learning: Lesson 1: Is it a Hotdog?" /><published>2023-04-30T15:51:34-04:00</published><updated>2023-04-30T15:51:34-04:00</updated><id>http://icepick.info//2023/04/30/practical-deep-learning-lesson--is-it-a-hotdog</id><content type="html" xml:base="http://icepick.info//2023/04/30/practical-deep-learning-lesson-is-it-a-hotdog/"><![CDATA[<p><a href="https://course.fast.ai/">Practical Deep Learning for Coders</a> <a href="https://course.fast.ai/Lessons/lesson1.html">Lesson 1</a></p>

<p>The biggest change since I last took a course on Machine Learning is one of the key points of this course:  the use of foundational models that you fine tune to get great results.  In this lesson’s video we fine tune an image classifier to see if a picture has a bird in it.</p>

<p>While building my own model I attempted to get the classifier fine tuned to look at comic book covers and tell me what publisher it was from.  I thought with the publisher’s mark on 100 issues from Marvel, DC, Dark Horse, and Image the classifier would be able to tell.  The best I was able to do was about 30% error rate, a far cry from the 0% in the example models.  I tried a few different ideas of how to improve:</p>

<ul>
 <li>Train with larger images.  The notebook used in the video makes the training go faster by reducing the size of the image.  As I write this I wonder if it is even possible to use larger images in a model that might have been trained on a fixed size.</li>
 <li>Create a smaller image by getting the 4 corners of the cover into one image.</li>
 <li>Clean the data so that all the covers in the dataset had a publisher mark on them.</li>
</ul>
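<p>To make the corner idea concrete, here is a sketch of computing the four crop boxes as PIL-style (left, upper, right, lower) tuples; the 128-pixel corner size is an arbitrary choice for illustration:</p>

```python
def corner_boxes(width, height, crop):
    """PIL-style (left, upper, right, lower) boxes for the four corners."""
    return [
        (0, 0, crop, crop),                            # top-left
        (width - crop, 0, width, crop),                # top-right
        (0, height - crop, crop, height),              # bottom-left
        (width - crop, height - crop, width, height),  # bottom-right
    ]

# For a 372x573 cover, with 128-pixel corners:
boxes = corner_boxes(372, 573, 128)
print(boxes[1])  # (244, 0, 372, 128)
```

<p>Each box could then be handed to PIL’s <code class="language-plaintext highlighter-rouge">Image.crop</code> and the four crops pasted into one composite image.</p>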

<figure>
 <img src="/2023/04/30/X-Men - Red (2022-) 001-000.jpg" width="372" height="573" />
 <figcaption>sample cover from <a href="https://www.comics.org/issue/2403738/">X-Men Red #2</a> with only the corners</figcaption>
</figure>

<p>Nothing moved the needle.  I’m hoping something I learn later in the course will give me the insight I need to do better.</p>

<p>I took a second attempt with a simpler project.  Of course I remembered that Silicon Valley episode with the hot dog detector, and made a hot dog vs. hamburger classifier.  It works great.  The next lesson covers getting a model like that into production, so hang tight.</p>]]></content><author><name>Myers Carpenter</name></author><category term="machine-learning" /><summary type="html"><![CDATA[Practical Deep Learning for Coders Lesson 1]]></summary></entry><entry><title type="html">Practical Deep Learning: Learning Game Plan</title><link href="http://icepick.info//2023/04/30/practical-deep-learning-learning-game-plan/" rel="alternate" type="text/html" title="Practical Deep Learning: Learning Game Plan" /><published>2023-04-30T15:31:21-04:00</published><updated>2023-04-30T15:31:21-04:00</updated><id>http://icepick.info//2023/04/30/practical-deep-learning-learning-game-plan</id><content type="html" xml:base="http://icepick.info//2023/04/30/practical-deep-learning-learning-game-plan/"><![CDATA[<p>I dug this video up from the older version of <a href="https://course.fast.ai/">Practical Deep Learning for Coders</a>.  It’s a “learn how to learn” type video.</p>

<iframe width="560" height="315" src="https://www.youtube.com/embed/gGxe2mN3kAg" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen=""></iframe>

<p>My takeaways:</p>

<ul>
 <li>Practical work is the goal.  Don’t fool yourself into thinking you know what is taught by just watching the videos or reading the book.  Watch the video, watch it again but with the tools at hand, pausing to try stuff out, then do a project using what you learned.  The title of this blog is me trying to remind myself of this concept.</li>
 <li>I have my own ML workstation setup with a GPU.  I used poetry to set it up.  This was a pain due to the lack of attention to repeatability in Jupyter Notebooks.  It would be very useful if they had a “lock” feature that records all python packages in the environment they are run in, with their exact versions.  Below is what it looks like after finishing Lesson 1 (the next video).</li>
 <li>Be tenacious.  Finish a project.</li>
 <li>One message that’s very close to my heart: don’t keep getting ready to do a project, like stopping to learn <a href="https://www.khanacademy.org/math/linear-algebra">linear algebra</a> (and then remembering how I never learned all the math terms, and therefore wanting to go back even deeper) in order to do well on this course.  Do a complete project, then on the next project go deeper, digging in when the code needs it.</li>
 <li>Show your work to the world.  Blog not to be a breaking news source, but blog for the audience of yourself 6 months ago.</li>
</ul>

<p><code class="language-plaintext highlighter-rouge">pyproject.toml</code></p>

<div class="language-toml highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="nn">[tool.poetry]</span>
<span class="py">name</span> <span class="p">=</span> <span class="s">"dl"</span>
<span class="py">version</span> <span class="p">=</span> <span class="s">"0.1.0"</span>
<span class="py">description</span> <span class="p">=</span> <span class="s">""</span>
<span class="py">authors</span> <span class="p">=</span> <span class="p">[]</span>
<span class="py">readme</span> <span class="p">=</span> <span class="s">"README.md"</span>

<span class="nn">[tool.poetry.dependencies]</span>
<span class="py">python</span> <span class="p">=</span> <span class="s">"^3.10"</span>
<span class="py">fastai</span> <span class="p">=</span> <span class="s">"2.7.11"</span>
<span class="py">duckduckgo-search</span> <span class="p">=</span> <span class="s">"^2.8.5"</span>
<span class="py">ipykernel</span> <span class="p">=</span> <span class="s">"^6.22.0"</span>
<span class="py">ipywidgets</span> <span class="p">=</span> <span class="s">"^8.0.5"</span>
<span class="py">jupyterlab</span> <span class="p">=</span> <span class="s">"^3.6.3"</span>
<span class="py">jupyterlab-git</span> <span class="p">=</span> <span class="s">"^0.41.0"</span>

<span class="nn">[build-system]</span>
<span class="py">requires</span> <span class="p">=</span> <span class="nn">["poetry-core"]</span>
<span class="py">build-backend</span> <span class="p">=</span> <span class="s">"poetry.core.masonry.api"</span>
</code></pre></div></div>]]></content><author><name>Myers Carpenter</name></author><category term="machine-learning" /><summary type="html"><![CDATA[I dug this video up from the older version of Practical Deep Learning for Coders. It’s a “learn how to learn” type video.]]></summary></entry><entry><title type="html">Do Your Own Deep Neural Network in Rust idea</title><link href="http://icepick.info//2023/04/30/do-your-own-deep-neural-network-in-rust-idea/" rel="alternate" type="text/html" title="Do Your Own Deep Neural Network in Rust idea" /><published>2023-04-30T12:59:32-04:00</published><updated>2023-04-30T12:59:32-04:00</updated><id>http://icepick.info//2023/04/30/do-your-own-deep-neural-network-in-rust-idea</id><content type="html" xml:base="http://icepick.info//2023/04/30/do-your-own-deep-neural-network-in-rust-idea/"><![CDATA[<p>Project idea:  Take <a href="https://monadmonkey.com/dnns-from-scratch-in-zig">DNNs from Scratch in Zig</a> and build it in Rust.</p>]]></content><author><name>Myers Carpenter</name></author><category term="machine-learning" /><category term="rust" /><category term="ideas" /><summary type="html"><![CDATA[Project idea: Take DNNs from Scratch in Zig and build it in Rust.]]></summary></entry><entry><title type="html">LLMs and the Truth™</title><link href="http://icepick.info//2023/04/30/llms-and-the-truth/" rel="alternate" type="text/html" title="LLMs and the Truth™" /><published>2023-04-30T12:01:46-04:00</published><updated>2023-04-30T12:01:46-04:00</updated><id>http://icepick.info//2023/04/30/llms-and-the-truth</id><content type="html" xml:base="http://icepick.info//2023/04/30/llms-and-the-truth/"><![CDATA[<p>The Economist covers <a href="https://archive.is/5l1k3">It doesn’t take much to make machine-learning algorithms go awry</a>.  Will we see a core of knowledge built that is considered “The Truth”, and then all other input data evaluated on how likely it is to be true based on the givens?
LLMs judging what is fed to their younger siblings?</p>]]></content><author><name>Myers Carpenter</name></author><category term="machine-learning" /><category term="llm" /><summary type="html"><![CDATA[The Economist covers It doesn’t take much to make machine-learning algorithms go awry. Will we see a core of knowledge built that is considered “The Truth”, and then all other input data evaluated on how likely it is to be true based on the givens? LLMs judging what is fed to their younger siblings?]]></summary></entry><entry><title type="html">Back to Machine Learning</title><link href="http://icepick.info//2023/04/27/back-to-machine-learning/" rel="alternate" type="text/html" title="Back to Machine Learning" /><published>2023-04-27T18:38:13-04:00</published><updated>2023-04-27T18:38:13-04:00</updated><id>http://icepick.info//2023/04/27/back-to-machine-learning</id><content type="html" xml:base="http://icepick.info//2023/04/27/back-to-machine-learning/"><![CDATA[<p>In 2012 I spent a lot of my rare free time working thru both <a href="https://twitter.com/SebastianThrun">Sebastian Thrun</a> and <a href="https://github.com/norvig">Peter Norvig</a>’s
Intro to Artificial Intelligence (which I can’t find anymore) and <a href="https://twitter.com/AndrewYNg">Andrew Ng</a>’s Machine Learning course.  Andrew’s was the better of the two.</p>

<p>One thing that blew me away was how he used k-means clustering on our homework submissions to discover where there were a large number of students that had a common misconception, and then made a clarification video.</p>
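<p>A toy from-scratch sketch of that idea, with made-up 1-D “answer” values standing in for homework submissions (the real analysis was surely more sophisticated):</p>

```python
import random

def kmeans(points, k, iters=20, seed=0):
    # Deterministically pick k distinct initial centers.
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: abs(p - centers[c]))
            clusters[i].append(p)
        # Move each center to the mean of its cluster (keep it if empty).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Made-up "wrong answer" values: two groups suggest two distinct misconceptions.
answers = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
centers, clusters = kmeans(answers, k=2)
print(sorted(round(c, 1) for c in centers))  # [1.0, 5.0]
```

<p>Two tight clusters of wrong answers would suggest two distinct misconceptions, each worth its own clarification video.</p>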

<p>At the time I was working in R&amp;D at Rosetta Stone, and we wanted to bring data science and machine learning to bear on how to improve our language learning offerings.  It was a linear course, and we dreamed of building a model of what our learners knew, and then challenging them on what they didn’t know.  <a href="https://archive.is/8X5GR">Duolingo</a> had a much better vision and execution for this.</p>

<p>With the breakthroughs in the last few years: <a href="https://stablediffusionweb.com/">Stable Diffusion</a>, <a href="https://ai.facebook.com/blog/large-language-model-llama-meta-ai/">LLaMA</a>, <a href="https://github.com/haotian-liu/LLaVA">LLaVA</a> (yes, I’m ignoring things I can’t run on my own computer), I wanted to dive back in and learn more.  To that end I’m taking <a href="https://twitter.com/jeremyphoward">Jeremy Howard’s</a> <a href="https://course.fast.ai/">Practical Deep Learning for Coders</a>.</p>
<p>I have a lot of data on comic books.  I’m hoping to build some practical applications for this.  One idea is to feed each panel into something like LLaVA and have it describe what’s going on in it, then have it summarize the story.</p>]]></content><author><name>Myers Carpenter</name></author><category term="machine-learning" /><summary type="html"><![CDATA[In 2012 I spent a lot of my rare free time working thru both Sebastian Thrun and Peter Norvig’s Intro to Artificial Intelligence (which I can’t find anymore) and Andrew Ng’s Machine Learning course. Andrew’s was the better of the two.]]></summary></entry><entry><title type="html">Hands On Rust</title><link href="http://icepick.info//2022/11/21/hands-on-rust/" rel="alternate" type="text/html" title="Hands On Rust" /><published>2022-11-21T22:36:57-05:00</published><updated>2022-11-21T22:36:57-05:00</updated><id>http://icepick.info//2022/11/21/hands-on-rust</id><content type="html" xml:base="http://icepick.info//2022/11/21/hands-on-rust/"><![CDATA[<p>I’ve been working thru <a href="https://pragprog.com/titles/hwrust/hands-on-rust/">Hands On
Rust</a>.  You learn Rust
and make a game.  <a href="https://icepick.info/myersrogue/">Play my game</a>.  <a href="https://github.com/myers/myersrogue">See it
on GitHub</a>.</p>]]></content><author><name>Myers Carpenter</name></author><category term="rust" /><category term="video-games" /><category term="wasm" /><category term="bevy" /><summary type="html"><![CDATA[I’ve been working thru Hands On Rust. You learn Rust and make a game. Play my game. See it on GitHub.]]></summary></entry><entry><title type="html">HotWire’s Turbo with Django Bootstrap 5</title><link href="http://icepick.info//2022/08/10/hotwires-turbo-with-django-bootstrap/" rel="alternate" type="text/html" title="HotWire’s Turbo with Django Bootstrap 5" /><published>2022-08-10T20:59:42-04:00</published><updated>2022-08-10T20:59:42-04:00</updated><id>http://icepick.info//2022/08/10/hotwires-turbo-with-django-bootstrap-</id><content type="html" xml:base="http://icepick.info//2022/08/10/hotwires-turbo-with-django-bootstrap/"><![CDATA[<p>If you use Turbo with django-bootstrap5, drop-downs will not work once you
load another page.  You can fix this by adding the
<code class="language-plaintext highlighter-rouge">data-turbolinks-eval=false</code> attribute to bootstrap’s <code class="language-plaintext highlighter-rouge">&lt;script&gt;</code>.</p>

<p><code class="language-plaintext highlighter-rouge">settings.py</code></p>

<div class="language-python highlighter-rouge"><div class="highlight"><pre class="highlight"><code><span class="kn">from</span> <span class="nn">django_bootstrap5.core</span> <span class="kn">import</span> <span class="n">BOOTSTRAP5_DEFAULTS</span>

<span class="n">BOOTSTRAP5</span> <span class="o">=</span> <span class="p">{</span>
    <span class="s">"javascript_in_head"</span><span class="p">:</span> <span class="bp">False</span><span class="p">,</span>
    <span class="s">"javascript_url"</span><span class="p">:</span> <span class="n">BOOTSTRAP5_DEFAULTS</span><span class="p">[</span><span class="s">"javascript_url"</span><span class="p">].</span><span class="n">copy</span><span class="p">()</span>
<span class="p">}</span>
<span class="n">BOOTSTRAP5</span><span class="p">[</span><span class="s">"javascript_url"</span><span class="p">][</span><span class="s">"data-turbolinks-eval"</span><span class="p">]</span> <span class="o">=</span> <span class="s">"false"</span>
</code></pre></div></div>]]></content><author><name>Myers Carpenter</name></author><category term="django" /><category term="hotwired" /><category term="turbo" /><summary type="html"><![CDATA[If you use Turbo with django-bootstrap5, drop-downs will not work once you load another page. You can fix this by adding the data-turbolinks-eval=false attribute to bootstrap’s &lt;script&gt;.]]></summary></entry></feed>


Copyright © 2002-9 Sam Ruby, Mark Pilgrim, Joseph Walton, and Phil Ringnalda