Congratulations!

[Valid RSS] This is a valid RSS feed.

Recommendations

This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations.

Source: http://endavid.com/kblog/endavid.feed.xml

  1. <?xml version="1.0"?>
  2. <rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  3. <channel>
  4. <atom:link href="http://endavid.com/kblog/endavid.feed.xml" rel="self" type="application/rss+xml" />
  5. <title>EnDavid.com</title>
  6. <link>http://endavid.com/</link>
  7. <description>David Gavilan's home page updates.</description>
  8. <language>en-us</language>
  9. <copyright>Copyright 2009-2015, David Gavilan</copyright>
  10. <pubDate>Mon, 04 Mar 2024 11:16:24 +0000</pubDate>
  11. <lastBuildDate>Mon, 04 Mar 2024 11:16:24 +0000</lastBuildDate>
  12. <generator>KBlog 0.61</generator>
  13. <docs>http://www.rssboard.org/rss-specification</docs>
  14. <item>
  15. <link>http://endavid.com/index.php?entry=104</link>
  16. <guid>http://endavid.com/index.php?entry=104</guid>
  17.  
  18. <title>Concurrency recipes in Swift and C++</title>
  19.  
  20. <pubDate>Mon, 04 Mar 2024 11:16:24 +0000
  21. </pubDate>
  22.  
  23. <description><![CDATA[
  24. <figure>
  25.  <img src="http://endavid.com/pix/screenshots/2024-03-04-Jengatris-summary.jpg" alt="Screenshot of the benchmark with different running times"/>
  26.  <figcaption>Screenshot of the benchmark with different running times</figcaption>
  27. </figure>
  28.  
  29. <h3>Introduction</h3>
  30.  
  31. <p>
  32. There are many ways to write code that runs in parallel. In the Apple world it was common to use functions from <a href="https://en.wikipedia.org/wiki/Grand_Central_Dispatch">Grand Central Dispatch</a> (GCD), but in Swift 5.5 <code>async</code> and <code>await</code> became <a href="https://en.wikipedia.org/wiki/First-class_citizen">first-class citizens</a> of the language. That means there’s now a more Swift-like way to handle concurrent code.
  33. </p>
  34.  
  35. <p>
  36. I personally get confused by the meaning of some of these keywords across languages. For instance, the <code>async</code> keyword in Swift is not the same as the <code>async</code> function in the C++ standard library. That’s why I wanted to compare at least Swift to C++.
  37. </p>
  38.  
  39. <p>
  40. This article shows different ways of running a loop in parallel in both Swift and C++, and compares their running times using a problem from <a href="https://adventofcode.com">Advent of Code</a> as a benchmark.
  41. </p>
  42.  
  43. <h3>Advent of Code “Jengatris” as benchmark</h3>
  44.  
  45. <p>
  46. I’ve taken the problem from day 22 of Advent of Code 2023 as my benchmark. It’s a problem I like because it’s easy to visualize. In fact, <a href="https://x.com/endavid/status/1745527630521360395?s=20">I first solved it using the Godot Engine</a>, just so I could see it in 3D from the start.
  47. </p>
  48.  
  49. <p>
  50. Please read the full description of the problem <a href="https://adventofcode.com/2023/day/22">on the AoC website</a> for details. But the blunt summary is that you have to solve a game that is like a mixture of <a href="https://en.wikipedia.org/wiki/Tetris">Tetris</a> and <a href="https://en.wikipedia.org/wiki/Jenga">Jenga</a>, so I refer to it as <em>“Jengatris”</em>. 3-dimensional bricks or “pieces” fall from above until they stack. Then, in the first part of the problem, you have to find out which pieces are “essential”, that is, pieces that, if removed, would cause others above them to fall. If a piece is supported by more than one piece below it, then none of those supporters is “essential” (according to my definition of “essential”; of course, if you remove all of them, the piece above will fall).
  51. </p>
  52.  
  53. <p>
  54. In the second part of the problem, which is what we are interested in for the benchmark, you have to count how many pieces would fall if you were to remove each of the essential pieces. The answer to the problem is the sum, over all the essential pieces, of the pieces that would fall.
  55. </p>
  56.  
  57. <p>
  58. Here’s a video made in Godot showing the result:
  59. </p>
  60.  
  61. <iframe class="video" width="315" height="560" src="https://www.youtube.com/embed/xWipB6MSyRc" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
  62.  
  63. <!--
  64. https://youtube.com/shorts/xWipB6MSyRc?si=nyRK1pGQ3MT_ulIr
  65. -->
  66.  
  67. <p>
  68. In the first part of the video I let the pieces fall to find the solution to the first part of the problem. Then the screen flashes a bit, because there are 974 essential pieces for my input, so I have to quickly simulate 974 scenarios to get the sum. Finally, you can see me remove all the essential pieces, just to watch more pieces fall (this is not part of the problem, just for fun).
  69. </p>
  70.  
  71. <h3>Copyable game state</h3>
  72.  
  73. <p>
  74. Because we have to simulate each possibility for all the essential pieces, it is very convenient if we can easily create copies of the game state. This will also be very helpful when computing several solutions in parallel, because we want to avoid sharing any data, and with it, race conditions.
  75. </p>
  76.  
  77. <p>
  78. In Swift we can simply use a <code>struct</code>, because structs in Swift are passed by value. In contrast, classes are passed by reference, so no data would get duplicated. To be able to copy the whole game state, I simply declared it like this:
  79. </p>
  80.  
  81. <pre class="prettyprint">
  82.  struct GameState {
  83.      var pieces: [AABB&lt;Int&gt;]
  84.      var volume: VoxelVolume&lt;Int&gt;
  85.  }
  86. </pre>
  87.  
  88. <p>
  89. A game state contains a list of pieces described as integer Axis-Aligned Bounding Boxes (AABBs), and what I called a Voxel Volume, which is a 3-dimensional integer matrix that stores the ID of each piece, so we know whether a given integer coordinate is occupied or not.
  90. </p>
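
<p>
The actual <code>VoxelVolume</code> is in the repository linked later in this article. As a rough sketch of the idea (the names and layout here are illustrative, assuming a dense grid where -1 marks an empty cell), such a structure could look like this in Swift:
</p>

<pre class="prettyprint">
 // Illustrative sketch only; not the repository's actual VoxelVolume.
 // A dense 3D grid stored as a flat array, where -1 marks an empty cell.
 struct SimpleVoxelVolume {
     let width: Int, height: Int, depth: Int
     private var cells: [Int]

     init(width: Int, height: Int, depth: Int) {
         self.width = width
         self.height = height
         self.depth = depth
         cells = [Int](repeating: -1, count: width * height * depth)
     }
     // Flat index of an (x, y, z) integer coordinate.
     private func index(_ x: Int, _ y: Int, _ z: Int) -&gt; Int {
         return x + width * (y + height * z)
     }
     subscript(x: Int, y: Int, z: Int) -&gt; Int {
         get { cells[index(x, y, z)] }
         set { cells[index(x, y, z)] = newValue }
     }
     func isOccupied(_ x: Int, _ y: Int, _ z: Int) -&gt; Bool {
         return self[x, y, z] &gt;= 0
     }
 }
</pre>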
  91.  
  92. <p>
  93. Note that in C++ a <code>struct</code> behaves very differently. In C++ a struct is basically the same as a class, except that all its members are public by default. So, to be really explicit about my intent to duplicate the game state, I added a <code>‘copy’</code> function, and explicitly deleted the copy constructor and the copy-assignment operator:
  94. </p>
  95.  
  96. <pre class="prettyprint">
  97.  struct GameState
  98.  {
  99.    std::vector&lt; AABB&lt;int&gt; &gt; pieces;
  100.    std::shared_ptr&lt;VoxelVolume&lt;int&gt; &gt; volume;
  101.    
  102.    GameState(IntAABBList pieces, std::shared_ptr&lt;VoxelVolume&lt;int&gt; &gt; volume);
  103.    std::shared_ptr&lt;GameState&gt; copy() const;
  104.    GameState(const GameState&) = delete;
  105.    GameState& operator=(const GameState&) = delete;
  106.  };
  107. </pre>
  108.  
  109. <p>
  110. The copy function looks like this:
  111. </p>
  112.  
  113. <pre class="prettyprint">
  114.  std::shared_ptr&lt;Jengatris::GameState&gt; Jengatris::GameState::copy() const
  115.  {
  116.    std::vector&lt;AABB&lt;int&gt; &gt; p(pieces.begin(), pieces.end());
  117.    return std::make_shared&lt;Jengatris::GameState&gt;(p, volume->copy());
  118.  }
  119. </pre>
  120.  
  121. <p>
  122. Note that the Voxel Volume has a copy function as well. You can see the whole code on GitHub: <a href="https://github.com/endavid/algoDeSwift/tree/main/AoC%202023">algoDeSwift/AoC2023</a>.
  123. </p>
  124.  
  125. <p>
  126. We have done the most important thing for concurrency. Now let’s see how to run the simulations in parallel.
  127. </p>
  128.  
  129. <h3>Parallel loop in Swift with GCD</h3>
  130.  
  131. <p>
  132. The sequential solution in Swift can be written functionally with the <code>‘reduce’</code> function (remember that the answer to part 2 of the problem is the sum over all the simulated scenarios):
  133. </p>
  134.  
  135. <pre class="prettyprint">
  136.  static func countFalls(state: GameState, ids: Set&lt;Int&gt;) -&gt; Int {
  137.      return ids.reduce(0) { sum, i in
  138.          let (_, n) = Jengatris.simulate(start: state, without: i)
  139.          return sum + n
  140.      }
  141.  }
  142. </pre>
  143.  
  144. <p>
  145. It is quite straightforward to rewrite a loop as a parallel loop with GCD:
  146. </p>
  147.  
  148. <pre class="prettyprint">
  149.  static func concurrentCountFalls(state: GameState, ids: Set&lt;Int&gt;) -&gt; Int {
  150.    let indexArray: [Int] = Array(ids)
  151.    var counts = [Int].init(repeating: 0, count: indexArray.count)
  152.    DispatchQueue.concurrentPerform(iterations: indexArray.count) { iteration in
  153.        let id = indexArray[iteration]
  154.        let (_, n) = Jengatris.simulate(start: state, without: id)
  155.        counts[iteration] = n
  156.    }
  157.    return counts.reduce(0, +)
  158.  }
  159. </pre>
  160.  
  161. <p>
  162. Notice that we created a shared resource, <code>‘counts’</code>, where we save all the intermediate results. But we don’t need any <a href="https://en.wikipedia.org/wiki/Lock_(computer_science)">mutex</a> for this, because we aren’t resizing the array and each thread will only write to its own unique position, given by <code>‘iteration’</code>. The number of threads created will be decided automatically by GCD, depending on the number of CPU cores and the capabilities of the hardware. On my Mac mini M1 it creates 8 threads.
  163. </p>
  164.  
  165. <h3>Parallel loop in Swift with async/await</h3>
  166.  
  167. <p>
  168. As I mentioned in the introduction, Swift 5.5 introduced some keywords for asynchronous code: <code>‘async’</code> and <code>‘await’</code>. Note that just because code is asynchronous, it doesn’t necessarily mean it runs in parallel. For instance, JavaScript is very asynchronous, but it’s mostly single-threaded (unless you are using Workers): the different callbacks get executed during each “tick” of the event loop.
  169. </p>
  170.  
  171. <p>
  172. But concurrent code is asynchronous by nature, so it is useful to have first-class asynchronous keywords in the language to write concurrent code. You just need to flag a function with <code>‘async’</code> to mark it asynchronous. And then you can use <code>‘await’</code> to wait for something to finish before continuing with the rest of the code, without actually blocking the execution of the program.
  173. </p>
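
<p>
As a minimal illustration of the two keywords (this snippet is not part of the Jengatris code), a function flagged as <code>‘async’</code> and awaited from another asynchronous function could look like this:
</p>

<pre class="prettyprint">
 // Minimal async/await illustration (not part of the Jengatris code).
 func fetchScore(for id: Int) async -&gt; Int {
     // Pretend this does some long-running work.
     return id * 2
 }

 func totalScore(ids: [Int]) async -&gt; Int {
     var sum = 0
     for id in ids {
         // Suspends here without blocking the thread.
         sum += await fetchScore(for: id)
     }
     return sum
 }
</pre>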
  174.  
  175. <p>
  176. There’s a special function called <code>‘withTaskGroup’</code> that is very helpful for our example. It creates a <a href="https://developer.apple.com/documentation/swift/taskgroup">Task Group</a> where you can keep adding <a href="https://developer.apple.com/documentation/swift/task">Tasks</a>. In our case each task will be one of the simulations. Then, we simply wait for all the results to come back. Here’s the code:
  177. </p>
  178.  
  179. <pre class="prettyprint">
  180.  static func countFallsAsync(state: GameState, ids: Set&lt;Int&gt;) async -&gt; Int {
  181.      var sum = 0
  182.      await withTaskGroup(of: Int.self) { group in
  183.          for id in ids {
  184.              group.addTask {
  185.                  let (_, n) = Jengatris.simulate(start: state, without: id)
  186.                  return n
  187.              }
  188.          }
  189.          for await n in group {
  190.              sum += n
  191.          }
  192.      }
  193.      return sum
  194.  }
  195. </pre>
  196.  
  197. <p>
  198. Here we don’t have to worry about the number of threads created either. The system will choose for us. The performance should be the same as with GCD.
  199. </p>
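
<p>
One practical note: if you call <code>countFallsAsync</code> from synchronous code, you first need to hop into an asynchronous context. A minimal sketch (assuming <code>state</code> and <code>essentialIds</code> have already been computed):
</p>

<pre class="prettyprint">
 // Sketch: calling the async version from a synchronous context.
 // Assumes `state` and `essentialIds` were computed earlier.
 Task {
     let total = await Jengatris.countFallsAsync(state: state, ids: essentialIds)
     print("Sum of falls: \(total)")
 }
</pre>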
  200.  
  201. <h3>Parallel loop in C++ with threads</h3>
  202.  
  203. <p>
  204. Let’s see how the C++ code compares to Swift. Let’s start by writing down the sequential version. Instead of using the <code>‘reduce’</code> function, I used a <code>for</code> loop for this one because I think it’s easier to read:
  205. </p>
  206.  
  207. <pre class="prettyprint">
  208.  size_t Jengatris::countFalls(const GameState& state, const std::unordered_set&lt;int&gt;& ids)
  209.  {
  210.    size_t sum = 0;
  211.    for (const auto& id : ids)
  212.    {
  213.        std::unordered_set&lt;int&gt; moved;
  214.        auto s = state.copy();
  215.        auto _ = Jengatris::simulate(*s, id, &moved);
  216.        sum += moved.size();
  217.    }
  218.    return sum;
  219.  }
  220. </pre>
  221.  
  222. <p>
  223. A simple solution that works is to create a thread for each element in the input. That creates lots of threads, though. My input has 974 entries, so that’s 974 threads. The code below creates a lambda function with the work each thread needs to do, creates all the threads, and then waits for all of them to finish with <code>‘join’</code>:
  224. </p>
  225.  
  226. <pre class="prettyprint">
  227.  size_t Jengatris::countFallsThreaded(const GameState &state, const std::unordered_set&lt;int&gt; &ids)
  228.  {
  229.    std::vector&lt;int&gt; idArray(ids.begin(), ids.end());
  230.    std::vector&lt;size_t&gt; counts(ids.size());
  231.    std::vector&lt;std::thread&gt; threads;
  232.    auto parallelWork = [&state, &idArray, &counts](int iteration) {
  233.        int id = idArray[iteration];
  234.        std::unordered_set&lt;int&gt; moved;
  235.        auto s = state.copy();
  236.        auto _ = Jengatris::simulate(*s, id, &moved);
  237.        counts[iteration] = moved.size();
  238.    };
  239.    // this will start MANY threads!! (974 threads for my input)
  240.    for (size_t i = 0; i &lt; idArray.size(); i++)
  241.    {
  242.        threads.emplace_back(parallelWork, i);
  243.    }
  244.    // Wait for threads to finish
  245.    for (auto& thread : threads) {
  246.        thread.join();
  247.    }
  248.    return std::accumulate(counts.begin(), counts.end(), size_t(0));
  249.  }
  250. </pre>
  251.  
  252. <h3>Parallel loop in C++ with async</h3>
  253.  
  254. <p>
  255. In C++ <a href="https://en.cppreference.com/w/cpp/thread/async">‘std::async’</a> is a function template used to run a function asynchronously, potentially in a separate thread which might be part of a thread pool. This is different from flagging a function asynchronous with ‘async’ in Swift.
  256. </p>
  257.  
  258. <p>
  259. This can be used in combination with <code>‘futures’</code> to wait for results. JavaScript programmers may be familiar with futures from using Promises; the concepts are the same. In fact, an <a href="https://en.cppreference.com/w/cpp/thread/future">std::future</a> can also be paired with an <a href="https://en.cppreference.com/w/cpp/thread/promise">std::promise</a>. But here we are interested in the asynchronous function. My code looks like this now:
  260. </p>
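
<p>
Swift, by the way, doesn’t expose futures directly; the closest standard tool is a continuation, which bridges callback-style code into async/await. A rough sketch for comparison (not from this project):
</p>

<pre class="prettyprint">
 import Foundation

 // Sketch for comparison: Swift has no std::future, but a checked
 // continuation plays a similar role, bridging a callback into a value
 // you can await. Not part of the Jengatris code.
 func legacyCompute(id: Int, completion: @escaping (Int) -&gt; Void) {
     DispatchQueue.global().async { completion(id * 2) }
 }

 func compute(id: Int) async -&gt; Int {
     await withCheckedContinuation { continuation in
         legacyCompute(id: id) { value in
             continuation.resume(returning: value)
         }
     }
 }
</pre>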
  261.  
  262. <pre class="prettyprint">
  263.  size_t Jengatris::countFallsAsync(const GameState &state, const std::unordered_set&lt;int&gt; &ids)
  264.  {
  265.    std::vector&lt;int&gt; idArray(ids.begin(), ids.end());
  266.    std::vector&lt;std::future&lt;size_t&gt;&gt; futures;
  267.    auto parallelWork = [&state, &idArray](int iteration) {
  268.        int id = idArray[iteration];
  269.        std::unordered_set&lt;int&gt; moved;
  270.        auto s = state.copy();
  271.        auto _ = Jengatris::simulate(*s, id, &moved);
  272.        return moved.size();
  273.    };
  274.    // Start asynchronous tasks
  275.    for (size_t i = 0; i &lt; idArray.size(); ++i) {
  276.        futures.push_back(std::async(std::launch::async, parallelWork, i));
  277.    }
  278.    // Wait for tasks to finish and accumulate the results
  279.    // When I put a breakpoint here, Xcode says there are 372 threads.
  280.    // Still lots, but less than 974...
  281.    size_t total = 0;
  282.    for (auto& future : futures) {
  283.        total += future.get();
  284.    }
  285.    return total;
  286.  }
  287. </pre>
  288.  
  289. <p>
  290. The code is a bit ugly, though. We can do better with Threading Building Blocks (TBB).
  291. </p>
  292.  
  293. <h3>Parallel loop in C++ with C++17 and TBB</h3>
  294.  
  295. <p>
  296. C++17 does include parallel algorithms, such as <a href="https://en.cppreference.com/w/cpp/algorithm/transform_reduce">transform_reduce</a>. You can pass an execution policy, and if you specify the parallel one (<code>std::execution::par</code>), it should run in parallel. So my parallel loop can be cleanly written like this:
  297. </p>
  298.  
  299. <pre class="prettyprint">
  300.  size_t Jengatris::countFallsParallel(const GameState &state, const std::unordered_set&lt;int&gt; &ids)
  301.  {
  302.    return std::transform_reduce(
  303.        std::execution::par,
  304.        ids.begin(),
  305.        ids.end(),
  306.        size_t(0),
  307.        std::plus&lt;&gt;(),
  308.        [&state](int id) {
  309.            std::unordered_set&lt;int&gt; moved;
  310.            auto s = state.copy();
  311.            auto _ = Jengatris::simulate(*s, id, &moved);
  312.            return moved.size();
  313.        }
  314.    );
  315.  }
  316. </pre>
  317.  
  318. <p>
  319. However, that code does not compile with Apple Clang. If you check the <a href="https://en.cppreference.com/w/cpp/compiler_support/17#C.2B.2B17_library_features">compiler support page</a>, parallel algorithms are not supported in Apple Clang. To compile it on macOS, I installed GCC with Homebrew, along with TBB (see below). The code compiles and runs. However, it doesn’t seem to run in parallel: the performance is the same as the sequential version. So I rewrote it with Intel TBB.
  320. </p>
  321.  
  322. <p>
  323. <a href="https://en.wikipedia.org/wiki/Threading_Building_Blocks">TBB</a> is a C++ template library developed by Intel for parallel programming. The code got slightly uglier, but here’s the same parallel loop:
  324. </p>
  325.  
  326. <pre class="prettyprint">
  327.  size_t Jengatris::countFallsTBB(const GameState &state, const std::unordered_set&lt;int&gt; &ids)
  328.  {
  329.    std::vector&lt;int&gt; idArray(ids.begin(), ids.end());
  330.    size_t sum = tbb::parallel_reduce(
  331.         tbb::blocked_range&lt;size_t&gt;(0, idArray.size()),
  332.         size_t(0),
  333.         [&](const tbb::blocked_range&lt;size_t&gt;& range, size_t localSum) {
  334.             for (size_t i = range.begin(); i != range.end(); ++i) {
  335.                 int id = idArray[i];
  336.                 std::unordered_set&lt;int&gt; moved;
  337.                 auto s = state.copy();
  338.                 auto _ = Jengatris::simulate(*s, id, &moved);
  339.                 localSum += moved.size();
  340.             }
  341.             return localSum;
  342.         },
  343.         std::plus&lt;&gt;()
  344.     );
  345.     return sum;
  346.  }
  347. </pre>
  348.  
  349. <p>
  350. For some reason, the Homebrew-installed GCC doesn’t automatically find the TBB headers and libraries, so I had to point at them manually. For reference, my compiler command is:
  351. </p>
  352.  
  353. <pre class="prettyprint">
  354. g++-13 -std=c++17 -O3 -Wall -Wextra -pedantic -o advent advent2023-cpp/*.cpp -ltbb -I/opt/homebrew/Cellar/tbb/2021.11.0/include/ -L/opt/homebrew/Cellar/tbb/2021.11.0/lib
  355. </pre>
  356.  
  357. <h3>Benchmark</h3>
  358.  
  359. <p>
  360. I’ve collected some numbers, mostly to verify that indeed the parallel code runs faster. I’ve compiled both Swift and C++ versions in Release mode optimizing for speed, not size, and averaged the values of a few runs. See the graph below.
  361. </p>
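
<p>
In case it’s useful, a minimal way to time one of these runs in Swift (again assuming <code>state</code> and <code>essentialIds</code> from earlier) is:
</p>

<pre class="prettyprint">
 import Foundation

 // Minimal timing sketch; `state` and `essentialIds` are assumed.
 let start = Date()
 let result = Jengatris.countFalls(state: state, ids: essentialIds)
 let seconds = Date().timeIntervalSince(start)
 print("countFalls = \(result), took \(seconds) seconds")
</pre>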
  362.  
  363. <figure>
  364.  <img src="http://endavid.com/pix/screenshots/2024-03-04-jengatris-performance.png" alt="Running times of Jengatris with different implementations and compilers"/>
  365.  <figcaption>Running times of “Jengatris” with different implementations and compilers</figcaption>
  366. </figure>
  367.  
  368. <p>
  369. A few observations:
  370. </p>
  371. <ul>
  372. <li>Any version runs much faster than GDScript, which takes 24 seconds.</li>
  373. <li>The C++ code runs a bit faster than Swift.</li>
  374. <li>Clang performs slightly faster than GCC.</li>
  375. <li>All the parallel implementations perform similarly to each other, in their respective languages/compilers, except for the “GCC parallel” one, which doesn’t seem to be working as expected.</li>
  376. <li>The C++ solution with many threads seems slightly slower than the other C++ solutions in GCC, presumably because of the overhead of creating that many threads, although it doesn’t seem to make a difference in Clang.</li>
  377. </ul>
  378.  
  379. <h3>Summary</h3>
  380.  
  381. <p>
  382. There are multiple ways to do computations in parallel in both Swift and C++, some approaches more modern than others. However, always prepare a benchmark to actually check that the solution works. You may get a surprise like the one I got with C++17 (if someone knows why the parallel execution policy is being ignored, please let me know).
  383. </p>
  384.  
  385. <p>
  386. In the C++ world, go to the <a href="https://en.cppreference.com/w/cpp/compiler_support">Compiler Support</a> page to find out which features of the standard are actually implemented in the compiler you are using. If you are targeting multiple platforms, you may not want to use the parallel algorithms from C++17.
  387. </p>
  388.  
  389. <p>
  390. C++ sounds a bit scarier than Swift, but I hope these comparisons show that C++ doesn’t need to be that much more verbose.
  391. </p>
  392.  
  393. <p>
  394. Thanks to Eric Wastl for <a href="https://adventofcode.com/2023/about">Advent of Code</a>.
  395. </p>
  396.  
  397. <p>
  398. All the code can be found on GitHub: <a href="https://github.com/endavid/algoDeSwift/tree/main/AoC%202023">algoDeSwift/AoC2023</a>.
  399. </p>
  400. ]]>
  401. </description>
  402. </item>
  403.  
  404. <item>
  405. <link>http://endavid.com/index.php?entry=103</link>
  406. <guid>http://endavid.com/index.php?entry=103</guid>
  407.  
  408. <title>Mantis Shrimp: Image Differences with Metal shaders</title>
  409.  
  410. <pubDate>Mon, 26 Feb 2024 11:53:27 +0000
  411. </pubDate>
  412.  
  413. <description><![CDATA[
  414. <figure>
  415.  <img src="http://endavid.com/pix/screenshots/2024-02-23-Mantis-Shrimp-title.png" alt="Mantis Shrimp image diff tool for macOS"/>
  416.  <figcaption>Mantis Shrimp image diff tool for macOS</figcaption>
  417. </figure>
  418.  
  419. <h3>Image Diffs and Mantis Shrimp</h3>
  420.  
  421. <p>
  422. An <em>image diff</em> is an image that visualizes the difference between 2 other images in some manner. There are many image diff tools around, but I often find myself wanting to write my own custom difference operator, depending on what I'm looking for in the image.
  423. </p>
  424.  
  425. <p>
  426. A mentor of mine once created an internal tool to do image diffs where you could write a snippet of JavaScript as the diff operator. It was very useful, but the code ran on the CPU for each pixel, so it was quite slow. Also, it could only deal with the types of images that the browser could handle, that is, usually just 8-bit images in sRGB color space. He called this web app <a href="https://en.wikipedia.org/wiki/Mantis_shrimp">Mantis Shrimp</a>, after one of his favorite animals. The reason: mantis shrimps have up to 16 different types of photoreceptor cells. In comparison, humans have just 3 different types (although some people have <a href="https://en.wikipedia.org/wiki/Tetrachromacy">tetrachromacy</a>). But who needs that many types of photoreceptors when we have technology and software to enhance what we see?
  427. </p>
  428.  
  429. <p>
  430. I borrowed that awesome name for my <a href="http://mantisshrimp.endavid.com">Mantis Shrimp app</a>, although this app can do more than the original. It can compute image diffs of any 2 images that macOS supports, that is, up to 32 bits per color channel, and different types of color spaces. It does it in real time because the operations happen on the GPU, so pixel operations are done in parallel, not sequentially. The app comes with different preset operators, but you can write your own with Metal shaders as well.
  431. </p>
  432.  
  433. <p>
  434. Because you can write shaders with it, you can do much more than just image differences. You can even create animations with it, pretty much like what <a href="https://www.shadertoy.com">Shader Toy</a> does in the WebGL world.
  435. </p>
  436.  
  437. <p>
  438.  Here's a 30-second video summary of what Mantis Shrimp can do:
  439. </p>
  440.  
  441.  <iframe class="video" width="520" height="292" src="https://www.youtube.com/embed/ijJRaahCF0c" frameborder="0" allowfullscreen></iframe>
  442.  
  443. <p>
  444. In this article I’m going to give you some details about the actual implementation of Mantis Shrimp.
  445. </p>
  446.  
  447. <h3>SwiftUI and Metal</h3>
  448.  
  449. <p>
  450. At WWDC 2023 Apple announced new functions to modify a SwiftUI view with custom shaders: <a href="https://developer.apple.com/documentation/swiftui/view/coloreffect(_:isenabled:)">colorEffect</a>, <a href="https://developer.apple.com/documentation/swiftui/view/layereffect(_:maxsampleoffset:isenabled:)">layerEffect</a>, and <a href="https://developer.apple.com/documentation/swiftui/view/distortioneffect(_:maxsampleoffset:isenabled:)">distortionEffect</a>. Distortions modify the location of each pixel, whereas the other two modify its color. I assume they must be fragment/pixel shaders. You can find some nice examples in <a href="https://www.hackingwithswift.com/quick-start/swiftui/how-to-add-metal-shaders-to-swiftui-views-using-layer-effects">How to add Metal shaders to SwiftUI views using layer effects - a free SwiftUI by Example tutorial</a>.
  451. </p>
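
<p>
For instance, a <code>colorEffect</code> can be attached to a view in a couple of lines. This is just a sketch, assuming a <code>[[stitchable]]</code> Metal function named <code>invertColor</code> exists in the app’s default shader library:
</p>

<pre class="prettyprint">
 import SwiftUI

 // Sketch: applying a custom Metal color effect to a SwiftUI view.
 // Assumes a [[stitchable]] Metal function "invertColor" exists in the
 // app's default shader library.
 struct InvertedImage: View {
     var body: some View {
         Image("example")
             .colorEffect(ShaderLibrary.invertColor())
     }
 }
</pre>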
  452.  
  453. <p>
  454. However, if you want to do something more complex than that and you plan to use SwiftUI, you will need to create a custom <a href="https://developer.apple.com/documentation/swiftui/uiviewrepresentable">UIViewRepresentable</a>. You can find an example of this in the Apple forums: <a href="https://forums.developer.apple.com/forums/thread/119112">MetalKit in SwiftUI</a>.
  455. </p>
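
<p>
The general shape of such a wrapper on macOS (with <code>NSViewRepresentable</code> instead) is roughly the following. This is only a sketch, not VidEngine’s actual integration:
</p>

<pre class="prettyprint">
 import SwiftUI
 import MetalKit

 // Rough shape of a SwiftUI wrapper around MTKView on macOS.
 // Sketch only; VidEngine's actual Renderer integration differs.
 struct MetalView: NSViewRepresentable {
     func makeCoordinator() -&gt; MetalCoordinator { MetalCoordinator() }

     func makeNSView(context: Context) -&gt; MTKView {
         let view = MTKView()
         view.device = MTLCreateSystemDefaultDevice()
         view.delegate = context.coordinator
         return view
     }
     func updateNSView(_ nsView: MTKView, context: Context) {}
 }

 class MetalCoordinator: NSObject, MTKViewDelegate {
     func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}
     func draw(in view: MTKView) {
         // Encode the render passes here.
     }
 }
</pre>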
  456.  
  457. <p>
  458. For Mantis Shrimp I followed that route, and I encapsulated all the rendering using the <code>Renderer</code> class of my <a href="https://github.com/endavid/VidEngine">VidEngine</a>, an open-source graphics engine I created a couple of years back. VidEngine uses Swift and Metal, but at the time of writing I haven’t released the changes to make it work with macOS and SwiftUI.
  459. </p>
  460.  
  461. <h3>Mantis Shrimp render passes</h3>
  462.  
  463. <p>
  464. In VidEngine a <em>“render graph”</em> is simply an ordered list of what I call <em>“plugins”</em>. A plugin encapsulates one or more render passes. A real render graph should be a Directed Acyclic Graph where the nodes are render passes and each node is connected to others through read and write dependencies between the resources they use. One of the best explanations I found of render graphs is in this blog: <a href="https://poniesandlight.co.uk/reflect/island_rendergraph_1/">Rendergraphs and how to implement one</a>.
  465. </p>
  466.  
  467. <p>
  468. Because there are only 3 plugins in Mantis Shrimp, the dependencies are hard-coded. One of the plugins is for <em>Mesh Shaders</em>, which I will discuss in a separate article. Most of the time that plugin is disabled. The other two plugins are the <code>DiffPlugin</code> and the <code>OutPlugin</code>.
  469. </p>
  470.  
  471. <p>
  472. The <code>DiffPlugin</code> is where the actual operation happens. It consists of a simple vertex shader that draws a full-screen rectangle, and a fragment shader with the per-pixel operation. This fragment shader can be replaced with your own code. Apart from the texture coordinates of each pixel, I pass some other variables, such as the time in seconds, so you can create animations. Read <a href="http://mantisshrimp.endavid.com/manual.html">the manual</a> for details.
  473. </p>
  474.  
  475. <p>
  476. The <code>DiffPlugin</code> writes the output to an image that has the same size, the same bit depth, and the same color space as the first input image. You can only export images as PNG at the moment, but the export should preserve all three.
  477. </p>
  478.  
  479. <p>
  480. What you see on screen, though, is what the <code>OutPlugin</code> shows you. Its input is the output of the <code>DiffPlugin</code>, and it adapts it to the current view. By default it uses point sampling, so if your image is a few pixels wide, you should see a pixelated image, not a blurred one (as you would with linear interpolation). This is important because in an image diff tool you want to see the details, not a blurred version of them! The view supports the display-P3 color space by default, but the pixel format that gets selected may vary depending on the hardware.
  481. </p>
  482.  
  483. <p>
  484. The <code>OutPlugin</code> may also apply the final gamma where necessary. Some pixel formats support the sRGB flag for automatically applying the gamma (or the inverse gamma when reading), but not all pixel formats do, and support varies depending on the hardware, so the operation needs to be done in a shader.
  485. </p>
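
<p>
For reference, the gamma in question is the standard sRGB transfer function. Shown here in Swift for clarity (in Mantis Shrimp the equivalent runs in a Metal shader):
</p>

<pre class="prettyprint">
 import Foundation

 // Standard sRGB transfer function (linear to sRGB), per channel.
 // Shown in Swift for clarity; in Mantis Shrimp this runs in a shader.
 func srgbFromLinear(_ c: Float) -&gt; Float {
     return c &lt;= 0.0031308
         ? 12.92 * c
         : 1.055 * pow(c, 1.0 / 2.4) - 0.055
 }
</pre>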
  486.  
  487. <h3>A diff fragment shader</h3>
  488.  
  489. <p>
  490.  A simple difference operator looks like this:
  491. </p>
  492.  
  493. <pre class="prettyprint">
  494. fragment half4 main()
  495. {
  496.    float4 a = texA.sample(sam, frag.uv);
  497.    float4 b = texB.sample(sam, frag.uv);
  498.    float4 diff = abs(a-b);
  499.    float4 out = float4(uni.scale * diff.rgb, a.a);
  500.    return half4(out);
  501. }  
  502. </pre>
  503.  
  504. <p>
  505.  The signature of the function is predefined, and <code>“main”</code> is just a shortcut I’ve defined in Mantis Shrimp, because the function signature can’t be overridden. The actual signature looks like this:
  506. </p>
  507.  
  508. <pre class="prettyprint">
  509. fragment half4 diffFragment(VertexInOut frag [[stage_in]],
  510.   texture2d&lt;float&gt; texA [[ texture(0) ]],
  511.   texture2d&lt;float&gt; texB [[ texture(1) ]],
  512.   sampler sam [[ sampler(0) ]],
  513.   constant Uniforms& uni  [[ buffer(0) ]])
  514. </pre>
  515.  
  516. <p>
  517.  So apart from the fragment texture coordinates, you get two textures, a texture sampler, and some extra variables. The operation above is just subtracting the RGB values of both textures and setting the output to be the absolute value of the difference. Here’s a summary of the different diff presets in Mantis Shrimp:
  518. </p>
  519.  
  520. <figure>
  521.  <img src="http://mantisshrimp.endavid.com/pix/image-diff-summary-david.jpg" width="640" alt="Mantis Shrimp image diff presets"/>
  522.  <figcaption>Mantis Shrimp image diff presets</figcaption>
  523. </figure>
  524.  
  525. <p>
  526.  When no image is assigned, a white texture is sampled by default. That means that the default RGB diff operator acts as a negative if you only assign one image. See the example below.
  527. </p>
  528.  
  529. <figure>
  530.  <img src="http://mantisshrimp.endavid.com/pix/Screenshot 2024-02-26 at 10.44.27.jpg" width="480" alt="Negative painting from Iranian-British artist Soheila Sokhanvari. By default Mantis Shrimp will negate the input."/>
  531.  <figcaption>Negative painting from Iranian-British artist Soheila Sokhanvari. By default Mantis Shrimp will negate the input.</figcaption>
  532. </figure>
  533.  
  534. <h3>A shader sandbox</h3>
  535.  
  536. <p>
  537.  Mantis Shrimp can also be used to simply test shaders. People familiar with <a href="https://www.shadertoy.com">Shader Toy</a> or <a href="https://twigl.app">TwiGL</a> will know that you can create beautiful animations with just a fragment shader.
  538. </p>
  539.  
  540. <p>
  541. A common mathematical tool for that purpose is the use of <a href="https://en.wikipedia.org/wiki/Signed_distance_function">Signed Distance Functions</a> (SDF). An SDF is a function that tells you how far a point is from the surface of the object. When you are inside the object, the distance is negative, hence the “signed”. Because in a fragment shader you get the <code>(u,v)</code> texture coordinate of the output, you can use an SDF to draw simple 2D figures. For instance, a circle centered at <code>(0,0)</code> is just the length of the <code>(u,v)</code> vector minus the radius of the circle.
  542. </p>
  543.  
  544. <p>
  545. If you apply transforms to the <code>(u,v)</code> coordinates, you can do fancier things. One common transformation is to multiply the <code>(u,v)</code> coordinates by a number greater than one and then take the fractional part, the decimals. In this manner, you will have repeating coordinates that go from 0 to 1, and then from 0 to 1 again. If you use the time variable to change these transforms over time, you can create some interesting animations. Mantis Shrimp comes with this SDF animation preset to get you started:
  546. </p>
  547.  
  548. <pre class="prettyprint">
  549. float sdCircle(float2 p, float r)
  550. {
  551.    return length(p) - r;
  552. }
  553.  
  554. float2x2 rotationMatrix(float angle)
  555. {
  556.    float s=sin(angle), c=cos(angle);
  557.    return float2x2(
  558.        float2(c, -s),
  559.        float2(s,  c)
  560.    );
  561. }
  562.  
  563. fragment half4 main()
  564. {
  565.    float t = uni.time;
  566.    float aspect = uni.resolution.x / uni.resolution.y;
  567.    float2 uv0 = frag.uv * 2 - 1;
  568.    uv0.x *= aspect;
  569.    float2x2 r = rotationMatrix(cos(t));
  570.    uv0 = r * uv0;
  571.    float2 uv = fract(2 * uv0) - 0.5;
  572.    float d = sdCircle(uv, 0.5) * exp(-length(uv0));
  573.    float s = uni.scale + 1;
  574.    d = sin(d*s + t) / s;
  575.    d = 0.01 / abs(d);
  576.    float2 uvImage = 0.5 * float2(sin(t) + 1, cos(t) + 1);
  577.    float4 color = texA.sample(sam, uvImage);
  578.    float4 out = float4(d * color.rgb, 1);
  579.    return half4(out);
  580. }
  581. </pre>
  582.  
  583. <p>
  584. The output looks like this:
  585. </p>
  586.  
  587. <video width="256" height="256" controls autoplay loop>
  588.  <source src="http://mantisshrimp.endavid.com/pix/2023-12-14-mantisshrimp-circles.mp4" type="video/mp4">
  589.  Your browser does not support the video tag.
  590. </video>
  591.  
  592. <p>
  593. SDFs can also be used to represent 3D surfaces. The SDF for a sphere is the same as for a circle, but we use the length of an <code>(x,y,z)</code> coordinate instead of a 2D coordinate. Usually these 3D SDFs are combined with a technique called <a href="https://en.wikipedia.org/wiki/Ray_marching">Ray Marching</a>, which consists of casting a ray for every <code>(u,v)</code> coordinate on the screen, starting at the near plane of the camera frustum, and advancing the ray along its direction based on the value of the SDF. Remember that the SDF tells you the distance to the surface, so you basically know how far you can safely move.
  594. </p>
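
<p>
In Swift (the real thing runs in a shader), the ray marching loop is quite small. A sketch, assuming an arbitrary maximum distance and iteration count:
</p>

<pre class="prettyprint">
 import simd

 // Sphere-tracing sketch (illustrative; the real loop runs in a shader):
 // advance along the ray by the SDF value until we hit or give up.
 func raymarch(origin: SIMD3&lt;Float&gt;, direction: SIMD3&lt;Float&gt;,
               sdf: (SIMD3&lt;Float&gt;) -&gt; Float) -&gt; Float? {
     var t: Float = 0
     for _ in 0..&lt;128 {
         let p = origin + t * direction
         let d = sdf(p)
         if d &lt; 0.001 { return t }  // close enough: we hit the surface
         t += d                     // safe step: nothing is closer than d
         if t &gt; 100 { return nil }  // too far: we missed
     }
     return nil
 }

 // Example: a unit sphere at the origin, seen from z = -3.
 let hit = raymarch(origin: [0, 0, -3], direction: [0, 0, 1]) { p in
     simd_length(p) - 1
 }
</pre>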
  595.  
  596. <p>
  597. There are plenty of resources online to learn about this. Check out <a href="https://iquilezles.org">Iñigo Quilez’s home page</a>. He’s the creator of Shader Toy and he has many interesting resources. The important thing for this article is to highlight that you can use Metal shaders in Mantis Shrimp to create these kinds of animations (or “demos”) as well. See these ray-marched cubes (not <a href="https://en.wikipedia.org/wiki/Marching_cubes">marching cubes</a>!):
  598. </p>
  599.  
  600. <video width="256" height="256" controls autoplay loop>
  601.  <source src="http://mantisshrimp.endavid.com/pix/2024-01-29-Genuary-SDF-ray-marching.mp4" type="video/mp4">
  602.  Your browser does not support the video tag.
  603. </video>
  604.  
  605. <p>
  606. Here's the shader code for the cubes example: <a href="https://github.com/endavid/MantisShrimpExamples/blob/main/Genuary2024/29-sdf-raymarching.metal">Genuary 29-sdf-raymarching.metal</a>. You can find other examples I did for the <a href="https://genuary.art">#Genuary</a> challenge in that folder: <a href="https://github.com/endavid/MantisShrimpExamples/tree/main/Genuary2024">endavid/Genuary2024</a>.
  607. </p>
  608.  
  609. <h3>Beyond fragment shaders</h3>
  610.  
  611. <p>
  612. Apart from the fact that the original intent of this app was simply comparing 2 images, it felt strange to allow custom vertex shaders. What would be the point if you can’t change the geometry? I would need a way to upload geometry to Mantis Shrimp. But then, that would be a model viewer, rather than an image diff tool!
  613. </p>
  614.  
  615. <p>
  616. However, being able to play with shaders that do more generic things programmatically is still attractive. That’s why I added support for mesh shaders in version 1.1 of Mantis Shrimp. I will discuss this in the next article, but the basic idea is that you have a mesh shader with no geometry at all as input, and you create your own geometry programmatically in the shader. So you can create 3D graphics procedurally, without necessarily using ray marching and SDF functions in a fragment shader. Here’s an example of some cubes generated in a mesh shader: <a href="https://github.com/endavid/MantisShrimpExamples/blob/main/Genuary2024/10-cubes.metal">Genuary 10-cubes.metal</a>.
  617. </p>
  618.  
  619. <video width="256" height="256" controls autoplay loop>
  620.  <source src="http://mantisshrimp.endavid.com/pix/2024-01-10-Genuary-Hexagonal.mp4" type="video/mp4">
  621.  Your browser does not support the video tag.
  622. </video>
  623.  
  624. <p>
  625. If you use Mantis Shrimp & you like it, please leave me a comment on the App Store. And if you post creations on Twitter or Instagram, use the hashtag <a href="https://twitter.com/hashtag/MantisShrimpApp">#MantisShrimpApp</a> so I can find them 😊
  626. </p>
  627.  
  628. <p>
  629. Happy coding!
  630. </p>
  631. ]]>
  632. </description>
  633. </item>
  634.  
  635. <item>
  636. <link>http://endavid.com/index.php?entry=102</link>
  637. <guid>http://endavid.com/index.php?entry=102</guid>
  638.  
  639. <title>Troubleshooting “Disk not ejected properly” on a LaCie USB-C HDD</title>
  640.  
  641. <pubDate>Tue, 20 Feb 2024 17:38:14 +0000
  642. </pubDate>
  643.  
  644. <description><![CDATA[
  645. <figure>
  646.  <img src="http://endavid.com/pix/screenshots/2024-02-20-Disk-not-ejected-properly.png" alt="Disk not ejected properly on a Mac Mini"/>
  647.  <figcaption>Trying out different cables on a LaCie drive to get rid of ejection errors</figcaption>
  648. </figure>
  649.  
  650. <h3>Endless “Disk not ejected properly” popups</h3>
  651.  
  652. <p>
  653. <a href="https://x.com/endavid/status/1741771652285821367?s=20">On January 1st</a> I woke up my Mac mini M1 from its sleep and found hundreds of popups saying “Disk not ejected properly”. I had left my external USB-C LaCie Rugged HDD 5 TB drive plugged in, and something had gone horribly wrong.
  654. </p>
  655.  
  656. <p>
  657. The first annoying thing is getting rid of the popups. I think there was a popup every 2 minutes. Over 12 hours, that’s 360 popups 😣 Because they are system notifications, they are not grouped into a single app, so you can’t batch-dismiss them. After some searching, I found this magic CLI command to get rid of them all:
  658. </p>
  659.  
  660. <pre>
  661.  killall NotificationCenter
  662. </pre>
  663.  
  664. <p>
  665.  But that didn’t make the problem go away.
  666. </p>
  667.  
  668. <h3>How to reproduce</h3>
  669.  
  670. <p>
  671.  If you search for that message, you will find people with tons of different problems. If you call Apple support, they’ll make you go through several standard procedures, but you will need a test case to verify whether each fix works. In my case, the issue can be 100% reproduced with the following:
  672. </p>
  673. <ol>
  674.  <li> Plug the USB-C HDD to the USB-C port of the Mac (any of the ports)</li>
  675.  <li> Make sure the drive works by reading a file in it.</li>
  676.  <li> Go to the Apple menu and set the Mac to <em>Sleep</em>.</li>
  677.  <li> Wait 4 minutes.</li>
  678.  <li> Make sure the drive turns off (I touch it to make sure it stopped spinning)</li>
  679.  <li> Wake up the Mac. At that point, I get one <em>“Disk not ejected properly”</em> popup.</li>
  680. </ol>
  681.  
  682. <p>
  683.  You will need some patience to do tests, but it’s important to have a test case that you can reproduce every time.
  684. </p>
  685.  
  686. <h3>Things can get worse</h3>
  687.  
  688. <p>
  689.  I ignored the issue because it only happened when the computer went to sleep, so I would disconnect the drive when I finished work. In 2020 I had a similar issue, but at the time I solved it by disabling <em>“Put disks to sleep whenever possible”</em> in the macOS power settings. But that option didn’t help this time.
  690. </p>
  691.  
  692. <p>
  693.  But one day things got worse: I got “Disk not ejected properly” every 2 minutes even while I was working with the disk and the computer was awake. At that point, I called Apple Support (chatted twice, called twice).
  694. </p>
  695.  
  696. <h3>Troubleshooting with Apple Support</h3>
  697.  
  698. <p>
  699.  Unplugging and replugging the drive didn’t fix the continuous disconnections. After running <em>First Aid</em> on the drive and verifying it was fine, the Apple support team told me to start the Mac in <em>safe mode</em>, and then restart again. That fixed the most pressing problem of the drive disconnecting every 2 minutes, but the issue of the drive disconnecting when the computer goes to sleep persisted.
  700. </p>
  701.  
  702. <p>
  703.  The other thing they will tell you to do is to create a <em>test admin user</em> and try from that account. The issue persisted for me.
  704. </p>
  705.  
  706. <p>
  707.  Apple support suspected the cable and the disk, but I wasn’t totally convinced, because why would restarting the Mac fix the most pressing issue, the ejections every 2 minutes? A few weeks after that first call with Apple, I got that same problem again and I fixed it with a restart. Why does unplugging & replugging the drive not fix it, but a computer restart does? I suspected the driver, or my Mac (spoiler: I was wrong).
  708. </p>
  709.  
  710. <p>
  711.  Anyway, let’s remember that the issue wasn’t totally fixed by the restart, because when the Mac went to sleep it still caused the abrupt disconnection of the drive.
  712. </p>
  713.  
  714. <h3>Seagate (lack of) customer support</h3>
  715.  
  716. <p>
  717. Because the Apple engineer suspected the cable that came with my LaCie disk, I tried to contact LaCie. LaCie support is now part of Seagate support. They have a chatbot, and eventually you can chat with a person (or perhaps it wasn’t one?). I pasted all my notes to an agent and I got this: “I understand that you are having issue while connecting to computer. Am I correct?” 😅 Didn’t I give enough detail?
  718. </p>
  719.  
  720. <p>
  721.  After 30 minutes, the only thing I got was a suggestion to run First Aid. I can use Google and ChatGPT, thanks for nothing 😑 They also let me know that they don’t offer support by email to individual users… (In 2017 I had a similar issue with another of their drives, and I did all the interactions by email at the time.)
  722. </p>
  723.  
  724. <h3>Battery of tests</h3>
  725.  
  726. <p>
  727. I’m going to list all the things I did to try to isolate the problem. Remember, I didn’t know yet what caused it.
  728. </p>
  729.  
  730. <ul>
  731.  <li> Connect the HDD to the Mac mini USB-A port with a USB-C to USB-A cable (the one that comes with the PS5 controller). I left the drive plugged in all night and it worked perfectly, so one would think that the drive is fine. </li>
  732.  <li> Connect the HDD with that USB-C to USB-A cable to a Macbook Pro with macOS Catalina. It works perfectly. Unfortunately that Mac doesn’t have a USB-C port so I can’t test the other cable.</li>
  733.  <li> Connect the HDD to the Mac mini USB-C port with a USB-C cable from an Android phone. With that cable, I don’t get alerts saying that it got disconnected, but all of a sudden the files can’t be read. It seems more dangerous, because it appears connected, but actually the files don’t work. I got <em>“The file xxx could not be opened”</em> when I tried to read any file, and I got some corrupt files while I was writing to the drive. So it’s not really working and it’s super dangerous ⚠️ </li>
  734.  <li> Update from macOS Sonoma 14.2.1 to 14.3. It didn’t fix the issue. </li>
  735.  <li> I tried plugging the HDD with the USB-C to USB-A cable into a USB hub connected to the USB-C port, and the drive didn't turn on! I remember it working at some point in the past… Any other device works, though, so perhaps there's not enough power for this drive?</li>
  736. </ul>
  737.  
  738. <p>
  739.  After these tests, I headed to the Apple Store and I did more tests with an engineer at the Genius Bar:
  740. </p>
  741.  
  742. <ul>
  743.  <li> Using the LaCie USB-C cable, we connected the HDD to one of their Macs, a Macbook Air M2 running Sonoma 14.1.2. Doing the sleep test, the issue reproduced. So it seems unrelated to my Mac mini. It’s either the cable, the disk, or the drivers.</li>
  744.  <li> Using a Thunderbolt 4 data cable (£75) the error happened again 😞 So it wasn’t a problem exclusive to my cable.</li>
  745.  <li> Using a 2m Apple 240W thunderbolt charge cable (£29) the error DOES NOT happen 🎊 </li>
  746. </ul>
  747.  
  748. <p>
  749.  But why? The more expensive data cable says it’s 100W, so it seemed to be related to the power. For the time being, I got the thunderbolt charge cable. I left it plugged all night and it was working correctly. But I wanted to know what was going on. And did I really want to use that cable? (It turns out that that cable wasn’t ideal.)
  750. </p>
  751.  
  752. <h3>Handshaking in a 3rd call with Apple</h3>
  753.  
  754. <p>
  755.  I called Apple support again. The engineer explained that when you connect the drive to the computer, there’s a <em>handshake</em> that determines whether to use <em>Thunderbolt</em> or regular <em>USB-C</em>. Even if the port looks the same, Thunderbolt and USB-C are actually different connections. The Mac must think it’s Thunderbolt even if I use a regular USB-C cable, and that’s when it becomes unstable. When the Mac goes to sleep, it tries to keep a record of the connection so it doesn’t have to do another handshake, or establish a new connection again. Then, when it wakes up, it sends data as if the connection were Thunderbolt, when it’s not.
  756. </p>
  757.  
  758. <p>
  759.  That’s the explanation I got, and they suggested I keep using the 240W Thunderbolt charge cable. That doesn’t seem to explain why it didn’t work with the Thunderbolt 4 cable, though.
  760. </p>
  761.  
  762. <p>
  763.  Apple told me to speak to the manufacturer.
  764. </p>
  765.  
  766.  
  767. <h3>Speed tests with Seagate support</h3>
  768.  
  769. <p>
  770.  I explained the whole thing to Seagate customer support again, and I got this reply:
  771. </p>
  772.  
  773. <blockquote>
  774.  “Upon checking the issue is from cable not from drive. According to your statement, you have tried with different PC and different cable. You will get the pop up like disk not ejected properly except this 240W thunderbolt charge cable. Kindly use your drive with 240W thunderbolt charge cable. Unfortunately your drive is supported only with this cable - 240W thunderbolt cable.”
  775. </blockquote>
  776.  
  777. <p>
  778.  I asked why they sell the drive with a cable that doesn’t work, then, but I got no reply.
  779. </p>
  780.  
  781. <p>
  782.  The chat is quite surreal, and sometimes I wonder whether I am speaking to a human. Except that if it were ChatGPT the grammar would be correct, so I do believe they are human. After 1 hour 40 minutes in the chat, I managed to speak to a more helpful person who asked me to do some speed tests.
  783. </p>
  784.  
  785. <p>
  786.  If you are doing this, I recommend using a big file to get consistent results. I zipped my local Movies folder, and that gave me a 1GB ZIP file. To test the copying speed from the Mac mini SSD to the external drive, I used rsync. You can use it like this (“LaCie 5TB” is the name of my HDD):
  787. </p>
  788.  
  789. <pre>
  790.  rsync -ah --progress ~/Movies.zip /Volumes/LaCie\ 5TB
  791. </pre>
  792.  
  793. <p>
  794.  I tried with the Apple cable I got in the Apple store, and with the original LaCie cable:
  795. </p>
  796.  
  797. <ul>
  798.  <li>With 240W thunderbolt charge cable: 36.87 Mbytes/sec</li>
  799.  <li>With USB-C to USB-A adapter cable (for reference): 38.21 Mbytes/sec</li>
  800.  <li>With USB-C cable from LaCie: 123.62 Mbytes/sec</li>
  801. </ul>
  802.  
  803. <p>
  804.  So it’s much slower with the charge cable, as slow as using USB-A 😢 But I couldn’t reliably use the LaCie cable, or the £75 Thunderbolt 4 data transfer cable. Seagate says the drive is compatible with both Thunderbolt 4 and USB-C. The Mac mini M1 port is Thunderbolt 3. Seagate escalated this issue and told me to wait 24 hours.
  805. </p>
  806.  
  807. <p>
  808.  Perhaps the HDD needed more power than it should, and that’s why it wants the 240W cable? Or perhaps the operating system/driver doesn’t know how much power it needs to give to the HDD? Shouldn’t the driver somehow detect this? And the macOS UI certainly could have a better way to close those 360 popups…
  809. </p>
  810.  
  811. <p>
  812.  On the Seagate help page about this issue they say many users have reported it “after updating the macOS”, so I thought perhaps it could be Sonoma-related. Read <a href="https://www.seagate.com/gb/en/support/kb/disk-not-ejected-properly-on-mac/">“Disk not ejected properly on Mac”</a>.
  813. </p>
  814.  
  815. <p>
  816.  I decided to speak again with Apple Support.
  817. </p>
  818.  
  819. <h3>4th call with Apple Support: USB 2.0 & HFS+</h3>
  820.  
  821. <p>
  822.  In this last call with Apple I shared my screen and they helped me troubleshoot and install some things:
  823. </p>
  824.  
  825. <ul>
  826.  <li> We installed the LaCie tool kit to see if they provide firmware updates, but there aren’t any. See <a href="https://www.lacie.com/gb/en/support/downloads/">LaCie Downloads</a> and <a href="https://www.seagate.com/support/downloads/seatools/">Firmware downloads</a>.</li>
  827.  <li> They recommended using the <a href="https://apps.apple.com/us/app/blackmagic-disk-speed-test/id425264550?mt=12">Blackmagic Disk Speed Test app</a> for testing the disk speed. </li>
  828.  <li> They asked me to format the drive again with GUID Partition Map and <em>HFS+ (Mac OS Extended, Journaled)</em>, with the Disk Utility in macOS Sonoma. I was using APFS, but that’s only recommended for SSDs, and there is no advantage to using it on HDDs.</li>
  829. </ul>
  830.  
  831. <p>
  832.  We didn’t manage to fix the problem, but I got faster transfers after the formatting. I was getting about 85 Mbytes/sec with APFS (using the Disk Speed Test app; my rsync test was faster), but something around 130 Mbytes/sec with HFS+. See below.
  833. </p>
  834.  
  835. <figure>
  836.  <img src="http://endavid.com/pix/screenshots/2024-02-20-speed-test.jpg" width="420" alt="Speed tests of my HDD LaCie drive using different cables and file systems"/>
  837.  <figcaption>Speed tests of my HDD LaCie drive using different cables and file systems</figcaption>
  838. </figure>  
  839.  
  840. <p>
  841.  The other thing I learned is that <a href="https://www.apple.com/uk/shop/product/MU2G3ZM/A/240w-usb-c-charge-cable-2m">the 240W charging cable</a> is USB 2.0, not USB 3.0. That’s why it’s slower. In contrast, <a href="https://www.apple.com/uk/shop/product/MU883ZM/A/thunderbolt-4-usb%E2%80%91c-pro-cable-1m">the Thunderbolt 4 data cable</a> supports USB 3.0 up to 10 Gbit/s, and it also supports USB 4.0, which is 40 Gbit/s.
  842. </p>
  843.  
  844. <p>
  845.  So the drive worked when I used USB 2.0, in either the USB-A or the USB-C port, but started failing otherwise. I had to speak to Seagate again.
  846. </p>
  847.  
  848. <h3>Drive Replacement</h3>
  849.  
  850. <p>
  851.  After another support chat session with Seagate, they told me there must be something wrong with the drive. Since it was still under the 2-year warranty, they asked me to send it back to them, to an address in the UK (tracked, about £8).
  852. </p>
  853.  
  854. <p>
  855.  A week later I got a new drive.
  856. </p>
  857.  
  858. <p>
  859.  I did all the tests several times. I also left the computer sleeping for 30 minutes to be really sure. The problem was gone 🥳
  860. </p>
  861.  
  862. <p>
  863.  Just some notes, in case someone uses the same drive:
  864. </p>
  865.  
  866. <ul>
  867.  <li>I ran the LaCie Setup that comes preinstalled in the drive, and the drive got formatted with GUID Partition Map and HFS+, which matches Apple's suggestion 👍</li>
  868.  <li>The Setup app redirects you to <tt>lyvetst.seagate.com/?sn=…</tt> to register the drive, but that URL does not exist. I edited the link to point to <tt>lyveint.seagate.com</tt> instead, and that worked.</li>
  869.  <li>Disk Speed Test says the speed is: Write 141.2 MB/s, Read 135.5 MB/s 👍</li>
  870. </ul>
  871.  
  872. <p>
  873.  Happy drive now 🥳
  874. </p>
  875.  
  876. <h3>Summary</h3>
  877.  
  878. <p>
  879.  Debugging hardware issues is really a pain. Sometimes it’s easier to live with the faulty drive than trying to get to the bottom of it.
  880. </p>
  881.  
  882. <p>
  883.  However, I must say that Apple support was very helpful and that I learned lots of things in the process. I can’t say the same about Seagate, because we still haven’t learned what piece of their hardware caused such a strange behaviour. They sent me a working refurbished drive in the end, but I spent a lot of time and some money to resolve an issue that was their fault to begin with.
  884. </p>
  885.  
  886. <p>
  887.  In any case, I hope this article helps other people struggling with the infamous <em>“Disk not ejected properly”</em> message to troubleshoot the issue and figure out whether it’s a problem caused by the drive itself or not.
  888. </p>
  889.  
  890. <p>
  891.  Thank you very much to the people in Apple support and Seagate support who gave me all the information I’ve mentioned in this article.
  892. </p>
  893.  
  894. <h3>References</h3>
  895.  
  896. <ul>
  897.  <li> How to dismiss many “disk not ejected properly” notifications: <a href="https://apple.stackexchange.com/a/325548/97810">Apple StackExchange</a></li>
  898.  <li> <a href="https://support.apple.com/en-gb/101847">If you see “disk not ejected properly” on your Mac Pro (2023)</a>  (August 21, 2023). (Not Mac mini!). They also say “Apple is aware of this issue and a resolution is planned for a future macOS update.”</li>
  899.  <li> <a href="https://www.seagate.com/gb/en/support/kb/disk-not-ejected-properly-on-mac/">Disk Not Ejected Properly On Mac Seagate help article.</a></li>
  900.  <li> <a href="https://forums.developer.apple.com/forums/thread/83688?answerId=779406022#779406022">Disk Not Ejected Properly thread on Apple developers forums</a> (I posted some of this information there)</li>
  901. </ul>
  902. ]]>
  903. </description>
  904. </item>
  905.  
  906. <item>
  907. <link>http://endavid.com/index.php?entry=101</link>
  908. <guid>http://endavid.com/index.php?entry=101</guid>
  909.  
  910. <title>My retrospective of 2023</title>
  911.  
  912. <pubDate>Sun, 21 Jan 2024 21:47:48 +0000
  913. </pubDate>
  914.  
  915. <description><![CDATA[
  916. <figure>
  917.  <img src="http://endavid.com/pix/screenshots/2024-01-21-Retrospective2023.jpeg" alt="Happy new year & a visual summary of my 2023"/>
  918.  <figcaption>Happy new year 2024 🐲 & a visual summary of my 2023</figcaption>
  919. </figure>
  920.  
  921. <h3>My 2023 as a story</h3>
  922.  
  923. <p>
  924. 2023 has been another year of ups and downs. The worst happened at the end of the year, when I lost my job. I’ve already blogged about it in <a href="http://endavid.com/index.php?entry=100">“8+ years programming for fashion”</a>. The best of 2023 happened in June, when <em>I got married 👨‍❤️‍👨</em>. But many other good things happened as well.
  925. </p>
  926.  
  927. <p>
928. I’ve been quite busy with job interviews this month, and it feels like I’m lagging behind on lots of things. I didn’t finish <a href="https://adventofcode.com">Advent of Code</a> until January, and I’m 12 days behind in <a href="https://genuary.art">Genuary</a>. I guess it’s good to be busy, but I didn’t want January to end without having written a retrospective of 2023. I think it’s always a good exercise to stop and look back, especially to celebrate the achievements. Otherwise it’s easy to get lost in the worries of the present, as if everything were always dark.
  929. </p>
  930.  
  931. <p>
932. Speaking of darkness, I can’t believe that last year we spent New Year in Tel Aviv, and this year there’s a war going on there… It seems the number of ongoing wars just keeps increasing… ☹️ Let’s go back to positive things.
  933. </p>
  934.  
  935. <p>
936. Apart from our wedding, the other highlights of my 2023 were the trip<em>S</em> (plural) to Japan. After almost 5 years, I managed to go to Japan and renew my residence card. I had been quite stressed about that since the beginning of the pandemic, so I’m very happy I was able to sort it out.
  937. </p>
  938.  
  939. <p>
940. On the first trip I went for 3 weeks, in February & March. I took some days off, but I also worked remotely. I woke up at 4am, worked from 6am to 10am, then went out & enjoyed the daylight, and then worked again from 4pm to 10pm. In that evening slot I could have meetings with the team in the UK. I felt a bit tired, but I wasn’t sleepy. It was all so exciting that it felt quite nice to make the most of my day.
  941. </p>
  942.  
  943. <p>
944. One day I randomly came across a poster for a <em>Kabuki play of Final Fantasy X</em>. Kabuki is a traditional all-male theater, so at first I found it quite funny to see the male actors “cosplaying” as some of the female leads. But then I realized this was a unique opportunity, and I was lucky to be there at the time, since the play ran for only a month and a half. So I bought a ticket, took an extra day off, and went to see it. It was very long, from 12pm to 9pm, but it was really worth it: one of the best things I’ve ever seen. I was very moved. I wrote about it in my “otaku” blog — in Spanish, though: <a href="https://www.focotaku.com/2023/03/final-fantasy-x-kabuki.html">Final Fantasy X Kabuki</a>.
  945. </p>
  946.  
  947. <p>
948. I visited Japan again in May. It was a very short trip, just 3 nights, tagging along on my partner’s work trip. But our flight back to London was cancelled, so we stayed 2 extra nights. Perfect for me 😂 I had brought my work laptop, and there was a WeWork office next to the hotel, so I went there one day. I was able to meet some friends I couldn’t see in March. Again, I felt I made the most of my time there. I was very tired after the long flight, but I arrived in the morning and went straight to Akihabara, met a friend who showed me some nice retro places, did some shopping, and then we went back to Yokohama, where we had a really good time in a local sushi restaurant. I uploaded a YouTube video (Spanish again, but with English subtitles) of the conversation we had in that restaurant about my time in the Japanese game industry: <a href="https://youtu.be/-PVlhIXZ_EM?si=FPm67E0q2mK7kxmC">Gossip: game devs in Japan, with Ahgueo & Fibroluminique</a>.
  949. </p>
  950.  
  951. <p>
952. As I said earlier, I got married/civil-partnered in June. Although nothing has changed (we’ve been together for 14 years), it was nice to have some friends & family over to celebrate. It was a small ceremony, but we all had a great time. My brother came with his family, and I spent a few fun days with them in London.
  953. </p>
  954.  
  955. <p>
956. In September we visited Barcelona and had another celebration there with friends and family, in a Catalan <a href="https://en.wikipedia.org/wiki/Masia">masia</a>. Then my sister visited us in November for my birthday, and we saw the early Christmas lights in London & ate lots of delicious Asian food & Japanese desserts. For Xmas we went to Barcelona again and had a lovely time. We usually avoid travelling at Xmas, because flights are always very expensive and crowded, so it was nice to spend <em>Xmas at home</em> after a while.
  957. </p>
  958.  
  959. <p>
  960. Time to rewind and list the achievements.
  961. </p>
  962.  
  963. <h3>Work</h3>
  964.  
  965. <p>
966. Lots of <em>Generative AI</em> this year. I thought we were doing quite well, but stuff happens… Read the summary here: <a href="http://endavid.com/index.php?entry=100">“8+ years programming for fashion”</a>.
  967. </p>
  968.  
  969. <h3>Indie development</h3>
  970.  
  971. <p>
972. I’m still feeling demotivated about game development. I have some games in my backlog that I wanted to build, but it feels a bit pointless if no one is going to play them… Still, I did release some things:
  973. </p>
  974.  
  975. <ul>
976. <li> I released <a href="http://syllabits.endavid.com">Syllabits</a> for the browser, Windows and Mac. It’s on <a href="https://endavidg.itch.io/syllabits">itch.io</a>. Almost every Friday evening we’ve been playing it at work, in Practice mode. I’m glad they liked it in my office! 🥹 </li>
  977. <li>In summer I released maintenance updates for all my iOS games. No new features. </li>
978. <li>In December I released a new macOS app called <a href="http://mantisshrimp.endavid.com">Mantis Shrimp</a>. It was <em>listed 5th</em> in the Top Paid macOS Developer Tools this month, although only briefly! It’s something I was making on the side and using at work to compare images. I’ve been doing the Genuary challenge with it, and it’s fun. See my <a href="https://www.instagram.com/p/C149_GlIh6u/?hl=en">#Genuary1 on Instagram</a>.</li>
  979. </ul>
  980.  
  981. <h3>Blogging</h3>
  982.  
  983. <p>
  984. I didn’t write any technical articles in 2023 😓 Well, I did write one article for my company, but it didn’t get the green light, so it will remain unpublished.
  985. </p>
  986.  
  987. <p>
988. On the Spanish geek blog, the only interesting thing is the article about the new FFX Kabuki play that I mentioned earlier.
  989. </p>
  990.  
  991. <h3>Vlogging</h3>
  992.  
  993. <p>
994. This year I decided to spend more time gaming, and I streamed most of my playthroughs on Twitch. I uploaded some YouTube videos as well, but I had intended to do more technical ones and failed to deliver. My videos are usually in Spanish, but I’ve added English subtitles to some on request.
  995. </p>
  996.  
  997. <ul>
998. <li>I did a couple of rants, on ChatGPT and on WWDC23, in my “serious” channel: <a href="https://www.youtube.com/@endavidg/videos">@endavidg</a>.</li>
  999. <li>I uploaded a couple of solutions to Advent of Code 2022 (but not 2023 yet…) in my channel on Swift programming: <a href="https://www.youtube.com/@algoDeSwift/videos">@algoDeSwift</a>.</li>
1000. <li>I did upload lots of videos to <a href="https://www.youtube.com/@focotaku/videos">@focotaku</a> 🙈 But they were mainly edits of my 2022 play-through of Final Fantasy XIII. I thought they were good for learning, because I was playing in Japanese and I wrote language notes in each video description. </li>
1001. <li>I also uploaded some other types of videos to <a href="https://www.youtube.com/@focotaku/videos">@focotaku</a>, like reviews of Japanese novels and some other game reviews. But my most viewed video is a bit removed from “otaku” culture: it’s <a href="https://youtu.be/fqJqjUqQs8s?si=V34Ml36-pdrfxV4C">a summary I did of the BBC documentary on the scandal involving Johnny Kitagawa</a> (it has English subtitles), the late head of a male talent agency and a top figure in the Japanese music industry. The video has 850 views (perhaps not much, but quite a record for me). I’m glad it’s been informative. I did 3 videos on the topic and they all get quite a few views.</li>
  1002. </ul>
  1003.  
  1004. <h3>Learning</h3>
  1005.  
  1006. <ul>
  1007. <li>More <em>Python</em> & some <em>C++</em> at work</li>
  1008. <li>I did all the Advent of Code 2023 in <em>Swift</em>, and I did one of the problems in <em>Godot 4</em> as well, just for the fun of visualizing it: <a href="https://www.instagram.com/p/C1-Mf5II4rs/">AoC 2023 Day 22 on Instagram</a>.</li>
  1009. <li>I practiced some <em>Metal</em> again to create <a href="http://mantisshrimp.endavid.com">Mantis Shrimp</a>.</li>
  1010. </ul>
  1011.  
  1012. <h3>Leisure</h3>
  1013.  
  1014. <p>
1015. I already mentioned several trips in the first section. I also wanted to catch up on some JRPG titles, so I played & cleared these games:
  1016. </p>
  1017.  
  1018. <ul>
1019. <li><em>Nier Replicant</em> — I enjoyed it very much. The soundtrack is amazing. There are some concerts in Europe in February, including Barcelona, but I failed to get tickets… 😢</li>
  1020. <li><em>Nier Automata</em> — amazing as well. </li>
  1021. <li><em>Final Fantasy XVI</em> — Epic! It was a present from a friend, and then I also got a PS5 as a present! 🤩 I’m actually replaying it because I enjoyed it very much.</li>
  1022. </ul>
  1023.  
  1024. <p>
1025. I wasn’t planning to read any novels, but I ended up reading 3 novels in Japanese by the same author, <em>Keigo Higashino</em>. My partner has lots of his novels at home and I got curious. I was surprised that I could read them relatively easily. When I tried reading novels in Japanese in the past, it was usually hard; but those difficult novels were SF or JRPG-related, so the vocabulary may have been unusual… I read yet another novel in Japanese, “Egoist”, an LGBT story 🏳️‍🌈, and started one about NieR, so that’s my personal record for reading Japanese novels so far! I also read a novel in English called <em>“Tomorrow and Tomorrow and Tomorrow”</em>, about game development. It was really moving. See my <a href="http://endavid.com/lists/readings08.html">list of readings</a>.
  1026. </p>
  1027.  
  1028.  
  1029. <h3>Wishes for 2024</h3>
  1030.  
  1031. <p>
  1032. Again, let’s hope 2024 brings <em>peace ☮️🕊</em>
  1033. </p>
  1034.  
  1035. <p>
  1036. I haven’t written down my personal goals yet, but I hope I can write some technical blog posts this year. I have one planned about Metal and Mesh shaders, related to the development of <a href="http://mantisshrimp.endavid.com">Mantis Shrimp</a>.
  1037. </p>
  1038.  
  1039. <p>
  1040. I also want to upload some videos about Swift development.
  1041. </p>
  1042.  
  1043. <p>
1044. I started the year by joining a badminton group 🏸, so hopefully I’ll do more exercise this year!
  1045. </p>
  1046.  
  1047. <p>
  1048. And I hope to visit Japan again some time this year.
  1049. </p>
  1050.  
  1051. <p>
1052. Best wishes to everyone. My dragon-themed new year greetings card is at the top of this article (I was also lagging behind on my new year greetings… I only started drawing the card in January…)
  1053. </p>
  1054.  
  1055. <p>
  1056. Again, <em>Happy New Year 2024 🐲</em>
  1057. </p>
  1058. ]]>
  1059. </description>
  1060. </item>
  1061.  
  1062. <item>
  1063. <link>http://endavid.com/index.php?entry=100</link>
  1064. <guid>http://endavid.com/index.php?entry=100</guid>
  1065.  
  1066. <title>8+ years programming for fashion</title>
  1067.  
  1068. <pubDate>Mon, 01 Jan 2024 12:37:55 +0000
  1069. </pubDate>
  1070.  
  1071. <description><![CDATA[
  1072. <figure>
  1073.  <img src="http://endavid.com/pix/screenshots/2024-01-01-David-Metail-summary.jpg" alt="A visual summary of some of the stuff I've worked on at Metail"/>
  1074.  <figcaption>A visual summary of some of the stuff I've worked on at Metail</figcaption>
  1075. </figure>
  1076.  
  1077. <h3>A retrospective</h3>
  1078.  
  1079. <p>
1080. My time at Metail has come to an end, and I thought it was a good moment to write a longer retrospective than usual. It’s been an interesting ride, so I think it’s worth looking back and reviewing the things I learned.
  1081. </p>
  1082.  
  1083. <p>
1084. Eight and a half years is the longest I’ve ever stayed at a company. The main reason for staying this long was the incredibly welcoming atmosphere and the friendly people who worked there, but there were also very interesting technical challenges that aligned well with my background. Let me give you a bit more detail.
  1085. </p>
  1086.  
  1087. <h3>How I ended up here</h3>
  1088.  
  1089. <p>
1090. I’m a graphics programmer, and before Metail I had always worked in the games industry. The games industry is full of amazing, very passionate people. I suppose passion can sometimes turn into angry faces and needless shouting. The year I left the games industry I wasn’t going through the best of times: my mum was fighting cancer, and then she had a stroke, so it was hard to cope with the stress of work. My counsellor suggested that a change might be good.
  1091. </p>
  1092.  
  1093. <p>
1094. I got a random phone call from a recruiter and they talked about Metail. I wasn’t very interested in fashion at the time, but the technology they described sounded quite intriguing. They needed someone with knowledge of Computer Graphics, but also a good understanding of Computer Vision and Image Processing. Although professionally I had mostly worked on graphics and optimization, my PhD was on Computer Vision and Image Processing, so this sounded like a nice combo! I passed the interviews and got in.
  1095. </p>
  1096.  
  1097. <h3>The MeModel era</h3>
  1098.  
  1099. <p>
1100. The main product when I joined Metail was called the MeModel (see figure on top). It was a virtual try-on system where users would enter their measurements to generate a virtual avatar, and then they could try clothes on it. It was a web application that retailers could install on their websites. The technology was a mixture of 2D (photographs of clothes and faces) and 3D (for the body shapes). The garment physics were done in 2D.
  1101. </p>
  1102.  
  1103. <p>
1104. The technology I was maintaining was a server-side renderer written in C# and C++ on top of DirectX 10. After getting familiar with the pipeline and the asset publishing, I started by optimizing performance, removing redundant textures and unnecessary processing. Graphics code sometimes becomes spaghetti 🍝, but a simple PIX GPU frame capture can reveal very interesting things quite easily.
  1105. </p>
  1106.  
  1107. <p>
  1108. I also worked on improving the visuals. I introduced a new avatar with more joints and I contacted an ex-colleague to help us author more poses. I changed the skin shaders, and I wrote a WebGL tool to help us tweak the skin to match the photographic heads (see <a href="https://medium.com/real-time-rendering/skin-colour-authoring-using-webgl-101a05865bdd">Skin colour authoring using WebGL</a>).
  1109. </p>
  1110.  
  1111. <p>
1112. I also did some server-side work. Because I had some previous experience with NodeJS, I suggested building a small NodeJS server for scaling and monitoring. It sat on top of AWS services, but it let us implement more complex logic suited to our renderer. The new bottleneck was the start-up time of the renderer service: it took several minutes to boot. I studied some old spaghetti code, wrote up the maths behind it, and then rewrote the whole thing with simpler matrix multiplications. I also turned most of the asset loading into lazy initializations, for a final start-up time under 2 seconds.
  1113. </p>
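
<p>
To illustrate that last idea, here is a minimal sketch of lazy initialization. I’m writing it in Python for brevity (the actual renderer was C# and C++), and the class and its loading details are hypothetical: the point is simply that each asset is loaded on first use and then cached, instead of everything being loaded up front at boot.
</p>

<pre>
class AssetCache:
    """Load assets on first use instead of at start-up."""

    def __init__(self, asset_paths):
        # Only remember where the assets live; load nothing yet.
        self._paths = dict(asset_paths)
        self._cache = {}

    def get(self, name):
        # The first request pays the loading cost; later ones hit the cache.
        if name not in self._cache:
            self._cache[name] = self._load(self._paths[name])
        return self._cache[name]

    @staticmethod
    def _load(path):
        # Stand-in for real texture or mesh decoding.
        with open(path, "rb") as f:
            return f.read()
</pre>

<p>
With this pattern the service can start accepting requests almost immediately, and the cost of loading each asset is only paid if and when a render actually needs it.
</p>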
  1114.  
  1115. <p>
1116. I built several internal visualization tools, as well as tools that helped the outsourcing teams see what they were creating, cutting iteration times from days to hours. I also became Engineering Manager of a team of 7 and mentored other developers. I did lots of interesting things.
  1117. </p>
  1118.  
  1119. <p>
  1120. Unfortunately, the MeModel didn’t quite take off and the company struggled financially until we were acquired by one of our investors.
  1121. </p>
  1122.  
  1123. <h3>The EcoShot era</h3>
  1124.  
  1125. <figure>
  1126.  <img src="http://endavid.com/pix/screenshots/2024-01-01-David-EcoShot.jpg" alt="A scan of myself, my scanatar superimposed on a photograph, and a couple of EcoShot renders"/>
  1127.  <figcaption>From left to right: a scan of myself, my scanatar superimposed on a photograph, and a couple of EcoShot renders</figcaption>
  1128. </figure>
  1129.  
  1130. <p>
  1131. When we shut down the MeModel service I was working on an idea from our CTO. He thought that in order to strive for realism, we needed to do the garment simulation in 3D. We were experimenting with some 3D CAD cloth authoring software at the time, and I thought it would be relatively simple to reuse all the technology we had to create something for that software.
  1132. </p>
  1133.  
  1134. <p>
1135. Unfortunately, all the client-side developers had to go, so I had to build everything on my own. But the CAD software let you write plugins in Python, so it was quick to get started. I like C++, but Python let us build things faster in this scenario.
  1136. </p>
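
<p>
To give a rough idea of why Python made this scenario fast: most of the plugin work was glue code. Here is a hypothetical sketch, using only the Python standard library, of the kind of client call a plugin might make to send a garment to a render service. The URL and the payload fields are made up for illustration; the real plugin used the CAD software’s own API and a different protocol.
</p>

<pre>
import json
import urllib.request

def request_render(garment_path, avatar_id,
                   service_url="https://example.com/render"):
    # Read the exported garment and wrap it in a JSON payload.
    with open(garment_path, "rb") as f:
        payload = {
            "avatar": avatar_id,
            "garment": f.read().hex(),  # toy encoding for the sketch
        }
    req = urllib.request.Request(
        service_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    # Return the service response, e.g. a URL to the rendered image.
    with urllib.request.urlopen(req) as response:
        return json.load(response)
</pre>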
  1137.  
  1138. <p>
1139. I started by getting a body scan of myself and using our software to automatically rig it, add some poses, and import it into the CAD software. That's what we call a <em>“scanatar”</em>, i.e. an avatar created from a scan. When I saw the draping of a single garment in different sizes on an accurate model of my body, I thought this would be a game changer.
  1140. </p>
  1141.  
  1142. <p>
1143. I built a beta of the software in a couple of months, all self-contained (there was no service at the time). After the beta, I worked with the network architect to build one. I wrote a renderer that used V-Ray to render the garments with raytracing. For the 2D composition I mainly used ImageMagick, plus some OpenCV scripts written by our R&D team.
  1144. </p>
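
<p>
For illustration, the core of that kind of 2D composition is alpha-blending a raytraced render over a photograph. Here is a minimal sketch with OpenCV and NumPy (the file names are hypothetical, both images are assumed to have the same dimensions, and the real pipeline used ImageMagick commands and more elaborate scripts):
</p>

<pre>
import cv2
import numpy as np

# A rendered garment with an alpha channel (BGRA), and a background photo (BGR).
render = cv2.imread("garment_render.png", cv2.IMREAD_UNCHANGED)
photo = cv2.imread("avatar_photo.jpg")

# Normalize alpha to [0, 1] and blend: out = a * fg + (1 - a) * bg.
alpha = render[:, :, 3:4].astype(np.float32) / 255.0
foreground = render[:, :, :3].astype(np.float32)
background = photo.astype(np.float32)
composite = alpha * foreground + (1.0 - alpha) * background

cv2.imwrite("composite.jpg", composite.astype(np.uint8))
</pre>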
  1145.  
  1146. <p>
1147. Apart from EcoShot, we worked on other projects, such as the European <a href="https://etryon-h2020.eu">eTryOn project</a> (see my <a href="https://www.youtube.com/live/yNpV6w7PYZw?si=5_2SiDuT2Lkas-rk&t=7200">XR4Fashion talk from 2:00:00, From 3D designs to Lens Studio: Challenges in faithful garment representation</a>), and some AR collaborations with Snap (see me wearing a virtual Puma tracksuit in the figure on top, using one of my Snapchat filters). So I also got to touch some game engines, like Unity and Lumberyard, as well as Lens Studio (see some mentions in <a href="https://medium.com/real-time-rendering/reasons-for-a-solo-dev-to-love-godot-engine-80447b2854a0">Reasons for a solo dev to love Godot Engine</a>).
  1148. </p>
  1149.  
  1150. <p>
  1151. 2023 has been an interesting year as well with the boom of Generative AI (GenAI for short). I worked on releasing new features and new GenAI avatars for the EcoShot plugin at a very fast pace. Many customers were impressed by the results, and we've been getting requests for new imagery.
  1152. </p>
  1153.  
  1154. <h3>The End & The Future</h3>
  1155.  
  1156. <p>
  1157. Unfortunately, we ran out of time. EcoShot will continue to exist in the hands of Tronog (see the announcement: <a href="https://metail.com/blog/metail-tronog-partnership/">Metail and Tronog enter into a strategic partnership to transfer EcoShot and make AI-generated fashion accessible to all</a>). By the way, can you tell which models are real and which are GenAI?
  1158. </p>
  1159.  
  1160. <figure>
  1161.  <img src="http://endavid.com/pix/screenshots/2024-01-01-GenAI-EcoShot.jpg" alt="Image from Metail website showing some EcoShot & GenAI models"/>
  1162.  <figcaption>Image from Metail website showing some EcoShot & GenAI models</figcaption>
  1163. </figure>
  1164.  
  1165. <p>
1166. There are still many exciting things to come for EcoShot in 2024, but I will be moving on. At the time of writing, I don’t know where to yet, though. It seems it’s still early for many apparel companies to adopt 3D, so I may not work in fashion again. Who knows.
  1167. </p>
  1168.  
  1169. <p>
1170. I was attracted to the idea of doing something good for the planet. The fashion industry, especially fast fashion, is a machine for creating waste. Creating virtual samples before garments are manufactured should help reduce some of that waste, and showing customers how a garment fits different body shapes should help reduce returns. But adoption of these technologies is still slow. I hope GenAI will revolutionize that.
  1171. </p>
  1172.  
  1173. <p>
1174. While I look for my next adventure, I will be working on some side projects. I recently released an image diff app called <a href="http://mantisshrimp.endavid.com">Mantis Shrimp</a> 🦐. I borrowed the name from a Javascript web tool made by my team lead when I joined Metail. He loves mantis shrimps because of the 16 or so types of photoreceptor cells in their eyes. I thought it was a nice way of coming full circle.
  1175. </p>
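
<p>
As a taste of what an image diff does, here is a minimal sketch in Python with OpenCV: compute the per-pixel absolute difference of two same-sized images and build a mask of the pixels that changed. This is just the basic idea, not how Mantis Shrimp is implemented (the file names and the threshold value are hypothetical):
</p>

<pre>
import cv2

# Two same-sized images to compare (file names are placeholders).
image_a = cv2.imread("before.png")
image_b = cv2.imread("after.png")

# Per-pixel absolute difference, then a binary mask of "changed" pixels.
diff = cv2.absdiff(image_a, image_b)
gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
_, mask = cv2.threshold(gray, 10, 255, cv2.THRESH_BINARY)

cv2.imwrite("diff.png", diff)
cv2.imwrite("changed_pixels.png", mask)
</pre>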
  1176.  
  1177. <p>
  1178. So long and thanks for all the fish 🐋🌈
  1179. </p>
  1180.  
  1181. <p>
  1182. Happy New Year 2024 🐲
  1183. </p>
  1184. ]]>
  1185. </description>
  1186. </item>
  1187.  
  1188. </channel>
  1189. </rss>
  1190.  
