<?xml version="1.0" encoding="utf-8"?><feed xmlns="http://www.w3.org/2005/Atom" ><generator uri="https://jekyllrb.com/" version="4.3.1">Jekyll</generator><link href="https://alia-traces.github.io/feed.xml" rel="self" type="application/atom+xml" /><link href="https://alia-traces.github.io/" rel="alternate" type="text/html" /><updated>2024-08-18T12:08:34+01:00</updated><id>https://alia-traces.github.io/feed.xml</id><title type="html">Catnip codes</title><subtitle>Blessedly cursed code: realtime rendering, path tracing, demoscene, livecoding. Maybe other stuff, who knows.</subtitle><entry><title type="html">Green Shoots (Executable Graphics)</title><link href="https://alia-traces.github.io/demoscene/metal/pathtracing/2021/12/31/green-shoots.html" rel="alternate" type="text/html" title="Green Shoots (Executable Graphics)" /><published>2021-12-31T11:46:20+00:00</published><updated>2021-12-31T11:46:20+00:00</updated><id>https://alia-traces.github.io/demoscene/metal/pathtracing/2021/12/31/green-shoots</id><content type="html" xml:base="https://alia-traces.github.io/demoscene/metal/pathtracing/2021/12/31/green-shoots.html"><![CDATA[<p><img src="/images/green_shoots/Alia_-_Green_shoots_screenshot.jpg" alt="Screenshot" /></p>
<p>Released at Hogmanay 2021, Executable Graphics (a single compute shader generates this image, with no external data). Released as an image only, since it takes multiple hours to render.
<!--more--></p>
<h2 id="background">Background</h2>
<p>New Year’s Eve was approaching, and with the Omicron covid variant on the rise many of us didn’t want to risk partying. Luckily reality404, Aldroid and Crypt organised an online streaming event called Silvester. This was an online demoparty with competitions, so I decided to make something, and as I didn’t have much time executable graphics made sense.</p>
<p>With the backdrop of 2 years of pandemic and lockdowns, having not seen friends in far too long and spending another New Year’s at home, I wanted to do something with a positive message. And with vaccination it was looking hopeful that maybe in 2022 things might start getting back to normal… could this be the first green shoots of recovery?</p>
<h2 id="brute-force-rendering-of-micro-geometry">Brute force rendering of micro-geometry</h2>
<p>I’ve been interested in brute force rendering complex materials for a while - <a href="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/06/20/cloth.html">see ‘Cloth’ for example - brute force rendering of cloth at fibre level</a>. I’d also been thinking about ‘microfacet’ rendering, which is where you simulate the way light bounces around in the microscopic lumps and cracks that exist in many materials.</p>
<p>Normally microfacets are approximated, because they’re too small to see. But what if…</p>
<p>So I built a quick test setup using SDF geometry and raymarching (plus path tracing) that fills a volume with packed reflective spheres:</p>
<p><img src="/images/green_shoots/Microfacets.png" alt="Microfacets from real geometry" /></p>
<p>It works! Despite there only being reflective surfaces, once the spheres are too small to see the result is effectively a matte material.</p>
<p>Comparing to a standard matte material, the differences are quite obvious:</p>
<p><img src="/images/green_shoots/Microfacets_vs_matte.jpg" alt="Microfacets compared to matte material" /></p>
<p>It’s darker because as the light bounces around inside the ‘cracks’ more energy is absorbed. The colour is richer too, because each bounce reflects orange light and absorbs other colours, making it more orange. And there’s a fresnel effect - rays that glance the surface are more likely to be reflected out and rays that hit head-on are more likely to fall into a crack.</p>
<h2 id="rendering-snow">Rendering snow</h2>
<p>I’m once again using my own path tracer for this, and the scene is procedurally generated with signed distance functions.</p>
<h3 id="snow-volume">Snow volume</h3>
<p>The overall shape of the snow is just a plane with some sine waves added to give it ripples. I wanted the plants to look like they’d melted the snow around them, so I then removed 3 cylinders with a smooth subtractive blend - this gives a nice smooth curve. To get the ‘ridges’ between the plants, I just took the minimum distance to the 3 cylinders (giving a hard cut-off between them) then used the smooth subtract on that to give the curves into the snow too.</p>
<p><img src="/images/green_shoots/Snow_shape.jpeg" alt="The shape of the 'snow volume'" /></p>
<h3 id="snow-flakes">Snow flakes</h3>
<p>I can’t find a screenshot, but each one is a 3D hexagonal snowflake. I didn’t bother to randomise the shape of every flake, but it’s easily doable.</p>
<h2 id="plant-shoots">Plant shoots</h2>
<h3 id="geometry">Geometry</h3>
<p>The shoots are procedural. I looked at some photos of the kind of plant I had in mind, and the leaves grow in rings out from the center.</p>
<p>So I started out with 3 concentric tubes (giving 3 leaves), plus a thin outer “sheath” tube with a bit of sine wave added to the top:</p>
<p><img src="/images/green_shoots/shoot1.jpeg" alt="The initial stage of plant generation, just green tubes" /></p>
<p>Then I removed part of the circle for each tube:</p>
<p><img src="/images/green_shoots/shoot2.jpeg" alt="The tubes are now only part tubes, as if they had been sliced in half" /></p>
<p>Finally I sheared them (by offsetting on the Y axis) to form the leaf shape:</p>
<p><img src="/images/green_shoots/shoots_final.jpeg" alt="The last stage, the tubes are sheared to form a leaf shape" /></p>
<h3 id="material">Material</h3>
<p>The base material is just matte green, slightly lighter for the sheath:</p>
<p><img src="/images/green_shoots/shoots_material1.jpeg" alt="The base matte green material, slightly lighter on the sheath" /></p>
<p>Stripes are added, these are lighter and slightly shinier:</p>
<p><img src="/images/green_shoots/shoots_material2.jpeg" alt="Lighter, slightly shinier stripes added to the leaves" /></p>
<p>Finally the tops are tinted yellow. This is done by blending into a yellow material based on the normal.</p>
<p><img src="/images/green_shoots/shoots_final.jpeg" alt="The last stage, the tubes are sheared to form a leaf shape" /></p>]]></content><author><name></name></author><category term="demoscene" /><category term="metal" /><category term="pathtracing" /><summary type="html"><![CDATA[Released at Hogmanay 2021, Executable Graphics (a single compute shader generates this image, with no external data). Released as an image only, since it takes multiple hours to render.]]></summary></entry><entry><title type="html">Maskless, capeless, free (4KB Executable Graphics)</title><link href="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/12/31/maskless-capeless-free.html" rel="alternate" type="text/html" title="Maskless, capeless, free (4KB Executable Graphics)" /><published>2020-12-31T11:46:20+00:00</published><updated>2020-12-31T11:46:20+00:00</updated><id>https://alia-traces.github.io/demoscene/metal/pathtracing/2020/12/31/maskless-capeless-free</id><content type="html" xml:base="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/12/31/maskless-capeless-free.html"><![CDATA[<p><img src="/images/maskless_capeless_free/maskless_capeless_free_screenshot.jpg" alt="Screenshot" /></p>
<p>Released at Hogmanay 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data).</p>
<p><a href="https://demozoo.org/graphics/288426/">Download on Demozoo</a>
<!--more--></p>
<h3 id="a-sequel-to-kept-in-a-box">A sequel to ‘<a href="/demoscene/metal/pathtracing/2020/12/31/kept-in-a-box.html">Kept in a Box</a>’</h3>
<p>‘Kept in a Box’ felt important to make, but it also felt very dark, and maybe not an ideal piece to release at a New Year’s Eve party. So I wanted to do a second piece that was lighter, and decided to make a sequel.</p>
<p>Where Kept in a Box was about staying in the closet voluntarily for the protection of others, this is about finding a way out of that situation.</p>
<p>The hero or heroine is no longer needed, drops their mask and cloak, takes the key and walks out through the open door into the light of freedom. The mask and cape represent both the heroic nature of the imprisonment, and the disguise they no longer need now they can show the world their true self.</p>
<h3 id="background">Background</h3>
<p>The scene setup is the same as Kept in a Box - a light outside the scene, low this time to cast a beam of light through the door and with a bright sky to provide more ambient light. There’s no room again, just one wall with a rectangle removed to form the doorway and a box to form the door.</p>
<p><img src="/images/maskless_capeless_free/door.jpg" alt="Screenshot" /></p>
<p>You can also see the floor ‘texture’ here - I’m setting the material roughness randomly. It’s pixellated, but that doesn’t matter because it’s covered.</p>
<h2 id="cloth">Cloth</h2>
<p>The cloth is an SDF. The material is matte red, but with a slight reflective fresnel layer to give it a little sheen. It’s a flat disk:</p>
<p><img src="/images/maskless_capeless_free/cloth1.jpg" alt="Screenshot" /></p>
<p>I then summed a few sine waves and added them to the height. It’s cheap and nasty, and looks like a wave pool here, but it reads as rumpled cloth in the end 🤷‍♀️</p>
<p><img src="/images/maskless_capeless_free/cloth2.jpg" alt="Screenshot" /></p>
<p>For fine detail I just added another couple of sine waves at right angles to make it look a bit like a cloth texture.</p>
<p><img src="/images/maskless_capeless_free/cloth3.jpg" alt="Screenshot" /></p>
<p>The last step is to add some distortion parallel to the floor so it doesn’t look obviously round.</p>
<p><img src="/images/maskless_capeless_free/cloth4.jpg" alt="Screenshot" /></p>
<h2 id="mask">Mask</h2>
<p>The mask is just part of a sphere, with a few boxes blended in to give it a face shape and some holes punched through for the mouth and eyes. I gave it a bit of thickness to make it look solid.</p>
<p><img src="/images/maskless_capeless_free/mask.gif" alt="Screenshot" /></p>
<p>The material is a roughened gold. It’s the same technique as the floor - I just use the surface position to generate a random roughness value.</p>
<h2 id="post">Post</h2>
<p>The post setup is similar to last time. Raw output:</p>
<p><img src="/images/maskless_capeless_free/post1.jpg" alt="Screenshot" /></p>
<p>Increased contrast:</p>
<p><img src="/images/maskless_capeless_free/post2.jpg" alt="Screenshot" /></p>
<p>This time I wanted warm, rich colours so I increased both saturation and warmth:</p>
<p><img src="/images/maskless_capeless_free/post3.jpg" alt="Screenshot" /></p>
<p>Finally depth of field and vignette:</p>
<p><img src="/images/maskless_capeless_free/post4.jpg" alt="Screenshot" /></p>]]></content><author><name></name></author><category term="demoscene" /><category term="metal" /><category term="pathtracing" /><summary type="html"><![CDATA[Released at Hogmanay 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data). Download on Demozoo]]></summary></entry><entry><title type="html">Kept in a box (4KB Executable Graphics)</title><link href="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/12/31/kept-in-a-box.html" rel="alternate" type="text/html" title="Kept in a box (4KB Executable Graphics)" /><published>2020-12-31T10:46:20+00:00</published><updated>2020-12-31T10:46:20+00:00</updated><id>https://alia-traces.github.io/demoscene/metal/pathtracing/2020/12/31/kept-in-a-box</id><content type="html" xml:base="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/12/31/kept-in-a-box.html"><![CDATA[<p><img src="/images/kept_in_a_box/kept_in_a_box_screenshot.jpg" alt="Screenshot" /></p>
<p>Released at Hogmanay 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data).</p>
<p><a href="https://demozoo.org/graphics/288426/">Download on Demozoo</a>
<!--more--></p>
<h3 id="meaning">Meaning</h3>
<p>This was inspired by a particular group of people: transgender women who either accept or figure out their true identity later in life, after settling and starting a family, and choose to stay in the closet either to protect that family or to keep it together. They know how to leave their prison, they hold the key, but they choose to repress.</p>
<p>Identity needs an outlet. Unable to live as themselves offline, social media provides this outlet for many - a space to present as themselves, find others, and support each other.</p>
<p>However, I don’t intend this piece to be about this group in particular and avoided visual hints about who the characters may be. There are many others in this situation, for many reasons. This is dedicated to you all.</p>
<h3 id="lighting">Lighting</h3>
<p>Since the focus of the piece is about support as well as being in a dark place, I wanted to represent that and chiaroscuro (strongly contrasting light and dark) felt perfect.</p>
<p>I wanted the scene to be very dark, with a beam of light in the centre falling on the key and the hands, with everything else lit indirectly. Since the subject is a form of imprisonment, I went with light shining through a high, barred window.</p>
<p>To do this, I put a light far away from the scene:</p>
<p><img src="/images/kept_in_a_box/lighting1.jpg" alt="Screenshot" /></p>
<p>Then, I added a plane between the key and the light. I used a simple 2d distance function to cut a few rectangles out to let the light through.</p>
<p><img src="/images/kept_in_a_box/lighting2.jpg" alt="Screenshot" /></p>
<p>There isn’t actually a room at all - just the floor and one wall.</p>
<p>I use biased rendering here, meaning I shoot rays out of the camera into the scene, and each time the ray intersects a surface, I cast a ray towards the light rather than just hoping the ray will randomly hit it.</p>
<p>Since the light is far away and mostly blocked by a wall, rendering time would be extremely long otherwise.</p>
<h3 id="modeling">Modeling</h3>
<p>There are 3 models here, all made with SDFs (signed distance functions). A key, one human, and a pair of hands.</p>
<h2 id="key">Key</h2>
<p>It’s basic, but it’s small and far from the camera, no more is needed.</p>
<p><img src="/images/kept_in_a_box/key.jpg" alt="Screenshot" /></p>
<h2 id="human">Human</h2>
<p>This is a bit more complex: apart from the head shape (a distorted sphere), everything is made from blended boxes. Here’s how it’s built:</p>
<p><img src="/images/kept_in_a_box/character.gif" alt="Screenshot" /></p>
<p>Not everything is accurate, but it only needed to look right from one angle (or, technically two) in a badly lit room 🤷♀️</p>
<h2 id="no-room-for-2">No room for 2</h2>
<p>With the tight size limits of 4KB and 30s rendering time, I didn’t want the extra code and slower rendering that 2 characters would require, so I simply mirrored and flipped one:</p>
<p><img src="/images/kept_in_a_box/mirroring.gif" alt="Screenshot" /></p>
<h2 id="hands">Hands</h2>
<p>The characters are holding hands, representing support for each other. But doing this with a single, mirrored character is both complex (since the mirror would have to somehow be between the hands) and limits the hand positions (I don’t want them shaking hands).</p>
<p>So, I decided to model the hands separately:</p>
<p><img src="/images/kept_in_a_box/hands.gif" alt="Screenshot" /></p>
<p>Again, they’re built from blended boxes, and the final step is to blend the hands into the character models.</p>
<p><img src="/images/kept_in_a_box/hands_and_arms.jpg" alt="Screenshot" /></p>
<p>I’m not super happy with the modelling there, but it’s difficult and looks ok in the final image I think.</p>
<h2 id="post">Post</h2>
<p>The raw pathtraced output looks like this (live preview quality, the final output is much less noisy):</p>
<p><img src="/images/kept_in_a_box/post1.jpg" alt="Screenshot" /></p>
<p>First I reduce brightness:</p>
<p><img src="/images/kept_in_a_box/post2.jpg" alt="Screenshot" /></p>
<p>Then increase contrast:</p>
<p><img src="/images/kept_in_a_box/post3.jpg" alt="Screenshot" /></p>
<p>I wanted more depth to the shadows, so I increased the ‘contrast bias’. This lets me adjust the midpoint about which contrast is changed, the code is like this:</p>
<p><code class="language-plaintext highlighter-rouge">luma = (luma - contrastBias) * contrast + contrastBias;</code></p>
<p><img src="/images/kept_in_a_box/post4.jpg" alt="Screenshot" /></p>
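<p>Wrapped as a function, the one-liner above behaves like this (same formula, just named):</p>

```c
/* Contrast around an adjustable midpoint: luma values at contrastBias
   are left unchanged; values either side are pushed away from it. */
float apply_contrast(float luma, float contrast, float contrastBias) {
    return (luma - contrastBias) * contrast + contrastBias;
}
```

<p>Lowering the bias below mid-grey deepens the shadows while leaving the highlights mostly alone.</p>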
<p>Next is colour. I wanted richer colours, but without oversaturating. The first step is actually to desaturate:</p>
<p><img src="/images/kept_in_a_box/post5.jpg" alt="Screenshot" /></p>
<p>Then I increase ‘warmth’ by applying a power curve to the chroma:</p>
<p><img src="/images/kept_in_a_box/post6.jpg" alt="Screenshot" /></p>
<p>Just a touch of depth of field…</p>
<p><img src="/images/kept_in_a_box/post7.jpg" alt="Screenshot" /></p>
<p>Finally, I added quite a hard vignette to cut the light beam and give the image better composition:</p>
<p><img src="/images/kept_in_a_box/post8.jpg" alt="Screenshot" /></p>]]></content><author><name></name></author><category term="demoscene" /><category term="metal" /><category term="pathtracing" /><summary type="html"><![CDATA[Released at Hogmanay 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data). Download on Demozoo]]></summary></entry><entry><title type="html">Backlight (4KB Executable Graphics)</title><link href="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/10/03/backlight.html" rel="alternate" type="text/html" title="Backlight (4KB Executable Graphics)" /><published>2020-10-03T11:46:20+01:00</published><updated>2020-10-03T11:46:20+01:00</updated><id>https://alia-traces.github.io/demoscene/metal/pathtracing/2020/10/03/backlight</id><content type="html" xml:base="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/10/03/backlight.html"><![CDATA[<p><img src="/images/backlight/Screenshot.jpg" alt="Screenshot" /></p>
<p>Released at Inercia 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data).</p>
<p><a href="https://demozoo.org/graphics/284904/">Download on Demozoo</a>
<!--more--></p>
<h3 id="when-you-forget-which-day-the-dealine-is">When you forget which day the deadline is…</h3>
<p>This isn’t what I started out making. I started something entirely different, but it’s been a busy month and I had little time to work on it. Then somebody mentioned the deadline… on the day of the deadline 😬</p>
<p>There wasn’t enough time to finish it, so I decided to re-work an older piece. The Church of the Spinning Cube logo felt like a good place to start, so I played around with the lighting and added some glass.</p>
<h3 id="the-one-slightly-interesting-bit-of-new-tech">The one slightly interesting bit of new ‘tech’</h3>
<p>Well, more of a composition aid, but new to me. I’d set my scene up with nice lighting and interesting caustics, but when I positioned the camera I had this:</p>
<p><img src="/images/backlight/bl-1.jpg" alt="Screenshot" /></p>
<p>It’s ok, but the marbles are blocking the view of the logo. Obvious solution: move the camera up:</p>
<p><img src="/images/backlight/bl-2.jpg" alt="Screenshot" /></p>
<p>Now we have a good view of the logo, but also a lot of the wall behind it because the camera’s too high, and the nice caustics are off screen. Next obvious fix: angle the camera down:</p>
<p><img src="/images/backlight/bl-3.jpg" alt="Screenshot" /></p>
<p>The view is fine, but now the logo has perspective, and it looks better when it’s parallel to the camera. Tilt the wall and logo? Redesign the whole scene?!</p>
<p>No, there’s an easy fix! In photography, in this situation I would use a <em>wider lens</em>. Then I’d take the shot parallel to the logo, from higher up. The composition would be bad, because you’d get even more wall, <strong>but</strong> the part of the view I want would be in frame, <em>and I can crop it later</em>.</p>
<p>It turns out cropping isn’t even necessary! We can just <em>pan the camera view</em>.</p>
<p>I took this view:</p>
<p><img src="/images/backlight/bl-2.jpg" alt="Screenshot" /></p>
<p>Then I just offset the uv coordinates I use to set up the camera:</p>
<p><code class="language-plaintext highlighter-rouge">uv -= 0.6;</code></p>
<p>That’s it! Just one tiny line of code shifts the view down while preserving the composition. Exactly like taking a bigger view with a wider lens and cropping, only easier.</p>
<p><img src="/images/backlight/bl-4.jpg" alt="Screenshot" /></p>]]></content><author><name></name></author><category term="demoscene" /><category term="metal" /><category term="pathtracing" /><summary type="html"><![CDATA[Released at Inercia 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data). Download on Demozoo]]></summary></entry><entry><title type="html">Organising your workflow with tab groups in Xcode 12</title><link href="https://alia-traces.github.io/xcode/ios/mac/tools/2020/08/29/xcode-12-tabs.html" rel="alternate" type="text/html" title="Organising your workflow with tab groups in Xcode 12" /><published>2020-08-29T06:00:00+01:00</published><updated>2020-08-29T06:00:00+01:00</updated><id>https://alia-traces.github.io/xcode/ios/mac/tools/2020/08/29/xcode-12-tabs</id><content type="html" xml:base="https://alia-traces.github.io/xcode/ios/mac/tools/2020/08/29/xcode-12-tabs.html"><![CDATA[<p><img src="/images/xcode_tabs/hierarchy.png" alt="Two levels of tab bar for tab grouping" /></p>
<p>Xcode 12 has an all new tab system. Here’s how I use it to organise my workflow!
<!--more--></p>
<h2 id="window-tabs-document-tabs-tabs-vs-spaces">Window tabs, document tabs, tabs vs. spaces…</h2>
<p>This really confused me at first. There are now 2 different types of tab: window and document. These appear on 2 separate tab bars, one under the other… it looks weird but it’s <em>great</em> for organising your workflow!</p>
<p>Here’s how I use it.</p>
<h3 id="window-tabs">Window tabs</h3>
<p>At the top, I organise my work by topic using window tabs:</p>
<p><img src="/images/xcode_tabs/window_tabs.png" alt="Window tabs" /></p>
<p>It makes more sense to think of window tabs as folders, and under each folder you can keep a bunch of documents open. I tend to split things up something like this:</p>
<ul>
<li><strong>Project</strong> for the project settings, info.plist, etc.</li>
<li><strong>UI</strong> for interface related files</li>
<li>Code files for whatever I’m working on currently</li>
<li><strong>Debug</strong> for debugging so the debugger doesn’t mess with my active editor</li>
</ul>
<p>Of course you can use whatever setup suits you.</p>
<h3 id="document-tabs">Document tabs</h3>
<p><img src="/images/xcode_tabs/document_tabs.png" alt="Document tabs" /></p>
<p>Under the window tabs, I use document tabs to keep all files I’m working on open. This lets me switch quickly between files, and the hierarchy of Window and Document tabs keeps things organised, so I can have 50 tabs open but still be able to find what I’m after quickly.</p>
<h2 id="how-to-set-it-up">How to set it up</h2>
<h3 id="window-tabs-1">Window tabs</h3>
<ol>
<li>
<p><strong>Turn on the Window Tab Bar</strong> from the View menu, if it’s not already visible:
<img src="/images/xcode_tabs/tab_bar_menu.jpg" alt="Turn on the window tab bar" /></p>
</li>
<li>
<p><strong>Add a few tabs</strong> with the + button at the end of the window tab bar, as many as you need.</p>
</li>
<li>
<p><strong>Rename the tabs</strong>. You can use the Window menu (Rename Window Tab) or press ⌥⇧⌘t.</p>
</li>
</ol>
<h3 id="document-tabs-1">Document tabs</h3>
<p>There’s an important thing to know when you open a file:</p>
<ul>
<li><strong>Single-clicking</strong> a file will open it <em>temporarily</em>. If you click on a different file, it’ll open it in the same tab.</li>
<li><strong>Double-clicking</strong> a file makes it <em>permanent</em>. If you click on another file, it’ll open in a new tab, keeping the first document open.</li>
</ul>
<p>This is really handy: double click to open files you’re actively working on, and you’ll quickly have a useful set of tabs. When you just need a file open briefly and won’t come back to it, single click.</p>
<p>You can change this in the prefs, under Navigation.</p>
<h2 id="a-little-debug-tab-magic">A little Debug tab magic</h2>
<p>It’s frustrating to run your code, then hit a breakpoint or crash and have Xcode switch to another file and start the debugger, causing you to lose your place and your window layout.</p>
<p>You can fix this! Use the Behaviours preferences to tell Xcode to switch to the Debug tab when it pauses, or when the build fails:</p>
<p><img src="/images/xcode_tabs/debug_behaviours.png" alt="Configuring debug behaviour" /></p>
<p>You can set this up for failed tests, frame capture etc. when you want it to switch to the Debug tab too.</p>]]></content><author><name></name></author><category term="xcode" /><category term="ios" /><category term="mac" /><category term="tools" /><summary type="html"><![CDATA[Xcode 12 has an all new tab system. Here’s how I use it to organise my workflow!]]></summary></entry><entry><title type="html">Reconstruction (4KB Executable Graphics)</title><link href="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/07/24/reconstruction.html" rel="alternate" type="text/html" title="Reconstruction (4KB Executable Graphics)" /><published>2020-07-24T11:46:20+01:00</published><updated>2020-07-24T11:46:20+01:00</updated><id>https://alia-traces.github.io/demoscene/metal/pathtracing/2020/07/24/reconstruction</id><content type="html" xml:base="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/07/24/reconstruction.html"><![CDATA[<p><img src="/images/reconstruction/Screenshot.jpg" alt="Screenshot" /></p>
<p>Released at FieldFX 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data).</p>
<p><a href="https://demozoo.org/graphics/281254/">Download on Demozoo</a>
<!--more--></p>
<h3 id="glass-rendering">Glass rendering</h3>
<p>I’ve always loved the way glass refracts and reflects, and rendering it is sooooo much fun.</p>
<p>So, lots of glass. I haven’t done any comparisons, but my path tracer should produce pretty realistic results - it handles fresnel reflections, refraction, internal reflection etc. There are a few sphere lights around the scene to produce caustics.</p>
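<p>For the fresnel part, Schlick’s approximation is the usual approach (a sketch - the post doesn’t say which fresnel model the tracer uses):</p>

```c
#include <math.h>

/* Schlick's approximation of fresnel reflectance at a dielectric
   boundary: reflectance rises from r0 at normal incidence towards 1
   at grazing angles. n1 and n2 are the refractive indices. */
float schlick(float cos_theta, float n1, float n2) {
    float r0 = (n1 - n2) / (n1 + n2);
    r0 *= r0;
    float m = 1.0f - cos_theta;
    return r0 + (1.0f - r0) * m * m * m * m * m;
}
```

<p>For air to glass (n ≈ 1.5) this gives about 4% reflection head-on, rising steeply near grazing angles, which is what produces the bright rim reflections on glass.</p>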
<p>My last few pieces have been a bit heavy, resulting in a noisy image. I wanted to make something fast this time so I could push quality in other ways, and instead of an SDF / raymarching setup I went with pure analytic intersections (i.e. calculate which object the ray will hit and jump directly there, instead of marching into the scene in a loop).</p>
<p>The downside: you’re stuck with simple geometry (or triangle based meshes - but that has its own set of problems).</p>
<p>So, I decided to stick with spheres, and for interest slice them up.</p>
<h3 id="meaning">Meaning</h3>
<p>Having some meaning in a piece is important to me, and with only slices of spheres to play with it needed to be abstract.</p>
<p>I went with a transgender theme again (indicated by the colours - the stripes in the marble form a trans flag).</p>
<p>On the left, an initial pile of glass - interesting, but wrong. In the middle, the true shape is understood, and the order has been established. Finally, the parts come together in a cohesive whole, and a perfect sphere is formed.</p>
<p>This reflects the transformation process the identity goes through in some transgender people: recognising that the shape of the self is wrong, figuring out the true identity, and reconstruction in the true form.</p>
<h3 id="breakdown">Breakdown</h3>
<p>Starting with the stage and lighting, I initially added a floor, a back wall (which is glossy so there’s some reflected light) and a couple of lights. This is a zoomed out view, with the floor set to checkerboard to make it easier to see what’s going on:</p>
<p><img src="/images/reconstruction/Lighting.jpg" alt="Screenshot" /></p>
<p>Depth of Field is actually turned right down, but I have another camera setting that makes the edges of the image out of focus - with a wider lens it becomes much more obvious than in the final image.</p>
<p>Zooming back in, the basic layout plan was to have 3 spheres:</p>
<p><img src="/images/reconstruction/Layout.jpg" alt="Screenshot" /></p>
<p>Finally I took each sphere and split it into 5 slices, and moved the pieces about a bit.</p>
<p>This is the raw output from the renderer:</p>
<p><img src="/images/reconstruction/post1.jpg" alt="Screenshot" /></p>
<p>The final stage is colour grading, because I got my grading code working for this one :D I only used about 30% of the controls in the end, since it didn’t need much else.</p>
<p>My process for colour grading is to switch to YUV colour space (a single mat3 multiply), then work on luma and chroma separately, then back to RGB.</p>
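<p>The matrix itself would look something like the BT.601 form (an assumption - the post only says “a single mat3 multiply”):</p>

```c
/* RGB -> YUV with BT.601-style weights: yuv[0] is luma, yuv[1] and
   yuv[2] are the chroma axes that saturation and tint operate on. */
void rgb_to_yuv(const float rgb[3], float yuv[3]) {
    yuv[0] =  0.299f   * rgb[0] + 0.587f   * rgb[1] + 0.114f   * rgb[2];
    yuv[1] = -0.14713f * rgb[0] - 0.28886f * rgb[1] + 0.436f   * rgb[2];
    yuv[2] =  0.615f   * rgb[0] - 0.51499f * rgb[1] - 0.10001f * rgb[2];
}
```

<p>Increasing saturation is then just scaling yuv[1] and yuv[2] before converting back.</p>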
<p>The first step is to increase saturation:</p>
<p><img src="/images/reconstruction/post_saturation.jpg" alt="Screenshot" /></p>
<p>I applied a cyan tint to remove the pink tint in the image. This lets the light colour shine through (see how the wall is more pink and the floor is more blue - the ambient colour is set to blue sky, red horizon):</p>
<p><img src="/images/reconstruction/post_main_tint.jpg" alt="Screenshot" /></p>
<p>I shifted the tint further into blue/green in the shadows, which looks like it’s gone too far:</p>
<p><img src="/images/reconstruction/post_shadow_tint.jpg" alt="Screenshot" /></p>
<p>But when the highlights are shifted more towards orange it balances that and gives a nice tonal range across the whole image:</p>
<p><img src="/images/reconstruction/post_highlight_tint.jpg" alt="Screenshot" /></p>
<p>The last thing I want to mention is the importance of allowing lots of light bounces with glass. Because there are so many potential paths through the material (reflection or refraction at the entry, reflection or refraction at the exit, paths that hit multiple glass surfaces, and paths that take many internal reflections…) it’s essential to allow a large number of bounces per ray.</p>
<p>I settled on 25 in the end; here’s a video showing what happens as the bounce count goes from 2 to 25:</p>
<div>
<video controls="" loop="" style="min-width: 100%;">
<source type="video/mp4" src="/images/reconstruction/light_bounces.mp4" />
<source type="video/webm" src="/images/reconstruction/eclipse.webm" />
</video>
</div>]]></content><author><name></name></author><category term="demoscene" /><category term="metal" /><category term="pathtracing" /><summary type="html"><![CDATA[Released at FieldFX 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data). Download on Demozoo]]></summary></entry><entry><title type="html">How to use Metal frame capture outside of Xcode</title><link href="https://alia-traces.github.io/metal/tools/xcode/2020/07/18/adding-framecapture-outside-of-xcode.html" rel="alternate" type="text/html" title="How to use Metal frame capture outside of Xcode" /><published>2020-07-18T11:46:20+01:00</published><updated>2020-07-18T11:46:20+01:00</updated><id>https://alia-traces.github.io/metal/tools/xcode/2020/07/18/adding-framecapture-outside-of-xcode</id><content type="html" xml:base="https://alia-traces.github.io/metal/tools/xcode/2020/07/18/adding-framecapture-outside-of-xcode.html"><![CDATA[<p>Some time ago I wrote my own live shader editor for Metal shaders - MetalToy, named after ShaderToy, since I wanted something like shadertoy for Metal. It’s basic but very useful - I can edit shaders in any editor (I use Xcode), hit cmd-s to save and MetalToy will update the live preview.</p>
<p><img src="/images/MetalToyUI.jpg" alt="MetalToy's user interface" /><br /><em>MetalToy looks like this</em>
<!--more--></p>
<p>At some point I’ll clean it up and open source it, but here’s something that might be of use to other mac / iOS devs. Yesterday I was trying to fix a bug in a shader, and I thought</p>
<h3 id="itd-be-so-good-to-have-a-debug-button-that-fires-up-a-full-shader-debugger"><em>“It’d be so good to have a ‘Debug’ button that fires up a full shader debugger”</em></h3>
<p>It took around 20 minutes to do it :D</p>
<p><img src="/images/MetalToyDebugButton.jpg" alt="MetalToy with a shiny new debug button" /><br /><em>One shiny new Debug button</em></p>
<h2 id="dont-build-your-own-gpu-debugger">Don’t build your own GPU debugger</h2>
<p>Unless you really want to I guess? Xcode already has some amazing GPU debugging and profiling tools built in, and you can simply hit the frame capture button while running your app under Xcode’s debugger. That only works with the debugger attached, though.</p>
<p>What I needed was a way to run frame capture outside of Xcode. Turns out Apple anticipated this nicely!</p>
<p>For full details, you can see <a href="https://developer.apple.com/documentation/metal/frame_capture_debugging_tools/enabling_frame_capture">Apple’s documentation here</a>. Since their docs have a habit of, uh, moving, disappearing or not being updated, I’ll document the process here:</p>
<ol>
<li>In your project’s info.plist add the key “MetalCaptureEnabled”, set it to true</li>
<li>Add code to capture GPU commands. See <a href="https://developer.apple.com/documentation/metal/frame_capture_debugging_tools/capturing_gpu_command_data_programmatically#3325255">Apple’s docs on writing frame captures to a file here</a></li>
</ol>
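<p>For reference, step 1 is just this entry in the plist (source view):</p>

```
<key>MetalCaptureEnabled</key>
<true/>
```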
<p>In my case I added a ‘Debug’ button. This pauses my MTKView (which just makes things simpler), starts capturing, calls view.draw() and stops the capture. It saves a .gputrace (frame capture) file to disk, then asks Xcode to open the file.</p>
<p>That means I can just hit the Debug button at any time in MetalToy and it fires up the full Xcode GPU debugging tools :D</p>
<p><img src="/images/MetalToyGPUDebug.jpg" alt="The shader in Xcode's GPU debugger" /><br /><em>A click of a button, and we have a full shader debugger!</em></p>
<h2 id="very-useful-for-remote-debugging">VERY useful for remote debugging</h2>
<p>You could also just write the file out and send it to your own server, or ask the customer to email it.</p>
<p>Have a customer who’s seeing an unusual bug you can’t reproduce? Now they can just hit a button, and it’ll send you a frame capture so you can reproduce it instantly.</p>
<h2 id="the-code">The Code</h2>
<p>Here’s what I did - the Debug button just calls this function:</p>
<figure class="highlight"><pre><code class="language-swift" data-lang="swift"><span class="kd">func</span> <span class="nf">captureFrame</span><span class="p">()</span> <span class="p">{</span>
<span class="c1">// Stop playback if playing</span>
<span class="n">preview</span><span class="o">.</span><span class="n">play</span> <span class="o">=</span> <span class="kc">false</span>
<span class="k">let</span> <span class="nv">captureManager</span> <span class="o">=</span> <span class="kt">MTLCaptureManager</span><span class="o">.</span><span class="nf">shared</span><span class="p">()</span>
<span class="k">guard</span> <span class="n">captureManager</span><span class="o">.</span><span class="nf">supportsDestination</span><span class="p">(</span><span class="o">.</span><span class="n">gpuTraceDocument</span><span class="p">)</span> <span class="k">else</span> <span class="p">{</span>
<span class="nf">print</span><span class="p">(</span><span class="s">"Capture to a GPU tracefile is not supported"</span><span class="p">)</span>
<span class="k">return</span>
<span class="p">}</span>
<span class="c1">// Write file to tmp folder</span>
<span class="k">let</span> <span class="nv">tmpDir</span> <span class="o">=</span> <span class="kt">FileManager</span><span class="o">.</span><span class="k">default</span><span class="o">.</span><span class="n">temporaryDirectory</span>
<span class="k">let</span> <span class="nv">destURL</span> <span class="o">=</span> <span class="n">tmpDir</span><span class="o">.</span><span class="nf">appendingPathComponent</span><span class="p">(</span><span class="s">"frameCapture.gputrace"</span><span class="p">)</span>
<span class="c1">// Set up the capture descriptor</span>
<span class="k">let</span> <span class="nv">captureDescriptor</span> <span class="o">=</span> <span class="kt">MTLCaptureDescriptor</span><span class="p">()</span>
<span class="n">captureDescriptor</span><span class="o">.</span><span class="n">captureObject</span> <span class="o">=</span> <span class="n">mtlDev</span>
<span class="n">captureDescriptor</span><span class="o">.</span><span class="n">destination</span> <span class="o">=</span> <span class="o">.</span><span class="n">gpuTraceDocument</span>
<span class="n">captureDescriptor</span><span class="o">.</span><span class="n">outputURL</span> <span class="o">=</span> <span class="n">destURL</span>
<span class="k">do</span> <span class="p">{</span>
<span class="k">try</span> <span class="n">captureManager</span><span class="o">.</span><span class="nf">startCapture</span><span class="p">(</span><span class="nv">with</span><span class="p">:</span> <span class="n">captureDescriptor</span><span class="p">)</span>
<span class="p">}</span> <span class="k">catch</span> <span class="k">let</span> <span class="nv">e</span> <span class="p">{</span>
<span class="nf">print</span><span class="p">(</span><span class="s">"Failed to capture frame for debug: </span><span class="se">\(</span><span class="n">e</span><span class="o">.</span><span class="n">localizedDescription</span><span class="se">)</span><span class="s">"</span><span class="p">)</span>
<span class="k">return</span>
<span class="p">}</span>
<span class="c1">// Draw a frame to capture it</span>
<span class="n">preview</span><span class="o">.</span><span class="nf">draw</span><span class="p">()</span>
<span class="n">captureManager</span><span class="o">.</span><span class="nf">stopCapture</span><span class="p">()</span>
<span class="c1">// Open the file in xcode</span>
<span class="k">let</span> <span class="nv">standardPath</span> <span class="o">=</span> <span class="n">destURL</span><span class="o">.</span><span class="n">path</span>
<span class="k">let</span> <span class="nv">scriptSource</span> <span class="o">=</span> <span class="s">"tell application </span><span class="se">\"</span><span class="s">Xcode</span><span class="se">\"\n</span><span class="s">open </span><span class="se">\"\(</span><span class="n">standardPath</span><span class="se">)\"\n</span><span class="s">end tell"</span>
<span class="k">let</span> <span class="nv">script</span> <span class="o">=</span> <span class="kt">NSAppleScript</span><span class="o">.</span><span class="nf">init</span><span class="p">(</span><span class="nv">source</span><span class="p">:</span> <span class="n">scriptSource</span><span class="p">)</span>
<span class="kt">DispatchQueue</span><span class="p">(</span><span class="nv">label</span><span class="p">:</span> <span class="s">"XcodeOpen"</span><span class="p">,</span> <span class="nv">qos</span><span class="p">:</span> <span class="o">.</span><span class="n">utility</span><span class="p">,</span> <span class="nv">attributes</span><span class="p">:</span> <span class="p">[</span><span class="o">.</span><span class="n">concurrent</span><span class="p">],</span> <span class="nv">autoreleaseFrequency</span><span class="p">:</span> <span class="o">.</span><span class="n">inherit</span><span class="p">,</span> <span class="nv">target</span><span class="p">:</span> <span class="kc">nil</span><span class="p">)</span><span class="o">.</span><span class="k">async</span> <span class="p">{</span>
<span class="n">script</span><span class="p">?</span><span class="o">.</span><span class="nf">executeAndReturnError</span><span class="p">(</span><span class="kc">nil</span><span class="p">)</span>
<span class="p">}</span>
<span class="p">}</span></code></pre></figure>]]></content><author><name></name></author><category term="metal" /><category term="tools" /><category term="xcode" /><summary type="html"><![CDATA[Some time ago I wrote my own live shader editor for Metal shaders - MetalToy, named after ShaderToy, since I wanted something like shadertoy for Metal. It’s basic but very useful - I can edit shaders in any editor (I use Xcode), hit cmd-s to save and MetalToy will update the live preview. MetalToy looks like this]]></summary></entry><entry><title type="html">End of an Era (4KB Executable Graphics)</title><link href="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/07/11/end-of-an-era.html" rel="alternate" type="text/html" title="End of an Era (4KB Executable Graphics)" /><published>2020-07-11T11:46:20+01:00</published><updated>2020-07-11T11:46:20+01:00</updated><id>https://alia-traces.github.io/demoscene/metal/pathtracing/2020/07/11/end-of-an-era</id><content type="html" xml:base="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/07/11/end-of-an-era.html"><![CDATA[<p><img src="/images/end_of_an_era/Screenshot.jpg" alt="Screenshot" /></p>
<p>Released at Solskogen 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data).</p>
<p><a href="https://demozoo.org/graphics/280517/">Download on Demozoo</a>
<!--more--></p>
<h3 id="where-leaves-fall-in-autumn-flowers-bloom-in-spring"><em>Where leaves fall in Autumn, flowers bloom in Spring</em></h3>
<p>Solskogen 2020 was the last ever Solskogen party, and I wanted to make something fitting.</p>
<p>It was the end of an era, and a time of sadness as well as a really good party. Sunset and Autumn came to mind, and Autumn speaks to me of falling leaves the colour of flames and golden light.</p>
<p>So, a leaf strewn forest floor, dappled golden sunlight. The final element was the ring - discarded and left in the woods, a symbolic closing gesture.</p>
<h2 id="breakdown">Breakdown</h2>
<p>Again, I started with the lighting. The ambient light has a green tint, and the main light is quite orange to hint at sunset.</p>
<p>I placed 2 planes with holes in them between the scene and the light to create a dappled light effect (you can see the hole pattern here, but with the leaves in place it’s not visible, so it wasn’t worth changing).</p>
<p><img src="/images/end_of_an_era/1.jpg" alt="Screenshot" /></p>
<p>The ring is just a stretched torus.</p>
<p><img src="/images/end_of_an_era/2.jpg" alt="Screenshot" /></p>
<p>Next, leaves. I used SDFs for this, with domain repetition to basically make a grid of leaves. Size, shape, colour and rotation are randomised, and they’re given a slight curl.</p>
<p><img src="/images/end_of_an_era/3.jpg" alt="Screenshot" /></p>
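<p>The domain repetition trick looks roughly like this in 2D (a toy leaf SDF and hash; the real shape, curl and colour logic aren’t shown):</p>

```cpp
#include <cmath>

struct Vec2 { float x, y; };

// Cheap per-cell hash in [0,1) - a stand-in for whatever PRNG the shader uses.
float hash(float cx, float cy) {
    float h = std::sin(cx * 127.1f + cy * 311.7f) * 43758.5453f;
    return h - std::floor(h);
}

// A placeholder leaf SDF: an ellipse-ish disc of the given radius.
float leafSDF(Vec2 p, float radius) {
    return std::sqrt(p.x * p.x + 2.0f * p.y * p.y) - radius;
}

// Domain repetition: fold space into a grid of cell-sized tiles, then
// randomise each leaf's rotation and size from its cell coordinates.
float leafGridSDF(Vec2 p, float cell) {
    float cx = std::floor(p.x / cell), cy = std::floor(p.y / cell);
    Vec2 local = { p.x - (cx + 0.5f) * cell, p.y - (cy + 0.5f) * cell };
    float angle = hash(cx, cy) * 6.2831853f;             // random rotation
    float size  = cell * (0.25f + 0.15f * hash(cy, cx)); // random size
    Vec2 r = { std::cos(angle) * local.x - std::sin(angle) * local.y,
               std::sin(angle) * local.x + std::cos(angle) * local.y };
    return leafSDF(r, size);
}
```

<p>Because the fold is cheap, one leaf evaluation covers the whole grid; each extra layer is just this grid again with a different rotation and offset.</p>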
<p>I then added more layers, with a random rotation and offset, to build the leaf pile.</p>
<p>Easy, but: the leaves all intersect.</p>
<p><img src="/images/end_of_an_era/4.jpg" alt="Screenshot" /></p>
<p>I had to figure out how to pile leaves up without them overlapping. A physics sim wasn’t viable. Doing this analytically is way beyond my maths level. Time for dirty hacks :D</p>
<p>Compare this shot with the last one - the leaves are in exactly the same position, but they magically no longer intersect:</p>
<p><img src="/images/end_of_an_era/5.jpg" alt="Screenshot" /></p>
<p>How? I built the leaf pile in layers, bottom to top. For each layer, I take the shape of the leaf and use it to clip any geometry above it. This means if there’s a lower leaf intersecting the current one, the part that intersects gets removed. Cheap, hacky, but looks ok 🙃</p>]]></content><author><name></name></author><category term="demoscene" /><category term="metal" /><category term="pathtracing" /><summary type="html"><![CDATA[Released at Solskogen 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data). Download on Demozoo]]></summary></entry><entry><title type="html">Cloth!!! (4KB Executable Graphics)</title><link href="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/06/20/cloth.html" rel="alternate" type="text/html" title="Cloth!!! (4KB Executable Graphics)" /><published>2020-06-20T11:46:20+01:00</published><updated>2020-06-20T11:46:20+01:00</updated><id>https://alia-traces.github.io/demoscene/metal/pathtracing/2020/06/20/cloth</id><content type="html" xml:base="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/06/20/cloth.html"><![CDATA[<p><img src="/images/cloth/screenshot.jpg" alt="Undefined Symbol screenshot" /></p>
<p>Released at Nova 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data) with help from Lia-Sae.</p>
<p><a href="https://demozoo.org/graphics/279420/">Download on Demozoo</a>
<!--more--></p>
<h2 id="the-challenge-rendering-realistic-cloth">The challenge: Rendering realistic cloth</h2>
<p>I was thinking a lot about materials before I started this, particularly how texture isn’t really a thing when you look close enough: there’s only geometry and material. Texture is how we perceive fine surface details.</p>
<p>It follows, then, that realistic rendering should use ultra-detailed geometry, not textures.</p>
<p>I decided to try it, by rendering cloth 🤦♀️</p>
<h2 id="breakdown">Breakdown</h2>
<p>To give the scene some scale, I had to show cloth up close. That ended up being difficult - how to show it nicely and give it scale? I decided to make tiny cloth flags and carpet, and use Lego to give it a sense of scale.</p>
<p>I started with the lighting. The ambient light has a blue tint and the main light yellow to indicate daytime / sunlight. To provide a visual cue that we’re indoors I added some shadows across the scene, as if cast by the wooden frame between window panes.</p>
<p><img src="/images/cloth/lighting.jpg" alt="Lighting setup" /></p>
<p>Getting the lighting right is important - without the main (sun) light, the scene looks drab and flat.</p>
<p><img src="/images/cloth/no_light.jpg" alt="Lighting setup" /></p>
<p>Apart from that the scene is quite simple - carpet, 2 flags, some lego blocks.</p>
<h2 id="cloth-rendering">Cloth rendering</h2>
<p>Here’s how the cloth works:</p>
<p><img src="/images/cloth/zoom1.jpg" alt="Zoomed cloth" /></p>
<p><img src="/images/cloth/zoom2.jpg" alt="Zoomed cloth" /></p>
<p><img src="/images/cloth/zoom3.jpg" alt="Zoomed cloth" /></p>
<p><img src="/images/cloth/zoom4.jpg" alt="Zoomed cloth" /></p>
<p>It’s literally just modelled down to fibre level 🙃 With path tracing this is actually viable - with many samples per pixel aliasing is not an issue, and because light bounces around within the threads the lighting and colour are realistic.</p>
<p>Generating this as a mesh clearly isn’t viable, so this is rendered as a raymarched SDF (signed distance field). In fact only 2 single fibres are drawn - they’re repeated, twisted around themselves to form threads, mirrored and twisted again to form ‘string’ (Lia-Sae will probably tell me that’s the wrong term ;) then repeated and woven together to form cloth.</p>
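<p>The twist step can be sketched like this (a simplified version with made-up radii and twist rate; the real shader adds the mirror-and-twist second level and the weave):</p>

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Distance to a single straight fibre of the given radius running along z,
// offset from the axis - the only "real" geometry in the sketch.
float fibreSDF(Vec3 p, float offset, float radius) {
    float dx = p.x - offset;
    return std::sqrt(dx * dx + p.y * p.y) - radius;
}

// Twist: rotate the xy-plane by an angle proportional to z, so the two
// fibres wind around each other into a thread.
float threadSDF(Vec3 p, float twistRate) {
    float a = p.z * twistRate;
    Vec3 q = { std::cos(a) * p.x - std::sin(a) * p.y,
               std::sin(a) * p.x + std::cos(a) * p.y,
               p.z };
    float f1 = fibreSDF(q, 0.3f, 0.25f);
    // Mirror for the second fibre, winding the opposite side of the axis.
    float f2 = fibreSDF({ -q.x, -q.y, q.z }, 0.3f, 0.25f);
    return std::fmin(f1, f2);
}
```

<p>Evaluating the fibre SDF after the rotation makes both fibres spiral around the z axis without any extra geometry.</p>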
<p>This is how it works in cross-section:</p>
<iframe width="640" height="360" frameborder="0" src="https://www.shadertoy.com/embed/3dSBRK?gui=true&t=10&paused=true&muted=false" allowfullscreen=""></iframe>
<h3 id="the-importance-of-light-bounces">The importance of light bounces</h3>
<p>Each time the ray bounces, it transmits colour from the surface. This means if a ray bounces multiple times within the cloth, it both increases the brightness of the cloth and also the richness of the colour.</p>
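<p>In path tracer terms: the ray carries a throughput that gets multiplied by the surface albedo at each bounce. A toy model (every bounce hitting the same cloth-coloured albedo, some light escaping after each hit) shows why deeper bounces both brighten and enrich the colour:</p>

```cpp
#include <array>

// Toy model of bounce accumulation: throughput picks up the albedo at every
// bounce, and each bounce contributes some escaping light to the total.
// With a red-ish albedo, deep bounces add mostly red - richer colour, not
// just more brightness.
std::array<float, 3> gatheredColour(int maxBounces, std::array<float, 3> albedo) {
    std::array<float, 3> throughput = { 1.0f, 1.0f, 1.0f };
    std::array<float, 3> total = { 0.0f, 0.0f, 0.0f };
    for (int bounce = 0; bounce < maxBounces; ++bounce) {
        for (int c = 0; c < 3; ++c) {
            throughput[c] *= albedo[c];
            total[c] += throughput[c]; // light escaping after this bounce
        }
    }
    return total;
}
```

<p>Cutting the loop short discards exactly the deep, saturated contributions - which is what the comparison below shows.</p>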
<p>This turns out to be critical:</p>
<p><img src="/images/cloth/bounces1.jpg" alt="Effect of light bounces" />
<br /><em>5 bounces per sample</em></p>
<p><img src="/images/cloth/bounces2.jpg" alt="Effect of light bounces" />
<br /><em>15 bounces per sample</em></p>
<h2 id="the-downside">The downside</h2>
<p>Of course there’s a catch: rendering geometry this fine is <em>expensive</em>. I optimised this to hell, and it’s still very slow - especially when many light bounces are needed. I think in the end the release version only manages ~12 samples per pixel on my GPU (a Vega 56), which is why it’s a bit noisy…</p>]]></content><author><name></name></author><category term="demoscene" /><category term="metal" /><category term="pathtracing" /><summary type="html"><![CDATA[Released at Nova 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data) with help from Lia-Sae. Download on Demozoo]]></summary></entry><entry><title type="html">Ice Core (4KB Executable Graphics)</title><link href="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/05/22/ice-core.html" rel="alternate" type="text/html" title="Ice Core (4KB Executable Graphics)" /><published>2020-05-22T11:46:20+01:00</published><updated>2020-05-22T11:46:20+01:00</updated><id>https://alia-traces.github.io/demoscene/metal/pathtracing/2020/05/22/ice-core</id><content type="html" xml:base="https://alia-traces.github.io/demoscene/metal/pathtracing/2020/05/22/ice-core.html"><![CDATA[<p><img src="/images/ice_core/ice_core_screenshot.jpg" alt="Ice Core screenshot" /></p>
<p>Released at Outline 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data).</p>
<p><a href="https://demozoo.org/graphics/278502/">Download on Demozoo</a>
<!--more--></p>
<h2 id="2nd-demoscene-release">2nd demoscene release</h2>
<p>I wanted to do something for Outline, but didn’t have a huge amount of time. I had a WIP shader of an ice core to hand, so I quickly put this together. It came last in the competition, but I liked it :)</p>
<h2 id="tech">Tech</h2>
<p>There’s nothing particularly new here - it’s path traced glass again, but this time with the lighting set up to give nice caustics.</p>
<p>The ice core is just a glass cylinder sliced into sections. Each section has a random colour and roughness value to produce the varied ice layers.</p>
<p>There are also bubbles and particles within the ice. The geometry is an SDF, and the one nice trick I came up with here is the dark particles - I used <code class="language-plaintext highlighter-rouge">p = fract(p);</code> to repeat space, then placed a sphere at a random position in each cell.</p>
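<p>A sketch of that trick (the hash and radius are illustrative, not the prod’s values):</p>

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Cheap per-cell hash in [0,1), a stand-in for the shader's PRNG.
float hash3(float x, float y, float z) {
    float h = std::sin(x * 12.9898f + y * 78.233f + z * 37.719f) * 43758.5453f;
    return h - std::floor(h);
}

// Repeat space with fract(), then drop a small sphere at a random position
// inside each unit cell.
float particleSDF(Vec3 p) {
    float cx = std::floor(p.x), cy = std::floor(p.y), cz = std::floor(p.z);
    Vec3 local = { p.x - cx, p.y - cy, p.z - cz };   // == fract(p)
    Vec3 centre = { hash3(cx, cy, cz), hash3(cy, cz, cx), hash3(cz, cx, cy) };
    float dx = local.x - centre.x, dy = local.y - centre.y, dz = local.z - centre.z;
    // The centre can land near a cell wall, so neighbouring cells clip the
    // sphere - exactly the "artefact" that reads as dark floating particles.
    return std::sqrt(dx * dx + dy * dy + dz * dz) - 0.05f;
}
```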
<p>Normally it’s important to keep the sphere inside the cell to avoid artefacts, but in this case such artefacts produce dark spots in the ice - which look like floating particles ;)</p>]]></content><author><name></name></author><category term="demoscene" /><category term="metal" /><category term="pathtracing" /><summary type="html"><![CDATA[Released at Outline 2020, 4KB Executable Graphics (a single 4KB executable file generates this image, with no external data). Download on Demozoo]]></summary></entry></feed>