
The Planck length (like the other Planck units) is a scale factor, not necessarily a canonical quantization of physics. Planck units mark the transition to a scale where the Standard Model, QFT, and our other models no longer accurately predict the behavior of a system.
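
To make the "scale factor" point concrete: the Planck length is just the one combination of hbar, G, and c with units of length. A quick back-of-the-envelope check (constant values rounded, purely illustrative):

    # Planck length from dimensional analysis alone: the unique combination of
    # hbar, G, and c with units of length (constants rounded)
    hbar = 1.0546e-34   # reduced Planck constant, J*s
    G = 6.674e-11       # gravitational constant, m^3 / (kg * s^2)
    c = 2.998e8         # speed of light, m/s

    planck_length = (hbar * G / c**3) ** 0.5
    print(planck_length)  # ~1.6e-35 m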


This was just remastered in 4K from the original footage and released by Cyan Worlds.


Interesting, I'd love to see a link to that if you know of it. Here's the original paper: https://aclanthology.org/2023.findings-acl.426.pdf In my own work I've successfully classified emergent behavior in Cellular Automata using a similar technique, and the technique has also been used elsewhere with success: https://www.nature.com/articles/s41598-022-12826-w
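
If it's useful, the flavor of technique I mean is roughly this: compute a normalized compression distance with gzip, then vote among the k nearest labeled examples. A minimal sketch (not the paper's exact code; the (text, label) structure and k are placeholders):

    import gzip
    from collections import Counter

    def clen(s):
        # gzip-compressed length as a crude stand-in for Kolmogorov complexity
        return len(gzip.compress(s.encode()))

    def ncd(a, b):
        # normalized compression distance between two strings
        ca, cb, cab = clen(a), clen(b), clen(a + " " + b)
        return (cab - min(ca, cb)) / max(ca, cb)

    def knn_classify(query, labeled, k=3):
        # labeled is a list of (text, label) pairs; vote among the k nearest by NCD
        nearest = sorted(labeled, key=lambda pair: ncd(query, pair[0]))[:k]
        return Counter(label for _, label in nearest).most_common(1)[0][0]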


This took me an unreasonable amount of time to find, but here it is:

https://kenschutte.com/gzip-knn-paper2/

The moral: the methodology is cool, but implementation details matter, I guess...


Thank you for this, I appreciate it! That's unfortunate to hear. I may have to swap out the example I used in this article, and perhaps add a note that this technique has limitations. I still think that using compression/Kolmogorov-complexity metrics for classification is a fruitful endeavor and that the philosophy behind efforts like the Hutter Prize is sound, but the kNN + gzip example looks like it has some real problems.

For anyone else following along, I think the GitHub Issue discussion on the paper's repo is really interesting: https://github.com/bazingagin/npc_gzip/issues/3


Here is a small web-app I made to explore the data (does not support mobile yet, but anything >= the size of a tablet screen should be fine): http://kylehovey.github.io/automata-nebula-explorer/index.ht...


I'm going to need sauce on that.


Basic method to generate this set if you want to play around with it:

    from itertools import chain
    import numpy as np
    import matplotlib
    matplotlib.use('TkAgg')
    from matplotlib import pyplot as plt

    def gen_pm_one(m):
        # Yield every length-m coefficient vector with entries +/-1
        def array_for(n):
            # Bit i of n selects +1 (set) or -1 (clear) for coefficient i;
            # return a concrete list so np.roots can consume it
            return [1 if (1 << i) & n else -1 for i in range(m)]

        def out():
            generated = 0
            final = 2 ** m

            while generated < final:
                yield array_for(generated)
                generated += 1

        return out()

    if __name__ == '__main__':
        # Roots of every degree-12 polynomial with +/-1 coefficients,
        # flattened into [real, imag] pairs for plotting
        roots = map(
            lambda rts: map(
                lambda cpx: [cpx.real, cpx.imag],
                rts
            ),
            map(
                np.roots,
                gen_pm_one(13)
            )
        )

        data = np.array(list(chain(*roots)))
        x, y = data.T
        plt.scatter(x, y)
        plt.show()


This is great! I'm so excited for the wave of geospatial data demos using Mapbox's amazing API and the turf.js computation library that accompanies it.


>For example, the fastest distance between two points is not a straight line, it's the cycloid, specifically the Brachistochrone curve [2]. This is the path light follows.

You may be conflating [Bernoulli's solution to the Brachistochrone curve](http://www.math.rug.nl/~broer/pdf/ws-ijbc.pdf) with the optimal path for light. [Fermat's Principle](https://en.wikipedia.org/wiki/Fermat%27s_principle) states that, when traveling between two points, light will always take the path that minimizes the time taken from the first point to the last. In a medium of constant refractive index (which includes free space), this results in a straight line.

>So if you're looking for ways to distribute partitions or encode invariants in your models, data or otherwise, the geometrical aspects of elliptical and cycloidal curves are a good place to explore.

How do you mean? What sort of data can be encoded this way and how?

> NB: Consider this, two seperate impulses of light beginning at different distances away from the observer, both impulses of light traveling along the optimal path at the optimal speed, and both arriving at the observer simultaneously, without bending time. And as shown above, on a cycloidal curve, this phenomenon is not unique to light.

Two separate impulses of light that begin at the same moment at different distances from a stationary observer will necessarily arrive at different times; otherwise you violate the basis of special relativity: the speed of light is constant and independent of reference frame.


You can solve the brachistochrone problem using Fermat's principle, as shown in the 3blue1brown video GP is referring to. Since you're looking for the shortest time, construct a lens in which the speed of light is proportional to the speed of the bead on the wire; by Fermat's principle, the shortest-time wire is then the path light would take, and you can use (an infinitesimal version of) Snell's law to find the direction of the wire at each height.
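
If it helps to see it numerically, here is a rough sketch of that construction (my own toy example, not taken from the video; the constant k below just fixes the size of the resulting cycloid):

    import numpy as np

    g = 9.81   # gravitational acceleration, m/s^2
    k = 0.1    # Snell constant sin(theta) / v; sets the cycloid's scale

    # Falling a depth y gives "light speed" v = sqrt(2*g*y); stop just
    # before sin(theta) reaches 1 (the bottom of the arch).
    y = np.linspace(1e-9, 0.999 / (2 * g * k**2), 2000)
    sin_theta = k * np.sqrt(2 * g * y)              # Snell's law at each depth
    dx_dy = sin_theta / np.sqrt(1 - sin_theta**2)   # slope of the path, dx/dy = tan(theta)

    # Integrate dx/dy over depth (trapezoid rule) to trace the path
    x = np.concatenate(([0.0], np.cumsum(0.5 * (dx_dy[1:] + dx_dy[:-1]) * np.diff(y))))

    # (x, y) now traces half an arch of a cycloid, with y measured downward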


Yes, but they were saying:

>For example, the fastest distance between two points is not a straight line, it's the cycloid, specifically the Brachistochrone curve [2]. This is the path light follows.

Which is not true for free space, or any space with a constant index of refraction.


Oh yes, they do seem to be confused! espeed, if you happen to read this, the brachistochrone is the fastest path for something accelerated by a constant force (e.g. gravity near the surface of the earth).


Didn't Randall Munroe invent this technique?

https://xkcd.com/195/


Using Hilbert curves to access spatial data is much older, dating back at least to the early 1990s. (And arguably that xkcd works the other way around, mapping a one-dimensional thing (IP addresses) onto a 2D image.)
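
For anyone curious, the core of the trick is a small bit-twiddling map between the 1-D index along the curve and 2-D grid coordinates. A sketch of the standard iterative algorithm (my own variable names, not any particular library's API):

    def d2xy(order, d):
        # Map Hilbert index d to (x, y) on a (2**order) x (2**order) grid
        x = y = 0
        t = d
        s = 1
        while s < (1 << order):
            rx = 1 & (t // 2)
            ry = 1 & (t ^ rx)
            if ry == 0:                 # rotate/flip the quadrant
                if rx == 1:
                    x, y = s - 1 - x, s - 1 - y
                x, y = y, x
            x += s * rx
            y += s * ry
            t //= 4
            s *= 2
        return x, y

    # e.g. [d2xy(1, d) for d in range(4)] -> [(0, 0), (0, 1), (1, 1), (1, 0)]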

