
@Albright @Crell Vibe coding accelerates the transition from "greenfield project" to "the guy who built this no longer works here" from months to minutes!

@pavel @martijnbraam @linmob You would need a more rigorous setup to actually see that non-linearity; what's here is still mostly an artifact of averaging a bunch of pixels. When you take just a single one, it's below the noise floor.

@pavel @martijnbraam @linmob Same thing, but brighter.

While there will be some tiny non-linearity if you look really closely (there are no perfect sensors, after all), it's absolutely nowhere even close to 5% and not really something you have to concern yourself with for photography :P

@pavel @martijnbraam @linmob No, that's clipping. My light source was 5% too bright, so you can see that it stays linear up until it clips.

@pavel @martijnbraam @linmob When you take only a few pixels from the center (with G channel clipped at the very top):
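
For illustration, a minimal sketch of sampling only a small center window instead of the whole frame; the function name and the 16x16 window size are made up here, this is not Megapixels' actual code:

    /* Average a small window at the optical center so that lens shading
     * (vignetting) doesn't skew the mean. A real measurement would also
     * sample each Bayer channel separately rather than mixing them. */
    #include <stdint.h>

    double center_mean(const uint16_t *frame, int width, int height)
    {
        const int win = 16;              /* sample window size (assumed) */
        int x0 = width / 2 - win / 2;
        int y0 = height / 2 - win / 2;
        double sum = 0.0;

        for (int y = y0; y < y0 + win; y++)
            for (int x = x0; x < x0 + win; x++)
                sum += frame[y * width + x];

        return sum / (double)(win * win);
    }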

@pavel @martijnbraam @linmob After fixing Megapixels' tool, the result is linear up until the image starts clipping (at around 0.6 in this case). Since that tool averages the values across the whole frame, what you're seeing above that is lens shading.

@pavel @martijnbraam @linmob I was looking at these results some time ago and they turned out to be nothing but clipped data and lack of data pedestal handling.
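
To make the pedestal point concrete: raw sensor values sit on top of a constant black-level offset, and if you normalize without subtracting it, the low end of the response curve falsely looks non-linear. A minimal sketch, assuming a 10-bit sensor with a pedestal of 64 (both example values, not any specific sensor's):

    #include <stdint.h>

    double normalize_sample(uint16_t raw)
    {
        const uint16_t pedestal = 64;   /* black level (assumed) */
        const uint16_t white = 1023;    /* 10-bit full scale (assumed) */

        if (raw <= pedestal)
            return 0.0;
        if (raw >= white)
            return 1.0;   /* clipped - exclude from a linearity fit */
        return (double)(raw - pedestal) / (double)(white - pedestal);
    }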

@pluszysta As it happens, Sławomir "Sowa" Puchała filled in for Piotr Szkudelski on drums several times 🦉

@linmob @ati1 Millipixels does audio in a completely broken way that has no chance to ever sync correctly, but there should be no hissing or artifacts there (other than a short pop at the beginning of recording). Does that happen with audio recorders as well?

@devrtz There is one - you don't see it? Not very detailed, but ultimately it's just a phone video of a live gig like any other: blinking lights, people with instruments on stage and dark silhouettes of the audience's heads at the bottom 😂

@aeva This one is as @glyph said - I'm not aware of an API for that. It smells like it could be a reasonable addition to propose, perhaps as part of xdg-output - which happens to be where you get all the info needed to always render at 1:1 density, by the way.

Meanwhile, you could ask the user to touch the screen first and lay your surfaces out based on that, I guess?

Also, there's no magic there. If it got fullscreened, something made it fullscreen and you'll be able to trace that.
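
To make the touch-first suggestion above concrete, a rough sketch with libwayland's touch listener; the surrounding client setup (registry, seat) is omitted, and the `touched` variable is illustrative:

    #include <wayland-client.h>

    /* Remember which surface got the first touch - that's the screen
     * the user is actually at, so lay the UI out around it. */
    static struct wl_surface *touched;

    static void touch_down(void *data, struct wl_touch *touch,
                           uint32_t serial, uint32_t time,
                           struct wl_surface *surface, int32_t id,
                           wl_fixed_t x, wl_fixed_t y)
    {
        touched = surface;
    }

    /* The remaining events don't matter here, so just stub them out. */
    static void touch_up(void *d, struct wl_touch *t, uint32_t serial,
                         uint32_t time, int32_t id) {}
    static void touch_motion(void *d, struct wl_touch *t, uint32_t time,
                             int32_t id, wl_fixed_t x, wl_fixed_t y) {}
    static void touch_frame(void *d, struct wl_touch *t) {}
    static void touch_cancel(void *d, struct wl_touch *t) {}

    static const struct wl_touch_listener touch_listener = {
        .down = touch_down,
        .up = touch_up,
        .motion = touch_motion,
        .frame = touch_frame,
        .cancel = touch_cancel,
    };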

@aeva @glyph You call set_fullscreen(output) on your surface and it goes fullscreen on the selected output.

If you use some middleware to do that for you, then you'll need to spend some effort to find out how to do the equivalent there. You can use WAYLAND_DEBUG=client to inspect what calls it ends up making beneath your code.

@glyph @aeva No, this is the fullscreen *shell*, not something you use on desktops.

However, the regular xdg-shell's xdg_toplevel::set_fullscreen simply takes a wl_output as a parameter. It's not a Wayland problem at all.

wayland.app/protocols/xdg-shel
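
For reference, this is all it takes with the C bindings that wayland-scanner generates from the xdg-shell protocol XML; `toplevel` and `output` are assumed to be created and bound elsewhere:

    #include "xdg-shell-client-protocol.h"

    void fullscreen_on(struct xdg_toplevel *toplevel, struct wl_output *output)
    {
        /* Pass NULL as the output to let the compositor pick one. */
        xdg_toplevel_set_fullscreen(toplevel, output);
    }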
