Upscaled the 50% by 200%, using bicubic in Paint.net
Upscaled the 50% by 200%, using nearest neighbor
4-6 have some problems fitting on the template right, but this could be fixed if I felt like it. I also didn't realize that the neck was transparent, but that's an easy fix.
Stacked:
Original
Bicubic
Nearest Neighbor
I don't think that this would help too much. Of course it isn't a "perfect" copy, but it's close enough to be sold with the right doctoring up.
Problem is, if you're authoring your original texture that way (never mind that it's going to be 4x the work), you're going to make sure the texture looks good after being downsampled, because naturally you don't want the texture to look ugly.
Once it looks good, all the "thieves" have to do is upscale your texture prior to re-uploading. It's going to look the same in-game, because in-game we're really comparing one downscaled version to another, not the upscaled copy to the original.
I was under the impression bots weren't the biggest problem. It's as easy as pie to manually copy clothing (-1 from the ID, screenshot a 585x559 ImageLabel with the clothing applied, go to roblox.com/asset?id=id.png/jpg/etc), and as is evident with NBC shirt creators, people will just copy the clothing and upload it to their account to avoid having to purchase it, or rip off other people's clothing to make it appear as if they made it (i.e. so many Kestrel clothing copies being sold as "original" because the copy uploader wants to look cool).
I understand what you are saying about how even the originals will look fuzzy due to downsampling.
I don't buy your upsampling argument, though. Information gets destroyed on the downsample to the client. How does it get added back? Entropy only increases as you repeat the cycle, assuming you are doing some texture interpolation instead of just a 2x pixel resize.
Not really. As long as you keep the spatial frequency of the image lower than one half of your sampling frequency (the Nyquist frequency), no information is lost.
The extreme example of this is a constant color: you can upscale/downscale a single pixel to any dimension without any loss of information.
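To make the constant-color case concrete, here's a toy 1-D sketch. The `resize_nearest` helper is my own throwaway nearest-neighbor resampler for illustration, not whatever Roblox actually uses:

```python
# Toy 1-D nearest-neighbor resize to illustrate the constant-color case.
# (resize_nearest is my own illustrative helper, not Roblox's pipeline.)

def resize_nearest(pixels, new_len):
    """Resize a 1-D row of pixels with nearest-neighbor sampling."""
    old_len = len(pixels)
    return [pixels[min(old_len - 1, int(i * old_len / new_len))]
            for i in range(new_len)]

row = [128] * 4                # a constant-color row, 4 pixels
up = resize_nearest(row, 17)   # upscale to an arbitrary size
down = resize_nearest(up, 4)   # and back down again
assert down == row             # round trip is lossless for constant color
```

Any filter behaves the same way here, since every sample it can take has the same value.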
When user audio came out, it was insecure in that the asset was not protected: you could -1 the ID and save the audio file itself from Google Chrome or something by right-clicking on it.
This has since been fixed; attempting to -1 a sound ID will just skip to a SolidModel ID or something not relevant. Sound IDs are secure.
I don't know why clothing never followed suit, since they clearly have the ability to block the asset.
I think if you examine the 1D, 1-channel case with reasonable assumptions (bicubic or Lanczos resampling), that is not true.
I also think your Nyquist claim is wrong. For instance, the bitmap [0 0 0 1 1 1 0 0 0 1 1 1 0 0 0] has a frequency lower than 2/x but will exhibit aliasing under continued upsampling and downsampling.
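For what it's worth, here's that exact bitmap round-tripped in a toy sketch. The filter choices (box-average for the downsample, nearest-neighbor for the upsample) are my own picks for illustration, not anything from the thread or from Roblox:

```python
# The bitmap from the post, round-tripped with a simple box-filter
# downsample (average adjacent pairs) and a nearest-neighbor upsample.
# Both filters are my own illustrative choices.

def box_down2(p):
    # average adjacent pairs; a trailing odd pixel is dropped for simplicity
    return [(p[i] + p[i + 1]) / 2 for i in range(0, len(p) - 1, 2)]

def nearest_up2(p):
    return [v for v in p for _ in range(2)]

bitmap = [0, 0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0]
down = box_down2(bitmap)   # [0.0, 0.5, 1.0, 0.0, 0.5, 1.0, 0.0]
up = nearest_up2(down)
print(up)                  # the hard 0/1 edges are smeared into 0.5s
```

One cycle already loses the sharp edges, which is the degradation being argued about.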
I second this, as I think that with the removal of tickets, we really need to get rid of price floors to make up for it.
Price floors have not stopped people from being able to steal other people's assets. All they have done is made it so that you are forced to monetize assets, stolen or not. This is no solution to the problem. Anyone with BC is still able to steal another person's assets. Bots can't re-upload stolen shirts, simply because it has always required BC to upload shirts.
If you want to do something that has an actual effect, add a fee for uploading shirts/pants. On top of that, as mentioned in the OP, a reporting system where the user provides a link to a shirt that was made before the re-uploaded one would be great, and it would require little moderation since it could mostly be automated. Automated warnings/bans could be put in place if the creator of an item sends a link to their original item and the two are significantly similar.
I think the price floors desperately need to be removed now that NBC players can no longer buy clothes (as they have no way of getting any income). Now all we will see are people in default colors and T-shirts. That's really lame and a big step downwards from the "IT'S FREEEE" Roblox.
Yes, I actually used decals for the images that I posted above, for faster testing (and because I don't have BC).
Also, I was able to get the texture (and .obj) files that are used for the 3D thumbnail; I don't think it would be that easy to secure that system.
If you know the downsampling method, you simply find a higher-resolution input that, when downsampled, exactly matches. The full-resolution images don't need to match.
(More likely, people would just have the slightly degraded quality versions.)
Actually, your example has very high frequency: it's the notorious step function (Heaviside's). The fact that the bitmap exhibits aliasing during downsampling is exactly the result of its highest band exceeding the Nyquist limit. That's the definition of aliasing.
With aliasing, in the frequency domain, the portion of the signal's band that sticks out beyond the Nyquist frequency is mirrored back against it and superimposed onto the rest of the signal. That is, unless you apply some anti-aliasing, usually in the form of pre-filtering of some sort, to limit the signal's band to stay under the limit.
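The fold-back can be shown in a few lines. With my own illustrative numbers (sampling at 8 Hz, so the Nyquist limit is 4 Hz), a 5 Hz cosine mirrors back to 3 Hz, since both sit 1 Hz away from the limit, and the two produce identical samples:

```python
import math

# Aliasing as fold-back: sampling at fs = 8 Hz gives a Nyquist limit
# of 4 Hz. A 5 Hz cosine (1 Hz past the limit) is mirrored back to
# 3 Hz (1 Hz below it); the two are indistinguishable once sampled.
# The frequencies are my own picks for illustration.
fs = 8
samples_5hz = [math.cos(2 * math.pi * 5 * n / fs) for n in range(16)]
samples_3hz = [math.cos(2 * math.pi * 3 * n / fs) for n in range(16)]
assert all(abs(a - b) < 1e-9 for a, b in zip(samples_5hz, samples_3hz))
```

Once the samples are identical, no post-filter can tell which frequency was really there, which is the point about pre-filtering below.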
P.S. By the way, now you know why most of the shader-based anti-aliasing techniques you see in games (FXAA, TSAA, xxAA) are not really anti-aliasing (i.e. suck): post-filtering a downsampled signal is a bad substitute.
It is more like finding hash collisions. The end user only sees the downsampled version. I just need to find any input that results in the same downsampled version. To do this, I only need to know about the downsampling algorithm.
Basically, the full-resolution images don't need to match, and likely will not match exactly.
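The collision idea can be sketched in a toy 1-D setup. The box/nearest filters here are my own stand-ins for whatever downsampler Roblox actually runs; the point is only that a nearest-neighbor upscale of the downsampled version is itself a valid "preimage":

```python
# "Hash collision" sketch: given only the downsampled version a client
# sees, build a *different* full-resolution image with the exact same
# downsample. Box/nearest filters are illustrative stand-ins.

def box_down2(p):
    return [(p[i] + p[i + 1]) / 2 for i in range(0, len(p), 2)]

def nearest_up2(p):
    return [v for v in p for _ in range(2)]

original = [0, 4, 4, 0, 2, 2, 6, 6]       # the shirt author's upload
client_sees = box_down2(original)         # [2.0, 2.0, 2.0, 6.0]
forgery = nearest_up2(client_sees)        # just the client copy, upscaled
assert forgery != original                # not the author's pixels...
assert box_down2(forgery) == client_sees  # ...but identical in-game
```

Finding *the* original would be hard; finding *an* input with the same downsample is trivial once you know the algorithm.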
What if the downsampling algorithm destroys information? For instance, maybe it averages two numbers.
You guys make it sound like I can take a black-and-white checkerboard, upsample then downsample it a million times, and expect to get something that isn't completely gray at the end of the process.
The original image already contained bonus information that was never useful. I don't need to duplicate this unused information at all, as I only care about how my shirt will render on potential buyers' clients.
For your example, the very first time you downsample it, you get the gray image. Your proposal says that the client will get this downsampled version, which is gray. If I want to copy your shirt, I only need to find any higher-resolution image that also downsamples to a gray image. I don't need to find your specific version.
OK, that makes sense. Let's say the downsample is InterpolationMode.HighQualityBicubic. I make a shirt and you want to steal it. What does your upsample algorithm do?