FAQ about RoadToPetabyte

We back up websites; we don't track you!

 

In January 2021 we reported on r/appledatahoarding and r/PetabyteLifestyle (previously r/RoadToPetabyte) that several website backup services had removed data from their services. Over the last months we have also implemented several backup projects, or rather, improved the workflow, because we had already done similar backups in the past. You can help us too, if you wish.

 

We all know that website owners can request the removal of links and other content from various platforms, and that previously submitted data can become corrupted, so we decided to implement a new strategy to keep the internet safe on WebArchive.

 

From now on, every link will contain

 

?rapplestyleroadtopetabytebackup

 

Example:

 

https://old.reddit.com/r/appledatahoarding/submit?rapplestyleroadtopetabytebackup

 

http://web.archive.org/web/*/https://old.reddit.com/r/appledatahoarding/submit?rapplestyleroadtopetabytebackup

 

https://old.reddit.com/r/appledatahoarding/comments/l3c5jt/we_adopt_a_new_backup_strategy/?rapplestyleroadtopetabytebackup

 

http://web.archive.org/web/*/https://old.reddit.com/r/appledatahoarding/comments/l3c5jt/we_adopt_a_new_backup_strategy/?rapplestyleroadtopetabytebackup

 

If you don't see the original content, try searching for it with ?rapplestyleroadtopetabytebackup appended.

 

Some websites add a = after the parameter, so in that case you should also try searching with ?rapplestyleroadtopetabytebackup=
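The marker strategy above can be sketched in a few lines of Python. The marker string is the one used in this FAQ; the helper names are just illustrative:

```python
# Sketch of the backup-parameter strategy described above.
# BACKUP_PARAM is the marker from this FAQ; function names are illustrative.
BACKUP_PARAM = "rapplestyleroadtopetabytebackup"

def add_backup_param(url: str) -> str:
    """Append the backup marker as a query parameter."""
    sep = "&" if "?" in url else "?"
    return f"{url}{sep}{BACKUP_PARAM}"

def wayback_search_url(url: str) -> str:
    """Build a Wayback Machine wildcard-search URL for a given page."""
    return f"http://web.archive.org/web/*/{url}"

page = "https://old.reddit.com/r/appledatahoarding/submit"
marked = add_backup_param(page)
print(marked)                      # the link as we share it
print(wayback_search_url(marked))  # where to look it up on WebArchive
```

This reproduces exactly the example links shown above: the marked link plus its WebArchive search URL.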

 

Obviously we can apply this strategy only to some links, and we still back up the original link as well. This means we already have two copies of the same data; in reality we keep many more copies, using various methods.

 

Since we use over 50 Mac browsers, we can apply this strategy only when using one of our 30+ Chromium-based browsers. Again, we don't use this strategy on every website: if we visit a page we think is worth backing up, we apply it. Since Safari is our main browser (though we still use Chrome too) and we don't apply this strategy to batch backups, only some links are available. We also don't apply this backup in stressful situations.

 

At the moment we don't apply this to our batch upload backup strategy.

 

Suggestion: many pages accessed from other websites are archived with parameters like ...id918858936?mt=12 and similar (for example on the Mac App Store). In such cases we suggest you also try adding ?rapplestyleroadtopetabytebackup or ?mt=12 and any other parameters you know, because a WebArchive search may return 0 results if no parameter is included in the URL. The same applies to language parameters (TikTok is the best example): ?lang=de, ?jp, etc.
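The tip above can be sketched as a small helper that generates candidate WebArchive search URLs with different parameter variants. The parameter list is illustrative, based on the examples in this FAQ:

```python
# Generate candidate Wayback Machine search URLs for a page that may have been
# archived with different query parameters (illustrative sketch of the tip above).
COMMON_PARAMS = [
    "rapplestyleroadtopetabytebackup",  # the marker from this FAQ
    "mt=12",                            # seen on Mac App Store links
    "lang=de",                          # language parameter, e.g. TikTok
    "lang=en",
]

def candidate_search_urls(url: str) -> list:
    """List WebArchive search URLs to try: bare URL first, then each variant."""
    base = url.rstrip("?")
    candidates = [base] + [f"{base}?{param}" for param in COMMON_PARAMS]
    return [f"http://web.archive.org/web/*/{u}" for u in candidates]

for search_url in candidate_search_urls("https://apps.apple.com/app/id918858936"):
    print(search_url)
```

Trying each of these in turn covers the "0 results because of a missing parameter" case described above.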

It is not important to us who you are, where you live, what you do, or any other personal information.

In the end we share all our browsing sessions (except some, such as Business / Finance, or when we use Firefox/Safari or similar browsers) with WebArchive and other websites (that is, with the public). So again: we back up webpages, and we don't track you or sell your information. Other websites can do that shit, not RoadToPetabyte. We are not interested in it and we will never change our mind. We are Archivists / DataHoarders, not Petyr Baelish, Erica Kravid, Amy Granderson, Cage Wallace, Mark, Zlatko Andronikov, the St. John Dairy Farm family, Oliver Trask, Samuel Sullivan, Doctor Bailey, or any other two-faced clown.

 

Why do I see a Perma.cc link?

This is another backup strategy used by RoadToPetabyte. Learn more about Perma.cc in the video or on the Harvard website.

Why doesn't the Perma.cc preview work?

We know about this issue. We generally check every website before we share the link, but sometimes when we recheck the same link after hours or days, we no longer see any preview. In that case we change the link to the screenshot preview. If you find missing previews, please click "See the Screenshot View" on the left, or access the live website directly by clicking "View the live page".

If the live website is no longer available, please use WebArchive and search for the same link. We generally back up all our visited webpages in two copies. If you get 0 results for the original link, please add ?rapplestyleroadtopetabytebackup to the link and search again. For example, instead of searching www.a.com you need to search www.a.com?rapplestyleroadtopetabytebackup

In recently shared posts you don't need to do anything, because the links generally already contain ?rapplestyleroadtopetabytebackup; in that case you just need to remove ?rapplestyleroadtopetabytebackup if you don't get any result for it in WebArchive.
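The with/without logic above amounts to a simple toggle: if a link already carries the marker, try the bare URL; otherwise append it. A minimal sketch (function name is illustrative):

```python
# Toggle between the two search variants described above.
BACKUP_PARAM = "?rapplestyleroadtopetabytebackup"

def toggle_backup_param(url: str) -> str:
    """Return the other search variant: strip the marker if present, else append it."""
    if url.endswith(BACKUP_PARAM + "="):      # some websites add a trailing =
        return url[: -len(BACKUP_PARAM) - 1]
    if url.endswith(BACKUP_PARAM):
        return url[: -len(BACKUP_PARAM)]
    return url + BACKUP_PARAM
```

So if one variant returns 0 results on WebArchive, search again with `toggle_backup_param(url)`.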

Even WebArchive doesn't work

Have you tried searching for a website on WebArchive using both methods (with and without ?rapplestyleroadtopetabytebackup, after checking the Perma.cc link), but you still get 0 results? This probably means both services deleted the entries, as we previously reported. In that case you could try checking the same link on Archive.is, but keep in mind that this service bans IPs (and removes previously submitted links) when you upload in batch. So you may find a result, but there is no guarantee.

Another good website is Time Travel, which can be used directly from a Perma.cc link. You can search on archive.today, Archive-It, Arquivo.pt: the Portuguese Web Archive, Bibliotheca Alexandrina Web Archive, DBpedia archive, DBpedia Triple Pattern Fragments archive, Canadian Government Web Archive, Croatian Web Archive, Estonian Web Archive, Icelandic web archive, Internet Archive, Library of Congress Web Archive, NARA Web Archive, National Library of Ireland Web Archive, National Records of Scotland, perma.cc, PRONI Web Archive, Slovenian Web Archive, Stanford Web Archive, UK Government Web Archive, UK Parliament's Web Archive, UK Web Archive, Web Archive Singapore, WebCite, Bayerische Staatsbibliothek at the same time.
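Time Travel is built on the Memento protocol, which aggregates many of the archives listed above in one query. A sketch of building its TimeMap lookup URL; note that the URL pattern is our assumption based on the public Memento Time Travel service, so verify it before relying on it:

```python
# Build a Memento Time Travel TimeMap URL for a page (sketch; the endpoint
# pattern is an assumption based on the public timetravel.mementoweb.org service).
def timemap_url(url: str) -> str:
    return f"http://timetravel.mementoweb.org/timemap/link/{url}"

print(timemap_url("https://old.reddit.com/r/appledatahoarding/"))
```

The returned TimeMap lists snapshots across the participating archives, which is what makes Time Travel a convenient single entry point.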

Last option: reverse search Archive.is, WebArchive or Perma.cc links on all the websites cited above, and you will probably find a backup copy.

If you can't find a link we shared previously, you can also write us a message; maybe we archived that website using other methods (PDF, WebArchive, other services, ...). Please don't write to us if the link was not shared by us directly on our subreddits or websites. It should be almost impossible not to find a backup copy, thanks to our complex backup strategy, but it can still happen.

Why do you use short links?

We use different types of short links (about 10-20). Again, even here we are not interested in your data; we use such services to manage our links in a much better way, so we can also replace broken links (for example if accounts / websites are banned / disabled / removed). They also help us reference frequent websites much faster, and they are a cleaner way to write posts. You can also access websites / extensions much faster (even through third-party social networks), since there is generally a specific structure behind the link creation. Not to mention the main goal of short links: to have a short link! So it doesn't matter whether you see a bit.ly, cutt.ly, pixly.me or other link in our posts, since we use all of them in the same identical way: to link to the official website, not to lead you to spam/malware websites. Sometimes we use short link services to show you a proper preview of the article, because a Perma.cc link just shows the same identical preview for every single link, which would simply worsen the user experience on Reddit.

 

We could use a custom domain like rtp.ho (as many other websites do) to create such short links, so you would get the impression we use our website directly. But no, we don't do that. We want to keep it this way, so people know the difference. If you still have questions, you can reach us here (please note: we check the e-mail only 1-2x per week).

 

We also generally don't use free short link services (as most people do), especially to avoid real malware, spam, ads, etc. We pay for such services to provide high-quality content with permanent backups.

 

Please keep in mind that not all our short links contain ?rapplestyleroadtopetabytebackup, especially if we created the URL redirection before we implemented the new backup strategy. Sometimes we avoid using it, because it generates view conflicts.

 

I don't like this project

If you still don't trust us, you can search websites manually or use extensions like https://unshorten.link/. But remember, you do get tracked / analyzed by many other websites (just think of all the ?ref=producthunt, Mac App Store, etc. parameters available everywhere...). We still respect your decision. If you see similar messages:


You can still try using ?rapplestyleroadtopetabytebackup to check whether a version has been archived.

PS: remember to also test ?ita ?fr ?de ?en or ?lang=it ?lang=de ?lang=en ?lang=fr etc., because some websites work that way.

Do I need to know anything else?

Sometimes you may reach a website in Italian, French or German, because we live in Switzerland. If we discover such errors, we try to fix them immediately. We generally choose USA as the default country, with English as the language. Pictures from our Macs are in Italian or German, because these are our mother languages. For important tutorials we use an English-language Mac user account, and we generally translate important messages into English. The same goes for non-working redirection links: if we discover them, we try to fix them. Some links are affiliate links, which help pay for the days of work behind tutorials, researching content material, trying services for you, and much more.

Please keep in mind that we use 10+ Macs on Sierra through Mojave. We use Big Sur and Catalina only via Parallels Desktop or a bootable SSD, and we won't test such apps directly until we buy a Mac that is no longer compatible with Mojave (2022 or so). Even if you read about APFS problems, we cannot confirm whether such problems are present on Apple M1 (but we think they still haven't fixed them, based on our experience with Apple devs so far).

Who are you?

We are Swiss DataHoarders, Editors (and more) using macOS since 2009 on 10+ Macs and 300+ Mac user accounts with 10k+ Mac apps. We use up to 5-10 similar/alternative Mac apps for the same topic/task, more than 50 Mac browsers, up to 400+ extensions and 15+ cloud services at the same time. We want to help you by sharing our experiences. In recent years we have also suggested over 6000 improvements to 600+ Mac app, browser, or web devs (from basic improvements to pro usage features). We use macOS in an extreme way (that probably no one else is doing, as all the devs have confirmed to us), so we discover a lot of Mac app, macOS, and other limitations and problems (especially since we repeat the same tasks across 100+ user accounts). So we don't just want to share information; we want to help improve Mac apps, so you can enjoy the new features too.

We also opened different subreddits not directly related to macOS/DataHoarding, so that people can express their right to keep their accounts safe, or share their ideas about important topics/implementations/features refused, removed, or ignored by devs.