StumbleUponAwesome

An awesome internet discovery button, for developers, tech and science lovers.

A browser extension that takes you to a random site from one of the awesome curated lists. Like good ol’ StumbleUpon (which is now dead).

⚡️ Install Chrome Extension · ⚡️ Install Firefox Add-on

There are 45,787 unique sites from 554 awesome lists on GitHub, thanks to kind contributors. There are some hidden gems waiting in there 💎 .


How to use it:

To stumble: Simply click on the ⚡️ extension button → go to a new awesome site!

(or use **`Alt`** + **`Shift`** + **`S`**)
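In a WebExtension, a shortcut like this is declared in the manifest’s `commands` section. Here’s a hedged sketch of what that might look like (the command name `stumble` and its description are illustrative, not the extension’s actual identifiers):

```json
{
  "commands": {
    "stumble": {
      "suggested_key": {
        "default": "Alt+Shift+S"
      },
      "description": "Stumble to a random awesome site"
    }
  }
}
```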


꩜ Introducing: The Rabbit Hole

We have all been down internet rabbit holes. One minute you’re casually reading the news; the next, you’ve read so much about a random topic that you might as well give a TED talk.

What just happened? The rabbit hole pulled you in and you lost track of time, but you also might have discovered something awesome. So why not embrace it, with a fancy button for it?

Stay stumblin’ on the same topic, or exit back to random mode.


Setup

  1. Clone or fork this repository
  2. Open Chrome/Brave or other Chromium-based browser
  3. Open the extensions page at chrome://extensions
  4. Enable developer mode
  5. Click “Load unpacked” and select the /extension folder.

Development

Here are some of the things I’d like to build out for this extension. The main focus right now, however, is simply to curate the links as well as I can, add more data sources, and make sure the pages are a good mix of the interesting, useful, fun, and exciting.

→ Changelog

A note about permissions

This extension requires the <all_urls> permission, in order to show the overlay UI on every stumble page that you visit. It does not access data on these sites. There is no tracking, or analytics of any kind, and state is only stored locally.
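For reference, a host permission like this lives in the extension’s manifest. A minimal sketch, assuming Manifest V2 (`storage` is an assumption based on the locally-stored state mentioned above; Manifest V3 moves host permissions into a separate `host_permissions` key):

```json
{
  "permissions": [
    "<all_urls>",
    "storage"
  ]
}
```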

Credit to the curators ✔

This extension is made possible by awesome people curating the internet.

A note about the dataset

It’s completely local: you can find it under /extension/data. It’s generated with awesome_scraper.py.
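The scraper itself isn’t reproduced here, but the core idea, pulling URLs out of an awesome list’s markdown, can be sketched in Python. The regex and function name below are illustrative assumptions, not the actual awesome_scraper.py internals:

```python
import re

# Markdown links look like [title](https://example.com); capture the URL part.
LINK_RE = re.compile(r"\[[^\]]+\]\((https?://[^)\s]+)\)")

def extract_links(markdown_text):
    """Return the unique external URLs found in an awesome list's markdown."""
    seen = []
    for url in LINK_RE.findall(markdown_text):
        if url not in seen:
            seen.append(url)
    return seen

sample = "- [Site A](https://a.example) and [Site B](https://b.example)"
print(extract_links(sample))
# → ['https://a.example', 'https://b.example']
```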

Maintaining quality

To make sure that every link works and is relevant, the dataset is cleaned. Any dead or broken links are removed, as well as links to CI pipelines, recursive links, donation links, etc. This is done with the cleanup functions in utils.py. Running this script can take a few hours on a slow connection.
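The filtering step might look roughly like this. A sketch only: the blocked patterns and function name are assumptions for illustration, not the real utils.py code (the real cleanup also makes network requests to detect dead links, which is why it can take hours):

```python
# Substrings that mark links we never want to stumble to (illustrative list).
BLOCKED_PATTERNS = (
    "travis-ci.org",        # CI pipelines
    "github.com/sponsors",  # donation links
    "/awesome-",            # recursive links back into other awesome lists
)

def is_stumbleable(url):
    """Keep only links that point at real content, not repo plumbing."""
    return not any(pattern in url.lower() for pattern in BLOCKED_PATTERNS)

links = [
    "https://example.com/cool-tool",
    "https://travis-ci.org/foo/bar",
    "https://github.com/sponsors/someone",
]
print([u for u in links if is_stumbleable(u)])
# → ['https://example.com/cool-tool']
```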

After they are removed from the dataset, a record of dead or broken links (those with 404, SSL, or other server errors) is saved in text files after every scrape.

❗️If you are one of the awesome list maintainers, find the text file for your awesome-list to check for dead links and remove them from your list, or update with a valid URL. If the file is empty, all good!

Contribute

☝️ Submit an issue · 🤘 Submit a PR

✨ Stay curious!