Commit log (newest first)

...
We now normalise URLs starting with / (relative to the site root) and //
(relative to the protocol).
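A minimal sketch of the behaviour described here, assuming the `url` crate; the commit does not say how the normalisation is implemented, so the helper name and approach are illustrative only:

```rust
// Assumed dependency in Cargo.toml: url = "2"
use url::Url;

/// Resolve a raw href found on `base` into an absolute URL.
/// Covers "//host/path" (protocol-relative) and "/path"
/// (root-relative), plus ordinary relative links.
fn normalise(base: &Url, href: &str) -> Option<Url> {
    // Url::join implements RFC 3986 reference resolution, which
    // handles both cases: "//host/p" keeps the base scheme, and
    // "/p" keeps the base scheme and host.
    base.join(href).ok()
}

fn main() {
    let base = Url::parse("https://example.com/a/b.html").unwrap();
    assert_eq!(
        normalise(&base, "//cdn.example.com/x").unwrap().as_str(),
        "https://cdn.example.com/x" // protocol-relative
    );
    assert_eq!(
        normalise(&base, "/about").unwrap().as_str(),
        "https://example.com/about" // root-relative
    );
}
```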
Extremely basic implementation. Still needs a maximum queue size, error
handling, and formatting of parsed links.
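One way to get the missing maximum queue size would be a bounded channel; as a sketch (assuming std's mpsc, which is not necessarily what the crawler uses), `sync_channel` blocks the sender once the buffer is full:

```rust
use std::sync::mpsc::sync_channel;
use std::thread;

fn main() {
    // Bounded queue: send() blocks once 1000 URLs are in flight,
    // which throttles link discovery for free.
    let (tx, rx) = sync_channel::<String>(1000);

    let producer = thread::spawn(move || {
        for i in 0..10 {
            tx.send(format!("https://example.com/page/{i}")).unwrap();
        }
        // tx is dropped when this thread ends, so recv() below
        // can finish instead of blocking forever.
    });

    // recv() blocks until a URL arrives and returns Err once every
    // sender is gone, ending the loop cleanly.
    while let Ok(url) = rx.recv() {
        println!("would crawl {url}");
    }
    producer.join().unwrap();
}
```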
Starts by filling a queue with the top 1000 most visited sites,
"crawls" each one (currently an empty fn), and blocks for new elements
on the queue.
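A sketch of that structure, using std's mpsc channel as the queue (the actual queue type and seed list are not shown in the commit):

```rust
use std::sync::mpsc::channel;

/// Placeholder for the real crawl logic; per the commit, this is
/// still an empty fn at this stage.
fn crawl(_url: &str) {}

fn main() {
    let (tx, rx) = channel::<String>();

    // Seed the queue. Two stand-in sites here; the commit seeds the
    // top 1000 most visited sites.
    for seed in ["https://example.com", "https://example.org"] {
        tx.send(seed.to_string()).unwrap();
    }

    // Dropped so this sketch terminates; the real crawler would keep
    // a sender around to enqueue links discovered while crawling.
    drop(tx);

    // recv() blocks until a new element is available, matching the
    // "blocks for new elements on the queue" behaviour described above.
    while let Ok(url) = rx.recv() {
        crawl(&url);
    }
}
```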
This fixes VS Code not being able to find rust-analyzer and rust-src.
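One common way to make both discoverable (an assumption; the commit does not show the actual change) is a rust-toolchain.toml, which tells rustup to install the components the editor looks for:

```toml
# rust-toolchain.toml at the repository root (hypothetical contents)
[toolchain]
channel = "stable"
components = ["rust-src", "rust-analyzer"]
```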
We now have a Cargo workspace with the Crawler, Client and Indexer
packages.
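The root Cargo.toml this implies, as a sketch (member directory names are assumed to be the lowercase package names):

```toml
# Cargo.toml at the workspace root
[workspace]
members = ["crawler", "client", "indexer"]
```

With this in place, `cargo build` from the root builds all three packages.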
This is the initial commit for this search-engine experiment. I hope I
can learn a lot from this!