path: root/Cargo.lock

Commit log (each entry: message, author, date, files changed, lines -/+):
* Indexer: Add website title and description to the CrawledResource (Baitinq, 2022-10-28, 1 file, -0/+1)
  We now parse the HTML and extract the title and description of the site.
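
  A minimal sketch of that extraction, assuming the scraper crate does the parsing (the log does not name the parser):

      use scraper::{Html, Selector};

      // Pull the <title> text and the meta description out of a page.
      fn title_and_description(html: &str) -> (Option<String>, Option<String>) {
          let doc = Html::parse_document(html);
          let title_sel = Selector::parse("title").unwrap();
          let desc_sel = Selector::parse(r#"meta[name="description"]"#).unwrap();
          let title = doc
              .select(&title_sel)
              .next()
              .map(|t| t.text().collect::<String>());
          let description = doc
              .select(&desc_sel)
              .next()
              .and_then(|m| m.value().attr("content"))
              .map(str::to_string);
          (title, description)
      }
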
* Frontend: Order search results by priority (Baitinq, 2022-10-27, 1 file, -0/+1)
  We do this by implementing the PartialOrd and Ord traits for the CrawledResource struct and then using Itertools::sorted() on the display iterator.
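
  A minimal sketch of that ordering, with hypothetical CrawledResource fields:

      use itertools::Itertools;
      use std::cmp::Ordering;

      #[derive(PartialEq, Eq)]
      struct CrawledResource {
          url: String,
          priority: u32, // assumed field; the real struct is not shown in this log
      }

      // Order resources by priority so search results can be sorted.
      impl Ord for CrawledResource {
          fn cmp(&self, other: &Self) -> Ordering {
              self.priority.cmp(&other.priority)
          }
      }

      impl PartialOrd for CrawledResource {
          fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
              Some(self.cmp(other))
          }
      }

      fn print_sorted(results: Vec<CrawledResource>) {
          // Itertools::sorted() consumes the iterator and yields items in ascending order.
          for r in results.into_iter().sorted() {
              println!("{} (priority {})", r.url, r.priority);
          }
      }
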
* Frontend: Fetch results from indexer (Baitinq, 2022-10-27, 1 file, -0/+3)
* Indexer: Set up permissive CORS (Baitinq, 2022-10-27, 1 file, -0/+16)
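
  A minimal sketch, assuming the indexer is an actix-web service (the framework is not named in this log):

      use actix_cors::Cors;
      use actix_web::{App, HttpServer};

      #[actix_web::main]
      async fn main() -> std::io::Result<()> {
          HttpServer::new(|| {
              // Cors::permissive() allows any origin, method and header --
              // convenient for development, too open for production.
              App::new().wrap(Cors::permissive())
          })
          .bind(("127.0.0.1", 8080))?
          .run()
          .await
      }
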
* Indexer: Return JSON from the /search endpoint (Baitinq, 2022-10-27, 1 file, -0/+1)
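
  A sketch of what such a handler could look like, again assuming actix-web plus serde; the response type and route shape are hypothetical:

      use actix_web::{get, web, Responder};
      use serde::Serialize;

      #[derive(Serialize)]
      struct SearchResult {
          url: String,
          priority: u32,
      }

      // web::Json serializes the results and sets the application/json content type.
      #[get("/search/{query}")]
      async fn search(query: web::Path<String>) -> impl Responder {
          let results: Vec<SearchResult> = vec![SearchResult {
              url: format!("https://example.com/?q={}", query),
              priority: 1,
          }];
          web::Json(results)
      }
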
* Frontend: Add basic search_query state (Baitinq, 2022-10-26, 1 file, -1/+103)
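
  A minimal sketch of that state, assuming a yew function component with the use_state hook:

      use web_sys::HtmlInputElement;
      use yew::prelude::*;

      #[function_component(App)]
      fn app() -> Html {
          // search_query is re-rendered into the view whenever it is set().
          let search_query = use_state(String::new);
          let oninput = {
              let search_query = search_query.clone();
              Callback::from(move |e: InputEvent| {
                  if let Some(input) = e.target_dyn_into::<HtmlInputElement>() {
                      search_query.set(input.value());
                  }
              })
          };
          html! {
              <div>
                  <input {oninput} />
                  <p>{ format!("Searching for: {}", *search_query) }</p>
              </div>
          }
      }
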
* Crawler: Use async Client (Baitinq, 2022-10-25, 1 file, -40/+139)
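
  A minimal sketch, assuming reqwest on top of tokio (common choices, not confirmed by this log):

      use reqwest::Client;

      #[tokio::main]
      async fn main() -> Result<(), reqwest::Error> {
          // One shared async Client reuses connections across requests.
          let client = Client::new();
          let body = client
              .get("https://example.com")
              .send()
              .await?
              .text()
              .await?;
          println!("fetched {} bytes", body.len());
          Ok(())
      }
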
* Indexer: Use CrawledResource structure as values in the reverse index db (Baitinq, 2022-10-25, 1 file, -0/+1)
  This will allow us to integrate priorities and other improvements.
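
  A sketch of the idea, with hypothetical field names: the reverse index maps each word to the set of resources containing it, so a CrawledResource value can carry priority and any future metadata:

      use std::collections::{HashMap, HashSet};

      #[derive(PartialEq, Eq, Hash)]
      struct CrawledResource {
          url: String,
          priority: u32, // assumed field
      }

      #[derive(Default)]
      struct ReverseIndex {
          db: HashMap<String, HashSet<CrawledResource>>,
      }

      impl ReverseIndex {
          fn add(&mut self, word: &str, resource: CrawledResource) {
              self.db.entry(word.to_string()).or_default().insert(resource);
          }

          fn search(&self, word: &str) -> Option<&HashSet<CrawledResource>> {
              self.db.get(word)
          }
      }
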
* Crawler: Shuffle crawled URLs (Baitinq, 2022-10-25, 1 file, -0/+1)
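
  A minimal sketch with the rand crate, so the crawler does not work through one host's links in a long run:

      use rand::seq::SliceRandom;

      fn shuffle_urls(urls: &mut Vec<String>) {
          // Randomize crawl order before pushing the URLs onto the queue.
          urls.shuffle(&mut rand::thread_rng());
      }
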
* Crawler: Parse URLs with the "url" crate (Baitinq, 2022-10-25, 1 file, -0/+1)
  This fixes relative URLs, improves URL filtering and validation, and brings many other improvements.
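
  A minimal sketch of resolving a possibly-relative href against the page it came from; Url::join handles both absolute and relative cases:

      use url::Url;

      fn normalize(base: &str, href: &str) -> Option<Url> {
          let base = Url::parse(base).ok()?;
          let absolute = base.join(href).ok()?;
          // Only keep http(s) links; mailto:, javascript:, etc. are filtered out.
          match absolute.scheme() {
              "http" | "https" => Some(absolute),
              _ => None,
          }
      }
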
* Client->Frontend: Create yew frontend skeleton (Baitinq, 2022-10-24, 1 file, -4/+215)
  We have replaced the client with a yew frontend.
* Crawler: Change blockingqueue to channels (Baitinq, 2022-10-23, 1 file, -7/+33)
  We now use the async-channel crate's channel implementation, which gives us bounded async channels.
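
  A minimal sketch of such a channel between the URL producer and the crawl workers, assuming a tokio runtime:

      use async_channel::bounded;

      #[tokio::main]
      async fn main() {
          // The bound gives us backpressure: send() awaits while the channel is full.
          let (tx, rx) = bounded::<String>(1000);

          tokio::spawn(async move {
              tx.send("https://example.com".to_string()).await.unwrap();
          });

          // recv() returns Err once every sender has been dropped.
          while let Ok(url) = rx.recv().await {
              println!("crawl: {url}");
          }
      }
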
* Indexer: Implement basic reverse index searching and adding (Baitinq, 2022-10-22, 1 file, -7/+80)
  Very inefficient, but kind of functional :)
* Crawler: Implement basic async functionality (Baitinq, 2022-10-22, 1 file, -54/+239)
* Indexer: Add skeleton HTTP REST endpoint functionality (Baitinq, 2022-10-21, 1 file, -0/+506)
  Adds the /search and /resource endpoints.
* Crawler: Remove duplicate parsed URLs (Baitinq, 2022-10-20, 1 file, -0/+16)
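
  A minimal sketch: a HashSet tracks what has been seen, keeping the first occurrence of each URL:

      use std::collections::HashSet;

      fn dedup_urls(urls: Vec<String>) -> Vec<String> {
          let mut seen = HashSet::new();
          // HashSet::insert returns false for values it already contains.
          urls.into_iter().filter(|u| seen.insert(u.clone())).collect()
      }
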
* Crawler: Add basic HTML parsing and link-following (Baitinq, 2022-10-20, 1 file, -0/+1525)
  Extremely basic implementation. Still needs a maximum queue size, error handling, and formatting of parsed links.
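
  A minimal sketch of the link extraction, assuming the scraper crate:

      use scraper::{Html, Selector};

      fn extract_links(html: &str) -> Vec<String> {
          let document = Html::parse_document(html);
          let selector = Selector::parse("a[href]").unwrap();
          document
              .select(&selector)
              .filter_map(|a| a.value().attr("href"))
              .map(str::to_string)
              .collect()
      }
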
* Crawler: Add skeleton crawler implementation (Baitinq, 2022-10-20, 1 file, -0/+9)
  Starts by filling a queue with the top 1000 most visited sites, "crawls" each one (with an empty fn), and blocks waiting for new elements on the queue.
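
  A sketch of that shape (the real version blocks on a concurrent queue rather than draining a VecDeque):

      use std::collections::VecDeque;

      // Intentionally empty in the skeleton.
      fn crawl(url: &str) {
          let _ = url;
      }

      fn main() {
          // Hypothetical seed list standing in for the top 1000 most visited sites.
          let seed = (0..1000).map(|i| format!("https://site{i}.example"));
          let mut queue: VecDeque<String> = seed.collect();
          while let Some(url) = queue.pop_front() {
              crawl(&url);
          }
      }
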
* Misc: Separate OSSE into components (Baitinq, 2022-10-19, 1 file, -1/+9)
  We now have a cargo workspace with the Crawler, Client and Indexer packages.
* Initial Commit! (Baitinq, 2022-10-19, 1 file, -0/+7)
  This is the initial commit for this experiment of a search engine. I hope I can learn a lot from this!