Commit log (most recent first). Each entry lists the commit message, author, date, number of files changed, and lines removed/added.
* **Indexer: Add website title and description to the CrawledResource** (Baitinq, 2022-10-28, 1 file, -0/+1)
  We now parse the HTML and extract the title and description of the site.
* **Frontend: Order search results by priority** (Baitinq, 2022-10-27, 1 file, -0/+1)
  We do this by implementing the PartialOrd and Ord traits on the CrawledResource struct and then using Itertools::sorted() on the display iterator.
* **Frontend: Fetch results from indexer** (Baitinq, 2022-10-27, 1 file, -0/+3)
* **Indexer: Set up permissive CORS** (Baitinq, 2022-10-27, 1 file, -0/+16)
* **Indexer: Return JSON from the /search endpoint** (Baitinq, 2022-10-27, 1 file, -0/+1)
* **Frontend: Add basic search_query state** (Baitinq, 2022-10-26, 1 file, -1/+103)
* **Crawler: Use async Client** (Baitinq, 2022-10-25, 1 file, -40/+139)
* **Indexer: Use CrawledResource structure as values in the reverse index db** (Baitinq, 2022-10-25, 1 file, -0/+1)
  This will allow us to integrate priorities and other improvements.
* **Crawler: Shuffle crawled urls** (Baitinq, 2022-10-25, 1 file, -0/+1)
* **Crawler: Parse urls with the "url" crate** (Baitinq, 2022-10-25, 1 file, -0/+1)
  This fixes relative urls, improves url filtering and validation, and brings many other improvements.
* **Client->Frontend: Create yew frontend skeleton** (Baitinq, 2022-10-24, 1 file, -4/+215)
  We have replaced the client with a yew frontend.
* **Crawler: Change blockingqueue to channels** (Baitinq, 2022-10-23, 1 file, -7/+33)
  We now use the async-channel crate, which gives us bounded async channels.
* **Indexer: Implement basic reverse index searching and adding** (Baitinq, 2022-10-22, 1 file, -7/+80)
  Very inefficient but kind of functional :::)))))))
* **Crawler: Implement basic async functionality** (Baitinq, 2022-10-22, 1 file, -54/+239)
* **Indexer: Add skeleton HTTP REST endpoint functionality** (Baitinq, 2022-10-21, 1 file, -0/+506)
  Adds the /search and /resource endpoints.
* **Crawler: Remove duplicate parsed urls** (Baitinq, 2022-10-20, 1 file, -0/+16)
* **Crawler: Add basic html parsing and link-following** (Baitinq, 2022-10-20, 1 file, -0/+1525)
  Extremely basic implementation. Needs a max queue size, error handling, and formatting of parsed links.
* **Crawler: Add skeleton crawler implementation** (Baitinq, 2022-10-20, 1 file, -0/+9)
  Starts by filling a queue with the top 1000 most visited sites. "Crawls" each one (currently an empty function) and blocks waiting for new elements on the queue.
* **Misc: Separate OSSE into components** (Baitinq, 2022-10-19, 1 file, -1/+9)
  We now have a cargo workspace with the Crawler, Client and Indexer packages.
* **Initial Commit!** (Baitinq, 2022-10-19, 1 file, -0/+7)
  This is the initial commit for this experiment of a search engine. I hope I can learn a lot from this!