Commit message (Author, Date, Files, Lines -/+)
* Crawler+Indexer+Frontend: Rename structs to follow logical relations (Baitinq, 2022-10-29, 3 files, -24/+31)
  Now Resource is CrawledResource, as it is created by the crawler, and the previous CrawledResource is now IndexedResource, as it's created by the indexer.
* Frontend: Use ResultComponent to display search results (Baitinq, 2022-10-29, 1 file, -3/+14)
* Indexer: Implement basic priority calculation of words in a site (Baitinq, 2022-10-29, 1 file, -7/+6)
  We just calculate priority as the number of occurrences of the word in the site. This is very basic and should be changed :)
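The occurrence-count priority this commit describes can be sketched in plain Rust; the function name and types here are hypothetical, not the indexer's actual code:

```rust
/// Count how many times `word` appears in a page's text.
/// A minimal sketch of the "priority = occurrence count" idea
/// from this commit; real field/function names may differ.
fn word_priority(word: &str, site_text: &str) -> u32 {
    site_text
        .split_whitespace()
        .filter(|w| w.eq_ignore_ascii_case(word))
        .count() as u32
}
```

As the message says, this is crude: it ignores word position, HTML structure, and document length, all of which a fuller ranking would weigh.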
* Frontend: Show results in reverse order with priority (Baitinq, 2022-10-28, 1 file, -2/+2)
* Frontend: Show result website's title and description (Baitinq, 2022-10-28, 1 file, -2/+4)
* Crawler: Only accept HTTP_STATUS_CODE: 200 as success in crawl_url() (Baitinq, 2022-10-28, 1 file, -3/+4)
* Indexer: Add website title and description to the CrawledResource (Baitinq, 2022-10-28, 3 files, -1/+26)
  We now parse the HTML and extract the title and description of the site.
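The commit doesn't say which HTML parser is used, so as a hedge here is a deliberately naive, std-only illustration of pulling a title out of raw HTML; a real implementation would use a proper parsing crate:

```rust
/// Simplified sketch of extracting a page title from raw HTML.
/// Just slices between the first <title> and </title> tags;
/// a real indexer would use an actual HTML parser instead.
fn extract_title(html: &str) -> Option<String> {
    let start = html.find("<title>")? + "<title>".len();
    let end = html[start..].find("</title>")? + start;
    Some(html[start..end].trim().to_string())
}
```

The same slicing idea would apply to `<meta name="description">`, though attribute order makes that case messier for string matching.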
* Frontend: Refactor search_word_in_db() to not need explicit lifetimes (Baitinq, 2022-10-28, 1 file, -6/+6)
* Frontend: Improve responsive layout (Baitinq, 2022-10-28, 1 file, -34/+37)
  We now kinda use flexbox, I think? Needs lots of work regarding centering the search box. Kind of functional for now :)
* Frontend: Make the results state Optional (Baitinq, 2022-10-28, 1 file, -15/+27)
  We now return "No result!" if the user has actually searched for something and no results were found.
* Misc: Add TODOs (Baitinq, 2022-10-28, 3 files, -1/+3)
* Frontend: Html: Set footer at the bottom of the page (Baitinq, 2022-10-28, 1 file, -36/+38)
* Frontend: Logically structure html (Baitinq, 2022-10-28, 1 file, -35/+39)
* Frontend: Add local bootstrap files (Baitinq, 2022-10-28, 3 files, -5/+18)
* Frontend: Order search results by priority (Baitinq, 2022-10-27, 3 files, -2/+20)
  We do this by implementing the PartialOrd and Ord traits for the CrawledResource struct and then using Itertools::sorted() on the display iterator.
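The trait implementation this commit describes could look roughly like the following; the struct's fields are assumed, and std's `sort()` stands in here for itertools' `sorted()`, which relies on the same `Ord` impl:

```rust
use std::cmp::Ordering;

// Assumed shape of CrawledResource; only `priority` matters for ordering.
#[derive(Debug, PartialEq, Eq)]
struct CrawledResource {
    url: String,
    priority: u32,
}

// Order resources by priority so search results can be sorted for display.
impl Ord for CrawledResource {
    fn cmp(&self, other: &Self) -> Ordering {
        self.priority.cmp(&other.priority)
    }
}

impl PartialOrd for CrawledResource {
    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
        Some(self.cmp(other))
    }
}
```

With `Ord` in place, `results.into_iter().sorted()` (itertools) or `results.sort()` (std) both yield ascending priority order; reversing gives the highest-priority result first.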
* Frontend: Use display_results() function for rendering CrawledResources (Baitinq, 2022-10-27, 1 file, -5/+14)
* Frontend: Fetch results from indexer (Baitinq, 2022-10-27, 3 files, -33/+53)
* Crawler: Abstract database word fetching with search_word_in_db() (Baitinq, 2022-10-27, 1 file, -2/+10)
* Indexer: Add /search with no query endpoint (Baitinq, 2022-10-27, 1 file, -0/+6)
  Just returns [].
* Crawler: Replace String::from with .to_string() (Baitinq, 2022-10-27, 1 file, -3/+6)
* Indexer: Set up permissive CORS (Baitinq, 2022-10-27, 3 files, -1/+21)
* Indexer: Return JSON from the /search endpoint (Baitinq, 2022-10-27, 3 files, -7/+6)
* Frontend: Add results field to the state and set dummy results (Baitinq, 2022-10-26, 1 file, -2/+46)
* Frontend: Add basic search_query state (Baitinq, 2022-10-26, 3 files, -8/+158)
* Frontend: Add basic layout (Baitinq, 2022-10-26, 2 files, -1/+43)
* Frontend: Update index.html to include bootstrap (Baitinq, 2022-10-25, 1 file, -3/+16)
  Also sets up the viewport and title.
* Crawler: Fix bad error handling with match handling (Baitinq, 2022-10-25, 1 file, -6/+9)
* Crawler: Use async Client (Baitinq, 2022-10-25, 4 files, -48/+152)
* Indexer: Use CrawledResource structure as values in the reverse index db (Baitinq, 2022-10-25, 3 files, -11/+45)
  This will allow us to integrate priorities and other improvements.
* Indexer: Add "correct" error handling (Baitinq, 2022-10-25, 1 file, -7/+7)
* Crawler: Shuffle crawled urls (Baitinq, 2022-10-25, 3 files, -4/+5)
* Crawler: Add "correct" error handling (Baitinq, 2022-10-25, 1 file, -21/+23)
* Crawler: Parse urls with the "url" crate (Baitinq, 2022-10-25, 3 files, -25/+26)
  This fixes relative urls, improves url filtering and validation, and brings many other improvements.
* Crawler: Add crawled url filter (Baitinq, 2022-10-24, 1 file, -1/+8)
  This filters hrefs such as "/", "#", or "javascript:".
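A filter like the one this commit describes can be a handful of string checks; the function name is hypothetical and the real filter may cover more cases:

```rust
/// Reject hrefs that aren't crawlable links, such as "/", fragment-only
/// "#..." anchors, and "javascript:" pseudo-urls. A sketch of the idea.
fn is_crawlable_href(href: &str) -> bool {
    !(href.is_empty()
        || href == "/"
        || href.starts_with('#')
        || href.starts_with("javascript:"))
}
```

In a crawl loop this would sit between link extraction and queueing, so junk hrefs never reach the url parser or the queue.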
* Flake: Add rust-analyzer package (Baitinq, 2022-10-24, 1 file, -0/+1)
* Crawler: Set queue size to 2222 (Baitinq, 2022-10-24, 1 file, -1/+1)
* Misc: Update build/run instructions (Baitinq, 2022-10-24, 1 file, -2/+4)
  Now shows how to run each module plus the yew frontend.
* Client->Frontend: Create yew frontend skeleton (Baitinq, 2022-10-24, 8 files, -14/+238)
  We have replaced the client with a yew frontend.
* Crawler+Indexer: Rust cleanup (Baitinq, 2022-10-23, 2 files, -14/+6)
  Getting more familiar with the language, so fixed some non-optimal into_iter() usage, unnecessary .clone()s, and an unnecessary hack where we could just get a &mut for inserting into the indexer url database.
* Crawler: Replace println! with dbg! (Baitinq, 2022-10-23, 1 file, -7/+7)
* Crawler: Remove prepending of https:// to each url (Baitinq, 2022-10-23, 2 files, -1006/+1006)
  We now prepend it to the top-1000-urls list instead. This fixes crawled urls having a doubled https:// prefix.
* Crawler: Only crawl 2 urls per url (Baitinq, 2022-10-23, 1 file, -0/+6)
  This makes it so that we don't get rate limited by websites.
* Crawler: Change blockingqueue to channels (Baitinq, 2022-10-23, 3 files, -19/+45)
  We now use the async-channel crate's channel implementation. This allows us to have bounded async channels.
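The point of a bounded channel is backpressure: once the buffer is full, producers wait instead of piling up urls without limit. async-channel's `bounded()` gives this for async code; the std-only sketch below uses `sync_channel` (its blocking analogue) purely to illustrate the behavior, and is not the crawler's actual code:

```rust
use std::sync::mpsc::sync_channel;
use std::thread;

/// Illustrates bounded-queue behavior: with a capacity of 2, the
/// sender blocks whenever 2 urls are already queued, resuming as the
/// receiver drains them. async-channel's bounded() has the same
/// semantics for async tasks, awaiting instead of blocking a thread.
fn drain_bounded_queue() -> Vec<String> {
    let (tx, rx) = sync_channel::<String>(2);
    let producer = thread::spawn(move || {
        for url in ["https://a.example", "https://b.example", "https://c.example"] {
            tx.send(url.to_string()).unwrap(); // blocks while the buffer is full
        }
        // tx dropped here, which ends the receiver's iteration below
    });
    let received: Vec<String> = rx.into_iter().collect();
    producer.join().unwrap();
    received
}
```

Because consumption happens concurrently, all three urls arrive in send order even though the buffer only ever holds two.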
* Indexer: Listen on 0.0.0.0 (Baitinq, 2022-10-23, 1 file, -1/+1)
* Indexer: Implement basic reverse index searching and adding (Baitinq, 2022-10-22, 3 files, -15/+163)
  Very inefficient, but kind of functional :)
* Crawler: Implement basic async functionality (Baitinq, 2022-10-22, 3 files, -93/+285)
* Crawler: Add basic indexer communication (Baitinq, 2022-10-21, 2 files, -11/+48)
* Indexer: Add skeleton HTTP REST endpoint functionality (Baitinq, 2022-10-21, 3 files, -1/+539)
  Adds the /search and /resource endpoints.
* Crawler: Add Err string in the crawl_url method (Baitinq, 2022-10-20, 1 file, -3/+3)
* Crawler: Add indexer interaction skeleton (Baitinq, 2022-10-20, 1 file, -1/+5)