path: root/frontend

Commit log, newest first. Each entry shows: date, commit message, author, files changed, lines removed/added.
2022-10-30  Frontend: Change OSSE component into a struct component  (Baitinq, 1 file changed, -76/+88)
I think this improves readability for components with state.
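For context, a minimal sketch of a stateful Yew struct component, assuming Yew 0.19-style APIs; the OSSE name comes from the commit title, while the message and field are made up for illustration:

```rust
use yew::prelude::*;

// Hypothetical message and state shape; the real component's fields may differ.
pub enum Msg {
    UpdateQuery(String),
}

pub struct OSSE {
    search_query: String,
}

impl Component for OSSE {
    type Message = Msg;
    type Properties = ();

    fn create(_ctx: &Context<Self>) -> Self {
        Self { search_query: String::new() }
    }

    fn update(&mut self, _ctx: &Context<Self>, msg: Self::Message) -> bool {
        match msg {
            Msg::UpdateQuery(query) => {
                self.search_query = query;
                true // state changed, so re-render
            }
        }
    }

    fn view(&self, ctx: &Context<Self>) -> Html {
        let onclick = ctx.link().callback(|_| Msg::UpdateQuery("rust".to_string()));
        html! {
            <div>
                <p>{ format!("Searching for: {}", self.search_query) }</p>
                <button onclick={onclick}>{ "Set example query" }</button>
            </div>
        }
    }
}

fn main() {
    // Yew 0.19 entry point; newer versions use yew::Renderer instead.
    yew::start_app::<OSSE>();
}
```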
2022-10-29  Crawler+Indexer+Frontend: Rename structs to follow logical relations  (Baitinq, 3 files changed, -24/+31)
Now Resource is CrawledResource as it is created by the crawler, and the previous CrawledResource is now IndexedResource as it's created by the indexer.
2022-10-29  Frontend: Use ResultComponent to display search results  (Baitinq, 1 file changed, -3/+14)
2022-10-29  Indexer: Implement basic priority calculation of words in a site  (Baitinq, 1 file changed, -7/+6)
We just calculate the priority as the number of occurrences of the word in the site. This is very basic and should be changed.
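A rough sketch of that occurrence-count priority; the real tokenization in the indexer is likely different:

```rust
/// Count how many times `word` appears in a site's text content.
/// Sketch only: whitespace splitting and case-insensitive comparison are assumptions.
fn calculate_priority(word: &str, site_text: &str) -> u32 {
    site_text
        .split_whitespace()
        .filter(|w| w.eq_ignore_ascii_case(word))
        .count() as u32
}

fn main() {
    let text = "rust search engine written in rust";
    assert_eq!(calculate_priority("rust", text), 2);
}
```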
2022-10-28  Frontend: Show results in reverse order with priority  (Baitinq, 1 file changed, -2/+2)
2022-10-28  Frontend: Show result website's title and description  (Baitinq, 1 file changed, -2/+4)
2022-10-28  Crawler: Only accept HTTP_STATUS_CODE: 200 as success in crawl_url()  (Baitinq, 1 file changed, -3/+4)
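A sketch of that status check, assuming the crawler's async HTTP client is reqwest (the log does not name the crate); error handling is simplified to strings:

```rust
use reqwest::StatusCode;

// Sketch: treat anything other than 200 OK as a failed crawl.
async fn crawl_url(client: &reqwest::Client, url: &str) -> Result<String, String> {
    let response = client
        .get(url)
        .send()
        .await
        .map_err(|e| format!("request to {} failed: {}", url, e))?;

    if response.status() != StatusCode::OK {
        return Err(format!("{} returned status {}", url, response.status()));
    }

    response
        .text()
        .await
        .map_err(|e| format!("could not read body of {}: {}", url, e))
}

#[tokio::main]
async fn main() {
    let client = reqwest::Client::new();
    match crawl_url(&client, "https://example.com").await {
        Ok(body) => println!("fetched {} bytes", body.len()),
        Err(e) => eprintln!("{e}"),
    }
}
```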
2022-10-28  Indexer: Add website title and description to the CrawledResource  (Baitinq, 3 files changed, -1/+26)
We now parse the HTML and extract the title and description of the site.
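A sketch of the title/description extraction, assuming the scraper crate, which is a guess; the indexer may parse HTML differently:

```rust
use scraper::{Html, Selector};

// Extract a page's <title> text and <meta name="description"> content, if present.
fn extract_title_and_description(html: &str) -> (Option<String>, Option<String>) {
    let document = Html::parse_document(html);

    let title_selector = Selector::parse("title").unwrap();
    let title = document
        .select(&title_selector)
        .next()
        .map(|t| t.text().collect::<String>());

    let description_selector = Selector::parse(r#"meta[name="description"]"#).unwrap();
    let description = document
        .select(&description_selector)
        .next()
        .and_then(|m| m.value().attr("content").map(str::to_string));

    (title, description)
}

fn main() {
    let html = r#"<html><head><title>Example</title>
        <meta name="description" content="An example page"></head><body></body></html>"#;
    let (title, description) = extract_title_and_description(html);
    assert_eq!(title.as_deref(), Some("Example"));
    assert_eq!(description.as_deref(), Some("An example page"));
}
```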
2022-10-28  Frontend: Refactor search_word_in_db() to not need explicit lifetimes  (Baitinq, 1 file changed, -6/+6)
2022-10-28  Frontend: Improve responsive layout  (Baitinq, 1 file changed, -34/+37)
We now more or less use flexbox, though it still needs a lot of work around centering the search box. Kind of functional for now.
2022-10-28  Frontend: Make the results state Optional  (Baitinq, 1 file changed, -15/+27)
We now return "No result!" if the user has actually searched for something and no results were found.
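A sketch of the three states an Option makes explicit (nothing searched yet, searched with no hits, searched with hits); the type and the message other than "No result!" are assumptions:

```rust
// Hypothetical result type; the real frontend type has more fields.
struct IndexedResource {
    url: String,
}

fn results_message(results: &Option<Vec<IndexedResource>>) -> String {
    match results {
        None => "Search for something!".to_string(),          // nothing searched yet
        Some(r) if r.is_empty() => "No result!".to_string(),  // searched, no hits
        Some(r) => format!("{} result(s)", r.len()),           // searched, with hits
    }
}

fn main() {
    assert_eq!(results_message(&None), "Search for something!");
    assert_eq!(results_message(&Some(vec![])), "No result!");
}
```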
2022-10-28  Misc: Add TODOs  (Baitinq, 3 files changed, -1/+3)
2022-10-28  Frontend: Html: Set footer at the bottom of the page  (Baitinq, 1 file changed, -36/+38)
2022-10-28  Frontend: Logically structure html  (Baitinq, 1 file changed, -35/+39)
2022-10-28  Frontend: Add local bootstrap files  (Baitinq, 3 files changed, -5/+18)
2022-10-27  Frontend: Order search results by priority  (Baitinq, 3 files changed, -2/+20)
We do this by implementing the PartialOrd and Ord traits for the CrawledResource struct and then using Itertools::sorted() on the display iterator.
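A rough sketch of that approach; the field names and the exact ordering rule are assumptions:

```rust
use itertools::Itertools;
use std::cmp::Ordering;

#[derive(PartialEq, Eq)]
struct CrawledResource {
    url: String,
    priority: u32,
}

// Order resources by priority only, so sorted() can arrange them for display.
impl Ord for CrawledResource {
    fn cmp(&self, other: &Self) -> Ordering {
        self.priority.cmp(&other.priority)
    }
}

impl PartialOrd for CrawledResource {
    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
        Some(self.cmp(other))
    }
}

fn main() {
    let results = vec![
        CrawledResource { url: "a".into(), priority: 1 },
        CrawledResource { url: "b".into(), priority: 7 },
    ];
    // sorted() yields ascending priority; rev() shows the highest priority first.
    for r in results.into_iter().sorted().rev() {
        println!("{} ({})", r.url, r.priority);
    }
}
```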
2022-10-27  Frontend: Use display_results() function for rendering CrawledResources  (Baitinq, 1 file changed, -5/+14)
2022-10-27  Frontend: Fetch results from indexer  (Baitinq, 3 files changed, -33/+53)
2022-10-27  Crawler: Abstract database word fetching with search_word_in_db()  (Baitinq, 1 file changed, -2/+10)
2022-10-27  Indexer: Add /search with no query endpoint  (Baitinq, 1 file changed, -0/+6)
Just returns [].
2022-10-27  Crawler: Replace String::from with .to_string()  (Baitinq, 1 file changed, -3/+6)
2022-10-27  Indexer: Setup permissive CORS  (Baitinq, 3 files changed, -1/+21)
2022-10-27  Indexer: Return json from the /search endpoint  (Baitinq, 3 files changed, -7/+6)
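A sketch of how these two indexer changes (permissive CORS and a JSON /search endpoint) might fit together, assuming actix-web and actix-cors, which the log itself does not confirm; the route, port, and types are illustrative only:

```rust
use actix_cors::Cors;
use actix_web::{get, web, App, HttpServer, Responder};
use serde::Serialize;

#[derive(Serialize)]
struct CrawledResource {
    url: String,
    priority: u32,
}

#[get("/search/{query}")]
async fn search(query: web::Path<String>) -> impl Responder {
    let query = query.into_inner();
    // A real implementation would look the query up in the reverse index.
    let results = vec![CrawledResource {
        url: format!("https://example.com/?q={}", query),
        priority: 1,
    }];
    web::Json(results) // serialized to JSON in the response body
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| {
        App::new()
            .wrap(Cors::permissive()) // allow any origin so the frontend can call us
            .service(search)
    })
    .bind(("0.0.0.0", 4444))? // port is an assumption
    .run()
    .await
}
```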
2022-10-26  Frontend: Add results field to the state and set dummy results  (Baitinq, 1 file changed, -2/+46)
2022-10-26  Frontend: Add basic search_query state  (Baitinq, 3 files changed, -8/+158)
2022-10-26  Frontend: Add basic layout  (Baitinq, 2 files changed, -1/+43)
2022-10-25  Frontend: Update index.html to include bootstrap  (Baitinq, 1 file changed, -3/+16)
Also sets up the viewport and title.
2022-10-25  Crawler: Fix bad error handling with match handling  (Baitinq, 1 file changed, -6/+9)
2022-10-25  Crawler: Use async Client  (Baitinq, 4 files changed, -48/+152)
2022-10-25  Indexer: Use CrawledResource structure as values in the reverse index db  (Baitinq, 3 files changed, -11/+45)
This will allow us to integrate priorities and other improvements.
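A sketch of a reverse index keyed by word with full CrawledResource values; the exact shape (a HashMap of HashSets) and the field names are assumptions:

```rust
use std::collections::{HashMap, HashSet};

// Storing whole resources per word leaves room for priorities and other metadata.
#[derive(Clone, PartialEq, Eq, Hash)]
struct CrawledResource {
    url: String,
    priority: u32,
}

type ReverseIndex = HashMap<String, HashSet<CrawledResource>>;

fn add_resource(index: &mut ReverseIndex, word: &str, resource: CrawledResource) {
    index.entry(word.to_lowercase()).or_default().insert(resource);
}

fn search_word(index: &ReverseIndex, word: &str) -> Option<&HashSet<CrawledResource>> {
    index.get(&word.to_lowercase())
}

fn main() {
    let mut index = ReverseIndex::new();
    add_resource(
        &mut index,
        "Rust",
        CrawledResource { url: "https://example.com".into(), priority: 3 },
    );
    assert!(search_word(&index, "rust").is_some());
}
```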
2022-10-25  Indexer: Add "correct" error handling  (Baitinq, 1 file changed, -7/+7)
2022-10-25  Crawler: Shuffle crawled urls  (Baitinq, 3 files changed, -4/+5)
2022-10-25  Crawler: Add "correct" error handling  (Baitinq, 1 file changed, -21/+23)
2022-10-25  Crawler: Parse urls with the "url" crate  (Baitinq, 3 files changed, -25/+26)
This fixes relative urls, improves url filtering and validation, and brings many other improvements.
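A sketch of resolving relative hrefs with the url crate; the helper name is hypothetical and error handling is simplified:

```rust
use url::Url;

// Resolve a (possibly relative) href against the page it was found on.
fn normalize_href(base: &str, href: &str) -> Option<Url> {
    let base = Url::parse(base).ok()?;
    base.join(href).ok()
}

fn main() {
    let abs = normalize_href("https://example.com/docs/", "../about").unwrap();
    assert_eq!(abs.as_str(), "https://example.com/about");

    // Absolute hrefs pass through join() unchanged.
    let abs = normalize_href("https://example.com/", "https://other.org/page").unwrap();
    assert_eq!(abs.as_str(), "https://other.org/page");
}
```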
2022-10-24  Crawler: Add crawled url filter  (Baitinq, 1 file changed, -1/+8)
This filters out hrefs such as "/", "#", or "javascript:".
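A sketch of such a filter; the exact rules used by the crawler may differ:

```rust
// Reject hrefs that are not worth crawling.
fn is_crawlable_href(href: &str) -> bool {
    !(href.is_empty()
        || href == "/"
        || href.starts_with('#')
        || href.starts_with("javascript:"))
}

fn main() {
    let hrefs = ["/", "#section", "javascript:void(0)", "https://example.com/page"];
    let crawlable: Vec<&str> = hrefs
        .iter()
        .copied()
        .filter(|h| is_crawlable_href(h))
        .collect();
    assert_eq!(crawlable, vec!["https://example.com/page"]);
}
```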
2022-10-24  Flake: Add rust-analyzer package  (Baitinq, 1 file changed, -0/+1)
2022-10-24  Crawler: Set queue size to 2222  (Baitinq, 1 file changed, -1/+1)
2022-10-24  Misc: Update build/run instructions  (Baitinq, 1 file changed, -2/+4)
They now show how to run each module plus the yew frontend.
2022-10-24  Client->Frontend: Create yew frontend skeleton  (Baitinq, 8 files changed, -14/+238)
We have replaced the client with a yew frontend.
2022-10-23  Crawler+Indexer: Rust cleanup  (Baitinq, 2 files changed, -14/+6)
Getting more familiar with the language, so this fixes some suboptimal into_iter() usage, unnecessary .clone()s, and an unnecessary hack where we could simply take a &mut for inserting into the indexer url database.
2022-10-23  Crawler: Replace println! with dbg!  (Baitinq, 1 file changed, -7/+7)
2022-10-23  Crawler: Remove prepending of https:// to each url  (Baitinq, 2 files changed, -1006/+1006)
We now prepend it to the top-1000-urls list instead. This fixes crawled urls ending up with a double https:// prefix.
2022-10-23  Crawler: Only crawl 2 urls per url  (Baitinq, 1 file changed, -0/+6)
This makes it so that we don't get rate limited by websites.
2022-10-23  Crawler: Change blockingqueue to channels  (Baitinq, 3 files changed, -19/+45)
We now use the async-channel crate, which gives us bounded async channels.
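A minimal sketch of a bounded url queue built on async-channel; tokio as the runtime is an assumption, and the capacity of 2222 just mirrors the "Set queue size to 2222" entry above:

```rust
#[tokio::main]
async fn main() {
    let (tx, rx) = async_channel::bounded::<String>(2222);

    // Producer: push discovered urls; send().await waits once the queue is full.
    let producer = tokio::spawn(async move {
        for url in ["https://example.com", "https://example.org"] {
            tx.send(url.to_string()).await.expect("receiver dropped");
        }
    });

    // Consumer: pop urls to crawl; recv() returns Err once all senders are dropped.
    while let Ok(url) = rx.recv().await {
        println!("crawling {url}");
    }

    producer.await.unwrap();
}
```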
2022-10-23  Indexer: Listen on 0.0.0.0  (Baitinq, 1 file changed, -1/+1)
2022-10-22  Indexer: Implement basic reverse index searching and adding  (Baitinq, 3 files changed, -15/+163)
Very inefficient, but kind of functional.
2022-10-22  Crawler: Implement basic async functionality  (Baitinq, 3 files changed, -93/+285)
2022-10-21  Crawler: Add basic indexer communication  (Baitinq, 2 files changed, -11/+48)
2022-10-21  Indexer: Add skeleton http rest endpoint functionality  (Baitinq, 3 files changed, -1/+539)
/search and /resource endpoints.
2022-10-20  Crawler: Add Err string in the crawl_url() method  (Baitinq, 1 file changed, -3/+3)