| author | Baitinq <[email protected]> | 2022-10-28 10:43:14 +0200 |
|---|---|---|
| committer | Baitinq <[email protected]> | 2022-10-28 10:43:14 +0200 |
| commit | d3f66edbd84a6b462b08ca2e260a1dbb1ba4b730 | |
| tree | c319a9b6e078c696d04bbef85c600d1b405ee25b /crawler | |
| parent | Frontend: Html: Set footer at the bottom of the page | |
Misc: Add TODOs
Diffstat (limited to 'crawler')
| -rw-r--r-- | crawler/src/main.rs | 1 |
1 file changed, 0 insertions, 1 deletion
```diff
diff --git a/crawler/src/main.rs b/crawler/src/main.rs
index ef749e0..efdb033 100644
--- a/crawler/src/main.rs
+++ b/crawler/src/main.rs
@@ -46,7 +46,6 @@ async fn crawler(http_client: Client, root_urls: Vec<&str>) {
     //DONT FORGET ENUMS
     //CAN WE DO UNWRAP OR RETURN or lambda
     //HOW TF DOES CRAWLER WORK. DOESNT QUEUE FILL. LOTS OF WAITING THINGS??
-    //REMOVE ALL String::from, do .to_string()
     //dbg!("Content: {:?}", &content);
     dbg!("Next urls: {:?}", &crawled_urls);
```
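The removed TODO ("REMOVE ALL String::from, do .to_string()") refers to a style preference rather than a behavioural change: for a `&str`, `String::from(s)` and `s.to_string()` allocate equivalent `String` values. A minimal standalone sketch of that equivalence (not taken from the OSSE codebase; the `url` literal is purely illustrative):

```rust
fn main() {
    // Hypothetical value, only to demonstrate the two conversion styles.
    let url = "https://example.com";

    // Both calls allocate a new String with identical contents;
    // the TODO was about settling on .to_string() consistently.
    let a = String::from(url);
    let b = url.to_string();

    assert_eq!(a, b);
    println!("converted: {}", b);
}
```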