Replies: 1 comment
Created an experiments branch to try a different approach. From looking at https://learn.deeplearning.ai/courses/multi-ai-agent-systems-with-crewai, it seems like there might be a 1:N issue. I'm reading several text inputs to generate N search strings, using Serper to find M page URLs of related content per search string, and then scraping each of the M pages into uniquely named files, for NxM files in total.

From the examples I've seen so far, each task has a single output_file parameter, which implies only one output can be saved this way, whereas I need NxM files written. In https://github.com/tjGecko/f06_whitepaper_writer/blob/experiments/src/f06_whitepaper_writer/main.py, I tried using a flow listener to capture the individual Pydantic outputs and write them to files, but that only ever wrote a single file. I also tried using interpolation to fill the output_file parameter via tasks.yaml, but so far that isn't working for me. Ideas?
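For reference, here's the shape I expected the listener logic to take — a minimal sketch, assuming the scraper crew returns a Pydantic model holding a list of page entries (the model and directory names are illustrative, not the actual repo code):

```python
# Sketch: fan the N x M scraped pages out to uniquely named files from
# plain Python, instead of relying on Task.output_file. Model and
# directory names are illustrative.
from pathlib import Path

from pydantic import BaseModel


class ScrapedPage(BaseModel):
    title: str
    url: str
    scraped_content: str


def save_scraped_pages(pages: list[ScrapedPage], out_dir: str = "output/scraped") -> None:
    """Write each scraped page to its own uniquely named file."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    for i, page in enumerate(pages):
        # Build a filesystem-safe slug so every file name is unique.
        slug = "".join(c if c.isalnum() else "_" for c in page.title.lower())[:50]
        out_path = Path(out_dir) / f"{i:03d}_{slug}.md"
        out_path.write_text(
            f"# {page.title}\n\nSource: {page.url}\n\n{page.scraped_content}",
            encoding="utf-8",
        )
```

The idea is that the loop, not a single output_file, is responsible for fanning out to NxM files.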
Current exception trace:

```
entry: title='GREENER principles for environmentally sustainable computational ...' url='https://www.nature.com/articles/s43588-023-00461-y' scraped_content='Carbon intensities depend largely on geographical location, with up to three orders of magnitude between the top and bottom performing high ...'
[Flow._execute_single_listener] Error in method scrape_pages: 1 validation error for Task
output_file
  Input should be a valid string [type=string_type, input_value={'save_file': None}, input_type=dict]
Traceback (most recent call last):
  File "/projects/f06_whitepaper_writer/.venv/lib/python3.10/site-packages/crewai/flow/flow.py", line 652, in _execute_single_listener
  File "/projects/f06_whitepaper_writer/.venv/lib/python3.10/site-packages/crewai/flow/flow.py", line 493, in _execute_method
  File "/projects/f06_whitepaper_writer/src/f06_whitepaper_writer/main.py", line 42, in scrape_pages
  File "/projects/f06_whitepaper_writer/.venv/lib/python3.10/site-packages/crewai/project/utils.py", line 11, in memoized_func
  File "/projects/f06_whitepaper_writer/.venv/lib/python3.10/site-packages/crewai/project/annotations.py", line 95, in wrapper
  File "/projects/f06_whitepaper_writer/.venv/lib/python3.10/site-packages/crewai/project/utils.py", line 11, in memoized_func
  File "/projects/f06_whitepaper_writer/.venv/lib/python3.10/site-packages/crewai/project/annotations.py", line 28, in wrapper
  File "/projects/f06_whitepaper_writer/src/f06_whitepaper_writer/crews/c03_scraper/c03_scraper.py", line 46, in page_scraper_task
  File "/projects/f06_whitepaper_writer/.venv/lib/python3.10/site-packages/pydantic/main.py", line 214, in __init__
pydantic_core._pydantic_core.ValidationError: 1 validation error for Task
output_file
  Input should be a valid string [type=string_type, input_value={'save_file': None}, input_type=dict]
```
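Reading the trace, Task's output_file is being handed a dict ({'save_file': None}) from tasks.yaml where pydantic expects a plain string. One workaround I'm considering is dropping that key from the YAML and setting output_file in Python when the task is built — a minimal sketch, assuming the standard @CrewBase layout (the path pattern is illustrative):

```python
# Sketch: build the Task with a plain-string output_file instead of the
# nested mapping coming out of tasks.yaml. Assumes config/tasks.yaml
# defines page_scraper_task with description/expected_output only.
from crewai import Task
from crewai.project import CrewBase, task


@CrewBase
class C03Scraper:
    tasks_config = "config/tasks.yaml"

    @task
    def page_scraper_task(self) -> Task:
        return Task(
            config=self.tasks_config["page_scraper_task"],
            # Must be a str; a mapping like {'save_file': None} fails
            # pydantic validation on the Task model.
            output_file="output/scraped/page_001.md",
        )
```

I believe recent crewAI versions also interpolate {placeholders} in output_file from the kickoff inputs, but I haven't confirmed which versions support that.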
I'm creating a bot to help write whitepapers. Currently, I'm running into problems passing arguments to a custom tool (see save_research_results). The console output showing the failure is attached as 99 - Console output.md.
The intent of the project can be seen in:
- https://github.com/tjGecko/f06_whitepaper_writer/blob/master/src/f06_whitepaper_writer/main.py
- https://github.com/tjGecko/f06_whitepaper_writer/blob/master/src/f06_whitepaper_writer/crews/c02_crawler/c02_crawler.py
I loosely based this on Matthew Berman's example: https://github.com/mberman84/edu-crew/tree/main/src/edu_flow
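For reference, here's roughly how I understand custom tool arguments are supposed to be declared, with an explicit args_schema so the agent knows which fields to pass — a minimal sketch; the field names are my guesses, not the actual save_research_results implementation:

```python
# Sketch of a custom tool with an explicit args_schema; field names are
# guesses for illustration, not the real save_research_results.
from typing import Type

from crewai.tools import BaseTool
from pydantic import BaseModel, Field


class SaveResearchResultsInput(BaseModel):
    file_name: str = Field(..., description="Unique file name to write to.")
    content: str = Field(..., description="Research text to persist.")


class SaveResearchResultsTool(BaseTool):
    name: str = "save_research_results"
    description: str = "Save one research result to a uniquely named file."
    args_schema: Type[BaseModel] = SaveResearchResultsInput

    def _run(self, file_name: str, content: str) -> str:
        # Write the result and report back so the agent sees a confirmation.
        with open(file_name, "w", encoding="utf-8") as fh:
            fh.write(content)
        return f"Saved {len(content)} characters to {file_name}"
```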