
issue with \screener.py #161

Open
sheppegr opened this issue Apr 11, 2023 · 5 comments

@sheppegr

I have been trying to pull a list of all stocks on Finviz.com, and my script was working just fine. Then, all of a sudden, it stopped with this error:

finviz\screener.py", line 128, in __init__
self.data = self.__search_screener()

Research says it has something to do with starting at 0 versus 1, but I cannot find it in the code.

Here is the script I am running:

```python
from finviz.screener import Screener
import datetime

stock_list = Screener()
print(type(stock_list))
stock_list.to_csv('finviz_stocks_list' + datetime.datetime.now().date().strftime('%m-%d-%Y') + '.csv')
```

@sheppegr
Author

This was fixed in the current master release. I downloaded it and manually replaced the scripts, and it works well. Thank you.

@Fachastorm
Contributor

I have opened a pull request with a fix, as this issue continues.

@DubraDave

DubraDave commented Apr 21, 2023

I am also seeing the IndexError on initialization. How do I pip install the pull request with the fix?
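
If the answer is to install straight from the fork's branch, I would guess the command looks roughly like the line below, but the fork owner and branch name here are only placeholders since I don't know which branch the pull request is on:

```
pip install --upgrade git+https://github.com/<fork-owner>/finviz.git@<branch-with-fix>
```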

@sheppegr
Author

Thank you, please keep me posted

@DubraDave

DubraDave commented Apr 23, 2023

```
~\anaconda3\lib\site-packages\finviz\screener.py in __init__(self, tickers, filters, rows, order, signal, table, custom, user_agent, request_method)
    126
    127         self.analysis = []
--> 128         self.data = self.__search_screener()
    129
    130     def __call__(

~\anaconda3\lib\site-packages\finviz\screener.py in __search_screener(self)
    434         )
    435
--> 436         self._rows = self.__check_rows()
    437         self.headers = self.__get_table_headers()
    438

~\anaconda3\lib\site-packages\finviz\screener.py in __check_rows(self)
    405
    406         if self._total_rows == 0:
--> 407             raise NoResults(self._url.split("?")[1])
    408         elif self._rows is None or self._rows > self._total_rows:
    409             return self._total_rows

NoResults: No results found for query: v=141&t=&f=exch_nasd%2Cidx_sp500&o=price&s=&c=
```

The core issue appears to be that the scrape module used by screener.py is no longer functioning properly. In the `__check_rows` method, `self._total_rows` is set from `scrape.get_total_rows`, which is returning no results, so the problem appears to be in the scraper_functions.py module.

I suspect, but have not yet investigated deeply enough to confirm, that the `get_table` function in scraper_functions.py is malfunctioning due to a new HTML structure at Finviz.

Here is the current code of that function:

```python
def get_table(page_html: requests.Response, headers, rows=None, **kwargs):
    """ Private function used to return table data inside a list of dictionaries. """
    if isinstance(page_html, str):
        page_parsed = html.fromstring(page_html)
    else:
        page_parsed = html.fromstring(page_html.text)
    # When we call this method from Portfolio we don't fill the rows argument.
    # Conversely, we always fill the rows argument when we call this method from Screener.
    # Also, in the portfolio page, we don't need the last row - it's redundant.
    if rows is None:
        rows = -2  # We'll increment it later (-1) and use it to cut the last row

    data_sets = []
    # Select the HTML of the rows and append each column text to a list
    all_rows = [
        column.xpath("td//text()")
        for column in page_parsed.cssselect('tr[valign="top"]')
    ]

    # If rows is different from -2, this function is called from Screener
    if rows != -2:
        for row_number, row_data in enumerate(all_rows, 1):
            data_sets.append(dict(zip(headers, row_data)))
            if row_number == rows:  # If we have reached the required end
                break
    else:
        # Zip each row values to the headers and append them to data_sets
        [data_sets.append(dict(zip(headers, row))) for row in all_rows]

    return data_sets
```

I am guessing all_rows is likely where the problem resides.
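
To sanity-check that guess, here is a quick sketch (untested on my side, and the User-Agent value is just a stand-in): it fetches the exact query that failed in the traceback above and counts how many rows the selector used by `get_table` still matches.

```python
# Diagnostic sketch: fetch the query that raised NoResults above and count how
# many rows the get_table selector still matches. The User-Agent string is a
# placeholder; Finviz tends to reject the default requests user agent.
import requests
from lxml import html

url = "https://finviz.com/screener.ashx?v=141&t=&f=exch_nasd%2Cidx_sp500&o=price&s=&c="
response = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10)
response.raise_for_status()

page_parsed = html.fromstring(response.text)
matched = page_parsed.cssselect('tr[valign="top"]')  # same selector as get_table
print("rows matched by the selector:", len(matched))
# If this prints 0 while the same URL shows results in a browser, the table
# markup has changed and all_rows (and therefore total_rows) comes back empty.
```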
