@Abdoulkadir-ali what's the other post? I have the same issue and don't see how to solve it. It just seems to lowercase the URLs that are declared in the robots.txt.
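As a quick check (a sketch on my side, not something from the original report), the `Sitemap:` lines can be read straight from robots.txt with the standard library and compared against the URLs USP logs:

```python
from urllib.request import urlopen

# Read robots.txt and pull out the Sitemap: declarations exactly as the
# server publishes them, so their casing can be compared with USP's logs.
with urlopen("https://www.distriartisan.fr/robots.txt") as response:
    robots_txt = response.read().decode("utf-8", errors="replace")

sitemap_urls = [
    line.split(":", 1)[1].strip()
    for line in robots_txt.splitlines()
    if line.lower().startswith("sitemap:")
]
print(sitemap_urls)
```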
Hello,
So here's a little issue.
Basically, USP puts all URLs in lowercase, and as a result, if a URL has some uppercase characters, it no longer finds it.
Here's an example 👍
```python
# Crawl the homepage: USP fetches robots.txt and the sitemaps it declares.
from usp.tree import sitemap_tree_for_homepage

tree = sitemap_tree_for_homepage('https://www.distriartisan.fr')
```
The sitemap URLs look like this:
"https://www.distriartisan.fr/media/sitemap/sitemapProduitsAll_1.xml"
However, in the logs it is written like this:
```
2023-06-02 12:42:12,823 INFO usp.fetch_parse [7776/MainThread]: Parsing sitemap from URL https://www.distriartisan.fr/media/sitemap/sitemapproduitsall_1.xml...
2023-06-02 12:42:12,826 ERROR usp.fetch_parse [7776/MainThread]: Parsing sitemap from URL https://www.distriartisan.fr/media/sitemap/sitemapproduitsall_1.xml failed: Unsupported root element 'html'.
```
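The "Unsupported root element 'html'" error suggests the lowercased path returns an HTML page (likely a 404 or error page) instead of XML. Here is a minimal sketch to confirm the server treats the two paths differently; it assumes the `requests` package is installed and is not part of the original report:

```python
import requests

# Original URL as declared in the sitemap index vs. the lowercased one USP requests.
ORIGINAL = "https://www.distriartisan.fr/media/sitemap/sitemapProduitsAll_1.xml"
LOWERCASED = ORIGINAL.lower()

for url in (ORIGINAL, LOWERCASED):
    # A HEAD request is enough to compare status codes and content types
    # without downloading the full sitemap.
    response = requests.head(url, allow_redirects=True, timeout=10)
    print(url, response.status_code, response.headers.get("Content-Type"))
```

If the lowercased URL comes back as `text/html` while the original returns XML, the path is case-sensitive on this server and the lowercasing is what breaks the parse.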