Monday, October 18, 2010

It all looks so......




Yay familiars

4 comments:

sturtus said...

Your software has worked like a charm, thanks in large part to the FG post here: http://www.fantasygrounds.com/forums/showthread.php?t=11322&highlight=library

Having followed the steps outlined there, it raises the question: if the process is basically the same for each scrape and parse, why not just automate it for each book? Have the user simply enter a temp directory, the book they'd like to scrape, the kind of module they'd like (db.xml, common.xml, client.xml), and their DDI credentials, then just click Go (a rough sketch of what I mean is below). It makes perfect sense from the developer's POV to want access to all these options, but you could have a Basic and an Advanced mode. It's a lot to ask from somebody who has already put in so much work, and I'm plowing through these books anyway, but I thought I'd see what your thoughts were.
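Roughly what I'm picturing, as a throwaway Python sketch; the two pipeline functions are stand-ins for whatever your parser actually does, not its real API:

import getpass
from pathlib import Path

def scrape_book(book, user, password, work_dir):
    # Stand-in: download the Compendium entries for one book into work_dir.
    raw = work_dir / (book + ".raw.html")
    raw.write_text("<!-- scraped entries for " + book + " would land here -->")
    return raw

def build_module(raw, kind, work_dir):
    # Stand-in: turn the scraped entries into a db.xml / common.xml / client.xml module.
    out = work_dir / (raw.stem + "." + kind)
    out.write_text("<!-- " + kind + " built from " + raw.name + " -->")
    return out

def basic_mode(book, kind, temp_dir):
    # "Basic mode": one set of defaults, ask only for credentials, then go.
    user = input("DDI username: ")
    password = getpass.getpass("DDI password: ")
    work_dir = Path(temp_dir)
    work_dir.mkdir(parents=True, exist_ok=True)
    return build_module(scrape_book(book, user, password, work_dir), kind, work_dir)

basic_mode("PHB", "db.xml", "scratch")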

Thanks for all the hard work, you've gotten me to switch from MapTool to FG for my online campaign.

J said...

The parser has changed a lot since its humble beginnings. At one point there was a separate program for each book, and users had to enter the required files (up to 13 of them) by hand.
There are two reasons I do not have a simplified interface:
1) Making it too automatic might be frowned upon by WotC.
2) I don't find the results of the raw extraction process useful. I would never use a module taken directly from the Compendium, and getting a module to the minimum level I find acceptable requires the majority of the options.

Johnaton Petersen said...

Noticed a few exceptions that cause the app to crash recently, although previous scrapes have worked before. The ones crashing at the moment: H1 and the Dungeon Master's Guide.

Also, there are links (a href) in a lot of places that are missing closing tags, and some closing p tags pushed onto new lines, which are causing parse errors, especially in the Player's Handbooks.
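For what it's worth, here's the kind of pre-cleaning I've been doing on the scraped HTML before parsing, as a rough Python sketch; it just guesses at the two problems above and isn't how your parser handles them:

import re

def clean_compendium_html(html):
    # Pull a closing </p> that was pushed onto its own line back onto the previous line.
    html = re.sub(r"\n\s*</p>", "</p>", html)
    # Close any <a href=...> link that reaches the end of its paragraph unclosed.
    html = re.sub(r"(<a\s+href=[^>]*>(?:(?!</a>|<a\s).)*?)(</p>)", r"\1</a>\2", html, flags=re.DOTALL)
    return html

sample = '<p>See <a href="power.aspx?id=42">Magic Missile\n</p>'
print(clean_compendium_html(sample))
# prints: <p>See <a href="power.aspx?id=42">Magic Missile</a></p>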

J said...

The new version fixes a lot of the href issues. The PHB, PHB3, H1 and DMG all work without issue.

Rescraping modules you've already scraped once is extremely wasteful.