python - Adding multiple arrays to a MySQL Table
I wrote a script that scrapes through multiple URLs, adds the useful information from BeautifulSoup to two arrays (ids, names), and then adds the values of these arrays to a MySQL table: ids[0] and names[0] go into row 0 of the table, and so on.
However, the code is ugly, and I am sure there are better approaches than mine.
Can you give me a hint? Specifically, I need input on how to iterate through the two arrays...
Thanks in advance!
#!/usr/bin/env python
from bs4 import BeautifulSoup
from urllib import urlopen
import MySQLdb

# MySQL connection
mysql_opts = {
    'host': "localhost",
    'user': "********",
    'pass': "********",
    'db':   "somedb"
}

mysql = MySQLdb.connect(mysql_opts['host'], mysql_opts['user'],
                        mysql_opts['pass'], mysql_opts['db'])
cursor = mysql.cursor()

# Add data SQL query
data_query = ("INSERT INTO tablename "
              "(id, name) "
              "VALUES (%s, %s)")

# URLs to scrape
url1 = 'http://somepage.com'
url2 = 'http://someotherpage.com'
url3 = 'http://athirdpage.com'

# URL array
urls = (url1, url2, url3)

# URL loop
for url in urls:
    souppage = urlopen(url)
    soup = BeautifulSoup(souppage)

    ids = soup.find_all('a', style="display:block")
    names = soup.find_all('a', style="display:block")

    i = 0
    print len(ids)
    while (i < len(ids)):
        try:
            id = ids[i]
            vid = id['href'].split('=')
            vid = vid[1]
        except IndexError:
            vid = "leer"
        try:
            name = names[i]
            name = name.contents[0]
            name = name.encode('iso-8859-1')
        except IndexError:
            name = ""
        data_content = (vid, name)
        cursor.execute(data_query, data_content)
        emp_no = cursor.lastrowid
        i = i + 1
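A further cleanup (my suggestion, not from the original post): collect all (id, name) pairs first and insert them in a single `executemany` call instead of one `execute` per row. The minimal sketch below uses Python's built-in `sqlite3` as a stand-in for MySQLdb so it runs without a database server; with MySQLdb the placeholder style is `%s` instead of `?`, and the sample `ids`/`names` lists stand in for the BeautifulSoup results.

```python
import sqlite3

# Example scraped values standing in for the BeautifulSoup results
ids = ['item=101', 'item=102', 'item=103']
names = ['alpha', 'beta', 'gamma']

# Build all (vid, name) rows up front instead of inserting one by one
rows = [(href.split('=')[1], name) for href, name in zip(ids, names)]

conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute("CREATE TABLE tablename (id TEXT, name TEXT)")

# One call for the whole batch; MySQLdb cursors have the same method
cur.executemany("INSERT INTO tablename (id, name) VALUES (?, ?)", rows)
conn.commit()

print(cur.execute("SELECT id, name FROM tablename").fetchall())
# [('101', 'alpha'), ('102', 'beta'), ('103', 'gamma')]
```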
My comment seems to be the answer. I tested it:
for vid, name in zip(ids, names):
    vid = vid['href'].split('=')
    vid = vid[1]
    name = name.contents[0]
    name = name.encode('iso-8859-1')
    data_content = (vid, name)
    cursor.execute(data_query, data_content)
    emp_no = cursor.lastrowid
For a more common form, see: how can I iterate through 2 lists in parallel?
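In short (a toy example of the linked pattern, with made-up lists): `zip` pairs items only until the shorter list runs out, while `itertools.zip_longest` pads the shorter list with a fill value. The latter mirrors the `IndexError` fallback in the original while loop.

```python
from itertools import zip_longest  # izip_longest on Python 2

ids = ['a', 'b', 'c']
names = ['Anna', 'Ben']

# zip stops as soon as the shorter list is exhausted
print(list(zip(ids, names)))
# [('a', 'Anna'), ('b', 'Ben')]

# zip_longest pads the shorter list with a fill value instead
print(list(zip_longest(ids, names, fillvalue="")))
# [('a', 'Anna'), ('b', 'Ben'), ('c', '')]
```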
Sorry for the duplicate. If someone can add an answer, feel free.