Writing JSON in a for loop in Python


I am downloading JSON files from an API, using the following code to write the JSON. Each item in the loop gives me one JSON file. I need to save it, and later extract the entities from the appended JSON file using a loop.

specs_dict = {}
for item in style_ls:
    dat = get_json(api, item)
    specs_dict[item] = dat
    with open("specs_append.txt", "a") as myfile:
        json.dump(dat, myfile)
    print(item)

with open("specs_data.txt", "w") as myfile:
    json.dump(specs_dict, myfile)

I know specs_append.txt cannot be in valid JSON format, but specs_data.txt can. I am doing the first one because the program needs at least 3-4 days to complete, and there is a high chance the system may shut down before then. Is there any way I can do this efficiently?

If not, is there any way I can extract the data from specs_append.txt, which is in <{json}{json}> format (not valid JSON)?

If not, should I write specs_dict to the text file every time through the loop, so that if the program gets terminated I can restart from that point in the loop and still have valid JSON?

I suggest several possible solutions.

One solution is to write custom code to slurp in the input file. I suggest putting a special line before each JSON object in the file, such as: ###

Then write code like this:

import json

special_line = '###\n'

def json_get_objects(f):
    temp = ''
    line = next(f)  # pull the first line
    assert line == special_line
    for line in f:
        if line != special_line:
            temp += line
        else:
            # found the special marker; temp contains a complete JSON object
            j = json.loads(temp)
            yield j
            temp = ''
    # after the loop is done, yield the last JSON object
    if temp:
        j = json.loads(temp)
        yield j

with open("specs_data.txt", "r") as f:
    for j in json_get_objects(f):
        pass  # do something with JSON object j

Two notes on this. First, appending to a string over and over used to be a slow way to do this in Python, so if you are using an old version of Python, don't do it this way unless your JSON objects are small. Second, I wrote the code to split the input and yield JSON objects one at a time, but you could also use a guaranteed-unique string, slurp in all the data with a single call to f.read(), and split on the guaranteed-unique string using the str.split() method.
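For completeness, the writing side of this marker scheme might look like the sketch below (append_with_marker is a hypothetical helper name; in your loop you would call it in place of the bare json.dump):

```python
import json

special_line = '###\n'  # must match the marker the reader expects

def append_with_marker(path, obj):
    # write the marker line, then the JSON object, then a newline so the
    # next marker starts on its own line
    with open(path, "a") as f:
        f.write(special_line)
        json.dump(obj, f)
        f.write('\n')

append_with_marker("specs_append.txt", {"foo": 0})
append_with_marker("specs_append.txt", {"bar": 1})
```

Because the file is opened in append mode each time, a crash can at worst lose the object currently being written; everything before the last marker remains parseable.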

Another solution is to write the whole file as valid JSON containing a list of valid JSON objects. Write the file like this:

{"mylist":[
# first JSON object, followed by a comma
# second JSON object, followed by a comma
# third JSON object
]}

This would require your file-appending code to open the file with write permission, seek to the last ] in the file before writing a comma plus newline, append the new JSON object on the end, and then write ]} to close out the file. If you do it this way, you can use json.loads() to slurp the whole thing in and have a list of JSON objects.
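As a sketch (the file name specs_list.json and the helper name are mine, and it assumes the file always ends with exactly the three bytes '\n]}'), the seek-and-rewrite approach could look like:

```python
import json
import os

def append_to_json_list(path, obj):
    # keep the file valid JSON of the form {"mylist":[ ... ]} after every append
    text = json.dumps(obj)
    if not os.path.exists(path):
        with open(path, "w") as f:
            f.write('{"mylist":[\n' + text + '\n]}')
        return
    with open(path, "rb+") as f:
        f.seek(-3, os.SEEK_END)   # back up over the trailing '\n]}'
        f.truncate()              # drop the old closing bytes
        f.write((',\n' + text + '\n]}').encode())

if os.path.exists("specs_list.json"):
    os.remove("specs_list.json")  # start fresh for this demo

append_to_json_list("specs_list.json", {"foo": 0})
append_to_json_list("specs_list.json", {"bar": 1})

# the file is valid JSON after every call, so json.load() works at any point
with open("specs_list.json") as f:
    data = json.load(f)
```

The file is opened in binary mode for the seek/truncate step because text-mode files in Python 3 only support seeking to offsets returned by tell().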

Finally, I suggest that maybe you should use a database. You could use SQLite or similar and throw the JSON strings into a table. If you choose this, I suggest using an ORM to make your life simpler, rather than writing SQL commands by hand.
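A minimal sketch of the database route using only the standard-library sqlite3 module (no ORM); the database file name and table layout here are assumptions:

```python
import json
import sqlite3

conn = sqlite3.connect("specs.db")  # hypothetical database file
conn.execute("CREATE TABLE IF NOT EXISTS specs (item TEXT PRIMARY KEY, data TEXT)")

def save_spec(item, dat):
    # each commit is durable, so a crash loses at most the in-flight item
    conn.execute("INSERT OR REPLACE INTO specs (item, data) VALUES (?, ?)",
                 (item, json.dumps(dat)))
    conn.commit()

save_spec("style-1", {"foo": 0})

# rebuild the dict from whatever made it into the database
specs = {item: json.loads(data)
         for item, data in conn.execute("SELECT item, data FROM specs")}
```

INSERT OR REPLACE also means a restarted run can safely redo items it already fetched.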

Personally, I favor the first suggestion: write in the special line ###, and have custom code that splits the input on those marks and parses the JSON objects.

EDIT: Okay, my first suggestion was sort of assuming that the JSON is formatted for human readability, as a bunch of short lines:

{
    "foo": 0,
    "bar": 1,
    "baz": 2
}

But it's more likely that each object is all run together on one big long line:

{"foo":0,"bar":1,"baz":2} 

Here are three ways to fix this.

0) Write a newline before each ### and after it, like so:

###
{"foo":0,"bar":1,"baz":2}
###
{"foo":0,"bar":1,"baz":2}

Then each input line is alternately ### or a complete JSON object.
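Reading that alternating format back is then trivial: skip the marker lines and json.loads() everything else. A sketch (read_alternating is a hypothetical name):

```python
import json

special_line = '###\n'  # must match what the writer emits

def read_alternating(path):
    objects = []
    with open(path) as f:
        for line in f:
            if line != special_line and line.strip():
                objects.append(json.loads(line))
    return objects

# create a small sample file in the alternating format
with open("specs_append.txt", "w") as f:
    f.write('###\n{"foo":0}\n###\n{"bar":1}\n')

print(read_alternating("specs_append.txt"))  # → [{'foo': 0}, {'bar': 1}]
```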

1) As long as special_line is truly unique (it never appears inside a string in the JSON), you can do this:

with open("specs_data.txt", "r") as f:
    temp = f.read()  # read the entire file contents
    lst = temp.split(special_line)
    # skip the empty strings the split produces around leading/trailing markers
    json_objects = [json.loads(x) for x in lst if x.strip()]
    for j in json_objects:
        pass  # do something with JSON object j

The .split() method can split the temp string into JSON objects for you.

2) If each JSON object will never have a newline character inside it, just write the JSON objects to the file one after another, putting a newline after each; then assume each line is one JSON object:

import json

def json_get_objects(f):
    for line in f:
        if line.strip():
            yield json.loads(line)

with open("specs_data.txt", "r") as f:
    for j in json_get_objects(f):
        pass  # do something with JSON object j
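The writing side of option (2) is the simplest of the three; a sketch (the separators argument is optional and just keeps each object compact):

```python
import json

def append_json_line(path, obj):
    # one compact JSON object per line; a crash can corrupt at most the last line
    with open(path, "a") as f:
        json.dump(obj, f, separators=(",", ":"))
        f.write("\n")

append_json_line("specs_data.txt", {"foo": 0})
append_json_line("specs_data.txt", {"bar": 1})
```

This one-object-per-line layout is often called JSON Lines, and json.dump with no indent never emits raw newlines (newlines inside strings are escaped as \n), so each object stays on a single line.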

I like the simplicity of option (2), but I like the reliability of option (0). If a newline ever got written as part of a JSON object, option (0) would still work, but option (2) would raise an error.

Again, you can always use an actual database (SQLite) with an ORM and let the database worry about the details.

Good luck.

