This code is hitting the DB a lot. Is there a way to reduce the number of DB hits by grouping them together? If it's not possible in Django, can it be done with raw SQL? My development machine runs SQLite and production runs PostgreSQL. If SQL is the way to go, please give me a few hints on where to get started.
class Sensor(models.Model):
    Name = models.CharField(max_length=200)
    Value = models.FloatField()

class DataPoint(models.Model):
    Taken_datetime = models.DateTimeField(blank=True, null=True)
    Sensors = models.ManyToManyField(Sensor, blank=True)

for row in rows:
    dp = DataPoint.objects.get(Taken_datetime=row['date'])
    sensorsToAdd = []
    for sensor in sensors:
        s = Sensor.objects.get(Name=sensor.name, Value=sensor.value)
        sensorsToAdd.append(s)
    dp.Sensors.add(*sensorsToAdd)  # unpack the list; add() takes individual objects
All the data is stored in a CSV file, so I know all of it up front.

For each row, the code hits the DB to load the DataPoint, load the Sensors, and attach the sensors to the DataPoint. I'm looking for something like bulk_create, but for the m2m field. All the solutions I've found use the same method I'm using above. The problem I'm running into is that there are a lot of DataPoints, and I'm hitting the DB individually for each one. I'd like to group all of this together into a few DB calls.
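The closest thing I've come up with (untested, so the field names on the auto-created through table are a guess on my part) is to build the join rows myself and bulk_create them on the m2m through model:

    ThroughModel = DataPoint.Sensors.through  # auto-created join table

    links = []
    for row in rows:
        dp = DataPoint.objects.get(Taken_datetime=row['date'])
        for sensor in sensors:
            s = Sensor.objects.get(Name=sensor.name, Value=sensor.value)
            # datapoint_id / sensor_id are what I assume Django names the FK columns
            links.append(ThroughModel(datapoint_id=dp.id, sensor_id=s.id))

    # one INSERT (or a few batches) instead of one query per add()
    ThroughModel.objects.bulk_create(links)

That would collapse all the add() calls into a single query, but it still leaves one get() per DataPoint and per Sensor. Is there a sane way to batch those too, e.g. fetching everything up front into a dict keyed on the lookup fields?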
If there is a better way to model the data without making the DB larger, I'd be open to that too.