You're doing it right -- it's just slow, because the added abstraction of the ORM means you can't make the sorts of optimizations you'd like.
That said, the EntityManager does get slow on transactions that large. If you don't absolutely need them all in one big transaction, you can probably get better performance by flush()ing and then clear()ing the EntityManager every 20-200 iterations of your loop.
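For example, a minimal sketch of that batching pattern (the `User` entity, its setter, and the `$rows` input are placeholders for illustration; `$em` is your EntityManager):

```php
$batchSize = 100; // anywhere in the 20-200 range suggested above

$i = 0;
foreach ($rows as $row) {
    $user = new User();
    $user->setName($row['name']);
    $em->persist($user);

    // Periodically execute the queued INSERTs and detach all managed
    // entities, so the identity map doesn't grow without bound.
    if ((++$i % $batchSize) === 0) {
        $em->flush();
        $em->clear();
    }
}

$em->flush(); // flush the final partial batch
$em->clear();
```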
If that doesn't buy you enough performance, the only alternative I can think of is to drop down to custom code that runs raw SQL directly against your DBMS.
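One way to do that without leaving Doctrine entirely is to build a multi-row INSERT through the DBAL connection underneath the EntityManager. This is a sketch, not the ORM's API: the `users` table and its columns are made up, and `executeStatement()` is the DBAL 3 method name (older versions call it `executeUpdate()`):

```php
$conn = $em->getConnection();

$placeholders = [];
$params = [];
foreach ($rows as $row) {
    $placeholders[] = '(?, ?)';
    $params[] = $row['name'];
    $params[] = $row['email'];
}

// One round-trip for the whole batch. For very large inputs, chunk $rows
// first so you stay under your driver's bound-parameter limit.
$sql = 'INSERT INTO users (name, email) VALUES ' . implode(', ', $placeholders);
$conn->executeStatement($sql, $params);
```

Note that you lose the ORM's change tracking and generated-identifier handling this way, which is exactly the trade-off discussed in the quote below.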
I know this isn't a great answer, but at least I can tell you that you're not crazy.
------ edit ------
From the official Doctrine2 article on batch processing:
> Some people seem to be wondering why Doctrine does not use multi-inserts (`INSERT INTO ... VALUES (...), (...), (...), ...`).
>
> First of all, this syntax is only supported on MySQL and newer PostgreSQL versions. Secondly, there is no easy way to get hold of all the generated identifiers in such a multi-insert when using AUTO_INCREMENT or SERIAL, and an ORM needs the identifiers for identity management of the objects. Lastly, insert performance is rarely the bottleneck of an ORM. Normal inserts are more than fast enough for most situations, and if you really want to do fast bulk inserts, then a multi-insert is not the best way anyway; i.e., Postgres COPY or MySQL LOAD DATA INFILE are several orders of magnitude faster.
>
> These are the reasons why it is not worth the effort to implement an abstraction that performs multi-inserts on MySQL and PostgreSQL in an ORM.
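Those bulk-load paths can still be driven from PHP, just not through the ORM. Here is a hedged sketch using MySQL's LOAD DATA LOCAL INFILE via the DBAL connection; the CSV path, table, and columns are made up, and your server and driver must both allow local infile (with PDO that means setting `PDO::MYSQL_ATTR_LOCAL_INFILE` on the connection):

```php
$conn = $em->getConnection();

// Nowdoc keeps the '\n' escape intact for MySQL to interpret.
$sql = <<<'SQL'
LOAD DATA LOCAL INFILE '/tmp/users.csv'
INTO TABLE users
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(name, email)
SQL;

// Loads the whole file server-side in a single statement,
// bypassing the ORM (and its identity management) entirely.
$conn->executeStatement($sql);
```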
Also, there is a significant performance difference between a remote and a local database, because the overhead of sending each query to a remote server is quite large. With a local database the overhead is much lower, thanks to transactions and DB optimizations (e.g. 70 s dropped to 300 ms for the example in the question).