What does that query look like? Are you trying to insert all 100 million rows in one statement?
When you are copying that much data, it is imperative that you do it in batches. So if you want to do it with OPENQUERY, take something like five million rows at a time. Define your batches in such a way that there is an index to follow on the Oracle side. For SQL Server the best choice would be a clustered index, but clustered indexes are not very common on the Oracle side; maybe Oracle permits you to read by ROWID or something similar. In any case, you don't want a full table scan for each batch.
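Here is a minimal sketch of what such a batching loop could look like. All the names are assumptions: a linked server ORA, an Oracle table src_table with an indexed numeric key id, and a target table dbo.TargetTable with matching columns. Since OPENQUERY only accepts a literal query string, the range bounds have to be spliced in with dynamic SQL:

    DECLARE @lo bigint,
            @hi bigint,
            @batchsize bigint = 5000000,
            @sql nvarchar(MAX);

    -- One round trip to get the key range, so we know when to stop.
    SELECT @lo = minid, @hi = maxid
    FROM   OPENQUERY(ORA, 'SELECT MIN(id) AS minid, MAX(id) AS maxid
                           FROM   src_table');

    WHILE @lo <= @hi
    BEGIN
       -- Build the INSERT for this key range. The doubled quotes produce
       -- the string literal that OPENQUERY requires.
       SELECT @sql =
          N'INSERT INTO dbo.TargetTable (id, col1, col2)
            SELECT id, col1, col2
            FROM   OPENQUERY(ORA, ''SELECT id, col1, col2
                                    FROM   src_table
                                    WHERE  id >= ' + convert(nvarchar(20), @lo) +
          N' AND id < ' + convert(nvarchar(20), @lo + @batchsize) + N''')';
       EXEC sp_executesql @sql;
       SET @lo += @batchsize;
    END;

If the key has gaps, some batches will simply be smaller or empty; that is fine, as long as the WHERE clause is resolved with an index seek on the Oracle side.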
But I would also look into exporting everything to files and importing them into SQL Server with BCP or BULK INSERT. Both of these permit you to set a batch size, so that there is a commit after every five million rows or whatever.
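As a sketch of the file route, assuming the Oracle data has been exported to a pipe-delimited file (the path, terminators and table name are all hypothetical):

    BULK INSERT dbo.TargetTable
    FROM 'C:\Temp\bigtable.dat'
    WITH (FIELDTERMINATOR = '|',
          ROWTERMINATOR   = '\n',
          BATCHSIZE       = 5000000,  -- commit after every five million rows
          TABLOCK);                   -- can enable minimally logged inserts
                                      -- under simple or bulk-logged recovery

With BCP the corresponding option is -b, for instance:

    bcp YourDB.dbo.TargetTable in C:\Temp\bigtable.dat -c -t"|" -b 5000000 -T -S yourserver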