We have recently been looking at the feasibility of refactoring our BigQuery streaming process, which is a traditional .NET Windows console application, into .NET Core, so that it can be distributed in containers and easily orchestrated on Google Container Engine.
While doing research online recently, I found an issue raised here saying that ServiceAccountCredential does not yet support .NET Core. This is a deal breaker for us.
Are there any plans for .NET Core support now? Apart from ServiceAccountCredential, is there anything else in the BigQuery API that is not supported on .NET Core?
For reference, this is how our process creates the credential. Is any workaround available?
private void InitBigQueryService()
{
    _credential = new ServiceAccountCredential(
        new ServiceAccountCredential.Initializer(_account.ClientEmailId)
        {
            Scopes = new[] { BigqueryService.Scope.Bigquery, BigqueryService.Scope.BigqueryInsertdata }
        }.FromCertificate(_account.P12Key));

    BigQueryService = new BigqueryService(new BaseClientService.Initializer
    {
        HttpClientInitializer = _credential,
        ApplicationName = _applicationName,
    });
}
In addition to Chris's answer, other things to note when using BigQuery from Container Engine:
Assuming your cluster has been initialized with the right scopes, you should be able to use Application Default Credentials from Container Engine, so you don't even need the p12 file. This is what we did for our Cloud Next 2017 demo: when developing locally I had a service account JSON file and set the GOOGLE_APPLICATION_CREDENTIALS environment variable, and I didn't need to do anything extra when deploying to either Container Engine or App Engine Flexible Environment. You can use GoogleCredential.GetApplicationDefaultAsync() to retrieve the credentials. (It's safe to use .Result on that if you're in a synchronous context.) That demo happened to be using Datastore by the time we were running it, but at another point it used BigQuery, so this definitely works.
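As a rough sketch of that approach, here's how the question's InitBigQueryService could be rewritten around Application Default Credentials instead of the p12 certificate. This assumes the same Google.Apis.Bigquery.v2 package; the factory class and parameter names are illustrative, not part of any API.

```csharp
using Google.Apis.Auth.OAuth2;
using Google.Apis.Bigquery.v2;
using Google.Apis.Services;

public static class BigQueryServiceFactory
{
    public static BigqueryService Create(string applicationName)
    {
        // Picks up GOOGLE_APPLICATION_CREDENTIALS locally, or the cluster's
        // built-in service account on Container Engine / App Engine Flexible.
        // .Result is fine here because this factory is synchronous.
        GoogleCredential credential = GoogleCredential.GetApplicationDefaultAsync().Result;

        // A JSON service account key needs explicit scopes; credentials obtained
        // from the GCP metadata server usually don't.
        if (credential.IsCreateScopedRequired)
        {
            credential = credential.CreateScoped(
                BigqueryService.Scope.Bigquery,
                BigqueryService.Scope.BigqueryInsertdata);
        }

        return new BigqueryService(new BaseClientService.Initializer
        {
            HttpClientInitializer = credential,
            ApplicationName = applicationName,
        });
    }
}
```

Because the credential comes from the environment rather than a bundled file, the same binary runs unchanged both locally and inside a container.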
Additionally, using BigQuery from .NET is now much simpler thanks to the BigQuery wrapper library I've been working on. It's only in beta at the moment, but I'd expect it to simplify your code significantly, as it handles all the fiddly aspects of the JSON format for rows with record fields and the like.
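To give a feel for the wrapper library, here's a minimal sketch using the Google.Cloud.BigQuery.V2 package (beta at the time of writing). The project ID, dataset, and table names are placeholders; ExecuteQuery returns row objects indexed by field name, so there's no manual JSON handling.

```csharp
using Google.Cloud.BigQuery.V2;

public static class QuerySketch
{
    public static void Run()
    {
        // Uses Application Default Credentials under the hood.
        BigQueryClient client = BigQueryClient.Create("your-project-id");

        // ExecuteQuery handles job submission, polling, and pagination;
        // each row exposes fields (including record fields) by name.
        BigQueryResults results = client.ExecuteQuery(
            "SELECT name, score FROM `your_dataset.your_table` LIMIT 10",
            parameters: null);

        foreach (BigQueryRow row in results)
        {
            System.Console.WriteLine($"{row["name"]}: {row["score"]}");
        }
    }
}
```

Compare that with the raw API, where the same query would involve building a job configuration and decoding TableRow/TableCell JSON by hand.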