Could not load file or assembly 'System.Web.Services' in C# .NET Core
Create Date: April 13, 2021 at 08:13 PM         | Tag: CSHARP         | Author Name: Sun, Charles |
How to prevent ReflectionTypeLoadException when calling Assembly.GetTypes()
One fairly nasty way would be:
Type[] types;
try
{
    types = asm.GetTypes();
}
catch (ReflectionTypeLoadException e)
{
    types = e.Types;
}

foreach (var t in types.Where(t => t != null))
{
    ...
}
It's definitely annoying to have to do this though. You could use an extension method to make it nicer in the "client" code:
public static IEnumerable<Type> GetLoadableTypes(this Assembly assembly)
{
    // TODO: Argument validation
    try
    {
        return assembly.GetTypes();
    }
    catch (ReflectionTypeLoadException e)
    {
        return e.Types.Where(t => t != null);
    }
}
You may well wish to move the return statement out of the catch block - I'm not terribly keen on it being there myself, but it probably is the shortest code...
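For context, here is a minimal sketch of how the extension method might be used from client code. IPlugin is a hypothetical interface introduced here purely for illustration; it is not part of the original answer:
// IPlugin is a hypothetical marker interface, used only for this sketch.
public interface IPlugin { }

// Scan every assembly loaded into the AppDomain for concrete types
// implementing IPlugin, silently skipping types that failed to load.
var pluginTypes = AppDomain.CurrentDomain.GetAssemblies()
    .SelectMany(asm => asm.GetLoadableTypes())
    .Where(t => typeof(IPlugin).IsAssignableFrom(t) && t.IsClass && !t.IsAbstract);

foreach (var type in pluginTypes)
{
    Console.WriteLine(type.FullName);
}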
Create nested app.config files using Visual Studio
Create Date: March 30, 2021 at 07:50 PM         | Tag: CSHARP         | Author Name: Sun, Charles |
Visual Studio: differentiate app.config for debug and release mode
Unload the project in Solution Explorer via the context menu.
Edit the .csproj file via the context menu and add this:
<PropertyGroup>
  <AppConfig>App.$(Configuration).config</AppConfig>
</PropertyGroup>
I used the way below, and haven't tested the one above yet:
<ItemGroup>
  <None Include="App.config">
    <SubType>Designer</SubType>
  </None>
  <None Include="App.Debug.config">
    <DependentUpon>App.config</DependentUpon>
  </None>
  <None Include="App.QA.config">
    <DependentUpon>App.config</DependentUpon>
  </None>
  <None Include="App.Release.config">
    <DependentUpon>App.config</DependentUpon>
  </None>
  <None Include="App.Staging.config">
    <DependentUpon>App.config</DependentUpon>
  </None>
</ItemGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'QA|AnyCPU'">
  <DebugSymbols>true</DebugSymbols>
  <OutputPath>bin\QA\</OutputPath>
  <DefineConstants>DEBUG;TRACE</DefineConstants>
  <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  <DebugType>full</DebugType>
  <PlatformTarget>AnyCPU</PlatformTarget>
  <ErrorReport>prompt</ErrorReport>
  <CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
  <Prefer32Bit>false</Prefer32Bit>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'QA|x64'">
  <DebugSymbols>true</DebugSymbols>
  <OutputPath>bin\x64\QA\</OutputPath>
  <DefineConstants>DEBUG;TRACE</DefineConstants>
  <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  <DebugType>full</DebugType>
  <PlatformTarget>x64</PlatformTarget>
  <ErrorReport>prompt</ErrorReport>
  <CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
  <Prefer32Bit>true</Prefer32Bit>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)|$(Platform)' == 'QA|x86'">
  <DebugSymbols>true</DebugSymbols>
  <OutputPath>bin\x86\QA\</OutputPath>
  <DefineConstants>DEBUG;TRACE</DefineConstants>
  <TreatWarningsAsErrors>true</TreatWarningsAsErrors>
  <DebugType>full</DebugType>
  <PlatformTarget>x86</PlatformTarget>
  <ErrorReport>prompt</ErrorReport>
  <CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
  <Prefer32Bit>true</Prefer32Bit>
</PropertyGroup>
<Target Name="AfterBuild">
  <Delete Files="$(TargetDir)$(TargetFileName).config" />
  <Copy SourceFiles="$(ProjectDir)\App.$(Configuration).config" DestinationFiles="$(TargetDir)$(TargetFileName).config" />
</Target>
How to select different app.config for several build configurations
2. Fiddle with the .proj file - copy-rename a whole new config file
Originally taken from here. It's a custom MSBuild target that you can embed in a Visual Studio project file. Copy and paste the following code into the project file:
<Target Name="AfterBuild">
  <Delete Files="$(TargetDir)$(TargetFileName).config" />
  <Copy SourceFiles="$(ProjectDir)\Config\App.$(Configuration).config"
        DestinationFiles="$(TargetDir)$(TargetFileName).config" />
</Target>
Now create a folder in the project called Config and add new files there: App.Debug.config, App.Release.config, and so on. Depending on your configuration, Visual Studio will pick the config file from the Config folder and copy-rename it into the output directory. So if you had a PatternPA.Test.Integration project and the Debug configuration selected, after the build you will find in the output folder a PatternPA.Test.Integration.dll.config file, copied from Config\App.Debug.config and renamed afterwards.
SqlBulkCopy Class (fast for bulk data loads)
Create Date: March 30, 2021 at 03:35 PM         | Tag: CSHARP         | Author Name: Sun, Charles |
SQL Bulk copy method to insert large amount of data to the sql database
I was recently tasked with a project at a company to update an SQL Server 2008 database with large amounts of data each day. The task at first seemed daunting because the files exceeded well over 400,000 records and several needed processing daily. I first tried LINQ to SQL, but with that amount of data the inserts performed slowly, to say the least. Then I remembered the SqlBulkCopy class. SqlBulkCopy lets you efficiently bulk load a SQL Server table with data from another source. The SqlBulkCopy class can write data only to SQL Server tables, but the data source is not limited to SQL Server; any data source can be used, as long as the data can be loaded into a DataTable instance or read with an IDataReader instance. For this example the file will contain roughly 1000 records, but this code can handle large amounts of data.
To begin with let’s create a table in SQL Server that will hold the data. Copy the following T-SQL into SQL Server to create your table:
CREATE TABLE [dbo].[Censis](
    [Suburb] [varchar](200) NULL,
    [NotStated] [int] NULL,
    [NotApplicable] [int] NULL,
    [Fishing] [int] NULL,
    [Mining] [int] NULL,
    [Manufacturing] [int] NULL,
    [Electricity] [int] NULL,
    [Construction] [int] NULL
) ON [PRIMARY]
GO
The table above will hold Census data that is freely available to download in Australia.
The next item to do is create a console application that will bulk load the data. Open Visual Studio 2008 and choose File > New > Project > Windows > Console Application.
Before moving on, to explain the code I have to work backwards and explain the final method that bulk loads data. SqlBulkCopy has a method called WriteToServer. One of the overloads of this method takes a DataTable as the parameter. Because a DataTable contains rows and columns, this seemed like a logical choice for the task I was facing.
Jumping back to the example, we now know we need to create a DataTable that contains the information from the text file. The code below demonstrates how to do this:
DataTable dt = new DataTable();
string line = null;
int i = 0;

using (StreamReader sr = File.OpenText(@"c:\temp\table1.csv"))
{
    while ((line = sr.ReadLine()) != null)
    {
        string[] data = line.Split(',');
        if (data.Length > 0)
        {
            if (i == 0)
            {
                foreach (var item in data)
                {
                    dt.Columns.Add(new DataColumn());
                }
                i++;
            }
            DataRow row = dt.NewRow();
            row.ItemArray = data;
            dt.Rows.Add(row);
        }
    }
}
VB.NET
Dim dt As New DataTable()
Dim line As String = Nothing
Dim i As Integer = 0

Using sr As StreamReader = File.OpenText("c:\temp\table1.csv")
    line = sr.ReadLine()
    Do While line IsNot Nothing
        Dim data() As String = line.Split(","c)
        If data.Length > 0 Then
            If i = 0 Then
                For Each item In data
                    dt.Columns.Add(New DataColumn())
                Next item
                i += 1
            End If
            Dim row As DataRow = dt.NewRow()
            row.ItemArray = data
            dt.Rows.Add(row)
        End If
        line = sr.ReadLine()
    Loop
End Using
In the code above, I created a DataTable that will store all the information from the CSV file. The CSV file resides in the C:\Temp directory. I am using a StreamReader object to open the file and read each line in the file. Each line is then split into a string array, and that string array is assigned to each DataRow as the ItemArray value, which sets the values for the row through the array.
When the file has been read, the next thing to do is use the SqlBulkCopy class to insert the data into SQL Server. The following code demonstrates how to do this:
using (SqlConnection cn = new SqlConnection(ConfigurationManager.ConnectionStrings[
    "ConsoleApplication3.Properties.Settings.daasConnectionString"].ConnectionString))
{
    cn.Open();
    using (SqlBulkCopy copy = new SqlBulkCopy(cn))
    {
        copy.ColumnMappings.Add(0, 0);
        copy.ColumnMappings.Add(1, 1);
        copy.ColumnMappings.Add(2, 2);
        copy.ColumnMappings.Add(3, 3);
        copy.ColumnMappings.Add(4, 4);
        copy.DestinationTableName = "Censis";
        copy.WriteToServer(dt);
    }
}
VB.NET
Using cn As New SqlConnection(ConfigurationManager.ConnectionStrings( _
        "ConsoleApplication3.Properties.Settings.daasConnectionString").ConnectionString)
    cn.Open()
    Using copy As New SqlBulkCopy(cn)
        copy.ColumnMappings.Add(0, 0)
        copy.ColumnMappings.Add(1, 1)
        copy.ColumnMappings.Add(2, 2)
        copy.ColumnMappings.Add(3, 3)
        copy.ColumnMappings.Add(4, 4)
        copy.DestinationTableName = "Censis"
        copy.WriteToServer(dt)
    End Using
End Using
SqlBulkCopy uses ADO.NET to connect to a database to bulk load the data. I have created a SqlConnection object, and that object reference is used to create the SqlBulkCopy object. The DestinationTableName property references a table in the database where the data is to be loaded. A handy feature of SqlBulkCopy is the SqlBulkCopyColumnMappingCollection. Column mappings define the relationships between columns in the data source and columns in the destination, which is handy if the data source file has columns that don't need to be inserted into the database. Column mappings can be set by index, as in the example above, or by column name. Using the index is handy when you're working with files that contain no column names; when mapping by index, make sure the DataTable columns and the SQL table columns are in the same order. Finally, the data is sent to the database by calling the WriteToServer method.
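As a rough sketch of the name-based alternative (it assumes the DataTable was built with named columns, e.g. dt.Columns.Add("Suburb"), which the CSV-loading example above does not do):
// Sketch only: assumes named DataTable columns, which the example above omits.
using (SqlBulkCopy copy = new SqlBulkCopy(cn))
{
    copy.DestinationTableName = "Censis";
    copy.ColumnMappings.Add("Suburb", "Suburb");        // source column -> destination column
    copy.ColumnMappings.Add("NotStated", "NotStated");
    copy.WriteToServer(dt);
}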
Hence, using SqlBulkCopy is much faster than any other insertion method.
As you are asking about fewer than 10 records, I would suggest you use a simple INSERT query. But if you want to use SqlBulkCopy, then first you should know when to use it.
BULK INSERT
The BULK INSERT command is the in-process method for bringing data from a text file into SQL Server. Because it runs in process with Sqlservr.exe, it is a very fast way to load data files into SQL Server.
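For illustration, here is a minimal sketch of issuing BULK INSERT from C# through an ordinary SqlCommand. The connectionString variable and the file path are assumptions, and the file must be readable by the SQL Server service account, since the statement runs server-side:
// Minimal sketch: connectionString is assumed to be defined elsewhere, and
// c:\temp\table1.csv must be accessible to the SQL Server process itself,
// because BULK INSERT reads the file on the server, not on the client.
using (SqlConnection cn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    @"BULK INSERT dbo.Censis
      FROM 'c:\temp\table1.csv'
      WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')", cn))
{
    cn.Open();
    cmd.ExecuteNonQuery();
}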
Is SqlBulkCopy still faster than Dapper?
Disclaimer: I'm the owner of the project Dapper Plus
Dapper Plus for SQL Server/Azure uses SqlBulkCopy under the hood when there are enough entities to save; otherwise it uses a SQL derived table.
This article is about Entity Framework, but it's the same strategy for Dapper if you want more information: Entity Framework How to Bulk Insert in SQL Server
So, our library obviously does not outperform SqlBulkCopy - the performance is the same - but our library makes it easier to use.
The library also supports:
- BulkUpdate
- BulkDelete
- BulkMerge
using SqlBulkCopy and Temporary Table tricks.
Is there a faster way to use SqlBulkCopy than using a DataTable?
The delay is caused because you have to buffer everything into a DataTable before sending it to the server. To get better performance you should send the records to SqlBulkCopy immediately, and let the class use its own buffering and batching.
SqlBulkCopy can work with an IDataReader. All ADO.NET data readers implement this interface, so you can push data that you read from any data reader to SqlBulkCopy.
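For example, here is a minimal sketch (the table name and connection strings are assumptions for illustration) of streaming rows from one SQL Server table into another without ever materializing a DataTable:
// Sketch: stream rows from a source table straight into a destination table.
// sourceConnectionString, targetConnectionString, and dbo.Censis are assumed.
using (SqlConnection source = new SqlConnection(sourceConnectionString))
using (SqlConnection target = new SqlConnection(targetConnectionString))
{
    source.Open();
    target.Open();

    using (SqlCommand cmd = new SqlCommand("SELECT * FROM dbo.Censis", source))
    using (SqlDataReader reader = cmd.ExecuteReader())
    using (SqlBulkCopy bcp = new SqlBulkCopy(target))
    {
        bcp.DestinationTableName = "dbo.Censis";
        bcp.WriteToServer(reader);  // rows are streamed, never buffered client-side
    }
}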
In other cases, assuming you have an IEnumerable of your objects, you can use Marc Gravell's ObjectReader from the FastMember package to create an IDataReader on top of the IEnumerable. This data reader does not load everything at once, so no data is cached until SqlBulkCopy asks for it.
Copying Marc Gravell's example:
IEnumerable<SomeType> data = ...

using (var bcp = new SqlBulkCopy(connection))
using (var reader = ObjectReader.Create(data, "Id", "Name", "Description"))
{
    bcp.DestinationTableName = "SomeTable";
    bcp.WriteToServer(reader);
}
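Note that ObjectReader lives in the FastMember NuGet package; the member names passed to ObjectReader.Create must match property (or field) names on the element type, and they determine which columns the reader exposes and in what order.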
In my test, BULK INSERT was much faster. After an hour using SqlBulkCopy, I was maybe a quarter of the way through my data, and I had finished writing the alternative method (and having lunch). By the time I finished writing this post (~3 minutes), BULK INSERT was about a third of the way through.
For anyone who is looking at this as a reference, it is also worth mentioning that the upload is faster without a primary key.
It should be noted that one of the major causes for this could be that the server was a significantly more powerful computer, and that this is not an analysis of the efficiency of the algorithm. However, I would still recommend using BULK INSERT, as the average server is probably significantly faster than the average desktop computer.