Incorrect string value: '\xEF\xBF\xBD' for column

Time: 2012-06-22 15:16:45

Tags: c# mysql

I have a table that needs to handle a variety of characters, including Ø, ®, etc.

I have set my table's default collation to utf8, and all columns use the table default, but when I try to insert these characters I get the error: Incorrect string value: '\xEF\xBF\xBD' for column 'buyerName' at row 1

My connection string is defined as

string mySqlConn = "server="+server+";user="+username+";database="+database+";port="+port+";password="+password+";charset=utf8;";

I have no idea why I am still seeing the error. Am I missing something in the .NET connector or in my MySQL setup?
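For reference, the character set the server actually negotiates for a connection can be checked with a quick query. A minimal sketch, reusing the mySqlConn string above (requires using System; and using MySql.Data.MySqlClient;):

using (var conn = new MySqlConnection(mySqlConn))
{
    conn.Open();
    // Lists character_set_client, character_set_connection,
    // character_set_server, etc. for the current session.
    var cmd = new MySqlCommand("SHOW VARIABLES LIKE 'character_set%'", conn);
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine(reader.GetString(0) + " = " + reader.GetString(1));
    }
}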

--EDIT--

My (new) C# insert statement looks like this:

MySqlCommand insert = new MySqlCommand(
    "INSERT INTO fulfilled_Shipments_Data " +
    "(amazonOrderId,merchantOrderId,shipmentId,shipmentItemId," +
    "amazonOrderItemId,merchantOrderItemId,purchaseDate," + ...

    "VALUES (@amazonOrderId,@merchantOrderId,@shipmentId,@shipmentItemId," +
    "@amazonOrderItemId,@merchantOrderItemId,@purchaseDate," +
    "@paymentsDate,@shipmentDate,@reportingDate,@buyerEmail,@buyerName," + ...


       insert.Parameters.AddWithValue("@amazonorderId",lines[0]);
       insert.Parameters.AddWithValue("@merchantOrderId",lines[1]); 
       insert.Parameters.AddWithValue("@shipmentId",lines[2]);
       insert.Parameters.AddWithValue("@shipmentItemId",lines[3]);
       insert.Parameters.AddWithValue("@amazonOrderItemId",lines[4]);
       insert.Parameters.AddWithValue("@merchantOrderItemId",lines[5]);
       insert.Parameters.AddWithValue("@purchaseDate",lines[6]);
       insert.Parameters.AddWithValue("@paymentsDate",lines[7]);

 insert.ExecuteNonQuery();

Assuming this is the correct way to use parameterized statements, it still gives the error:

 "Incorrect string value: '\xEF\xBF\xBD' for column 'buyerName' at row 1"

Any other ideas?

4 Answers:

Answer 0 (score: 18):

\xEF\xBF\xBD is the UTF-8 encoding of the Unicode character U+FFFD. This is a special character, also known as the "replacement character". A quote from the wikipedia page about the special unicode characters:

  The replacement character (often represented by a black diamond with a white question mark) is a symbol found in the Unicode standard at code point U+FFFD in the Specials table. It is used to indicate problems when a system is unable to decode a stream of data to a correct symbol. It is most commonly seen when a font does not contain a character, but it is also seen when the data is invalid and does not match any character.

So it looks like your data source contains corrupted data. It is also possible that you are trying to read the data with the wrong encoding. Where do the lines come from?
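A quick illustrative check in C# (requires using System; and using System.Text;) confirms that U+FFFD encoded as UTF-8 is exactly the byte sequence from the error message:

byte[] bytes = Encoding.UTF8.GetBytes("\uFFFD");  // the replacement character
Console.WriteLine(BitConverter.ToString(bytes));  // prints "EF-BF-BD"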

If you can't fix the data, and your input really does contain invalid characters, you could just remove the replacement characters:

lines[n] = lines[n].Replace("\xFFFD", "");
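If the replacement characters are introduced while reading the input file, a better fix is to read the file with the encoding it was actually written in, so they never appear in the first place. A sketch, where "input.txt" and windows-1252 are assumptions to be replaced with your real file and its real encoding (requires using System.IO; and using System.Text;):

// Read with an explicit source encoding instead of the default (UTF-8),
// so no bytes are silently decoded to U+FFFD.
string[] lines = File.ReadAllLines("input.txt", Encoding.GetEncoding("windows-1252"));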

Answer 1 (score: 3):

Never, never, never create SQL statements like that. It is wide open to SQL injection.

I'm adding this as an answer because it is a fundamental mistake, and you will probably need to rewrite a good portion of your program.

That is not how you supply parameters to a SQL statement, and it is not worth anyone answering your actual question until you switch to parameterized queries, which will probably fix your problem anyway.

Answer 2 (score: 2):

Mattmanser is right: never write a SQL query by concatenating the parameters directly into the query string. An example of a parameterized query is:

string lastname = "Doe";
double height = 6.1;
DateTime birthDate = new DateTime(1978, 4, 18);

var connection = new MySqlConnection(connStr);

try
{
    connection.Open();

    var command = new MySqlCommand(
        "SELECT * FROM tblPerson WHERE LastName = @Name AND Height > @Height AND BirthDate < @BirthDate", connection);

    command.Parameters.AddWithValue("@Name", lastname);
    command.Parameters.AddWithValue("@Height", height);
    command.Parameters.AddWithValue("@Name", birthDate);

    MySqlDataReader reader = command.ExecuteReader();
    ...
}
finally
{
    connection.Close();
}
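A note on the try/finally above: wrapping the connection, command, and reader in using blocks is equivalent and harder to get wrong, because Dispose closes the connection even when an exception is thrown. The same query in that form (a sketch):

using (var connection = new MySqlConnection(connStr))
using (var command = new MySqlCommand(
    "SELECT * FROM tblPerson WHERE LastName = @Name AND Height > @Height AND BirthDate < @BirthDate",
    connection))
{
    connection.Open();
    command.Parameters.AddWithValue("@Name", lastname);
    command.Parameters.AddWithValue("@Height", height);
    command.Parameters.AddWithValue("@BirthDate", birthDate);

    using (MySqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // ... process the current row
        }
    }
}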

Answer 3 (score: 0):

For those who have a similar problem using PHP, try the utf8_encode() function. It just works!