Serializing/deserializing a DataTable containing GUIDs with Newtonsoft.Json

Posted: 2017-03-30 15:19:43

Tags: json serialization asp.net-web-api json.net

I have a Web API solution that I use against a variety of databases. Everything worked fine until I hit a table with a GUID column. I eventually narrowed the problem down to the Serialize/Deserialize step: when a DataTable is serialized to JSON and then deserialized back into a DataTable, the data type of the column containing the GUIDs becomes System.String instead of the original System.Guid.

A simple example:

System.Data.DataTable myData = new DataTable();
myData.Columns.Add("ID", typeof(System.Guid));
myData.Columns.Add("Name", typeof(System.String));
myData.Rows.Add(System.Guid.NewGuid(), "Name1");

// Reports: System.Guid
System.Diagnostics.Debug.Write(myData.Columns["ID"].DataType.FullName);

string mySerializedString = Newtonsoft.Json.JsonConvert.SerializeObject(myData);
myData = Newtonsoft.Json.JsonConvert.DeserializeObject<System.Data.DataTable>(mySerializedString);

// Reports: System.String
System.Diagnostics.Debug.Write(myData.Columns["ID"].DataType.FullName);

This is a generic API, so I don't know ahead of time whether the table being passed in contains a GUID column, or any other data type that might trip up JSON.

Hopefully I'm just missing something silly and this is a simple fix?

**** UPDATE ****

Thanks for all the tips. In the meantime I've run into another issue that may be the deal-breaker.

When you initially load a table from the database, every RowState is Unchanged. After the round trip through serialization, however, every RowState comes back as "Added".

So, for example, I query one record, modify it, and send it back. The DataTable now has that record flagged as "Added", which violates the primary key constraint and blows up.

This is bad!
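The RowState problem can be reproduced in isolation; here is a minimal sketch (assuming only that Newtonsoft.Json is referenced):

```csharp
using System;
using System.Data;
using Newtonsoft.Json;

public static class RowStateDemo
{
    public static void Main()
    {
        DataTable table = new DataTable("T");
        table.Columns.Add("Name", typeof(string));
        table.Rows.Add("Name1");
        table.AcceptChanges(); // simulate a freshly loaded table: RowState = Unchanged

        Console.WriteLine(table.Rows[0].RowState); // Unchanged

        string json = JsonConvert.SerializeObject(table);
        DataTable roundTripped = JsonConvert.DeserializeObject<DataTable>(json);

        // After the round trip, the row comes back as if it were brand new.
        Console.WriteLine(roundTripped.Rows[0].RowState); // Added
    }
}
```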

I'm basically looking at some huge kludge on both ends, where I pass a dictionary of every property I can find through the JSON pipeline and then rebuild them all on the other side.

All of this while trying to stay generic, without knowing what kind of data I'll be handed.

&*(^%it!

3 answers:

Answer 0 (score: 1):

There is no way in JSON to indicate that something is a GUID. From json.org:

  A value can be a string in double quotes, or a number, or true or false or null, or an object or an array. These structures can be nested.

Because you aren't deserializing back into something with defined types, the deserializer has no way of knowing that it should try to convert the "GUID as a string" back into an actual GUID.

If you deserialize into an object that has a Guid-typed property, it will try to parse the string into a Guid for that property, but the structure of a DataTable is not part of the JSON being transmitted.
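To illustrate the difference typed deserialization makes, here is a minimal sketch (the `Record` class is hypothetical, invented for this example):

```csharp
using System;
using Newtonsoft.Json;

// Hypothetical typed equivalent of one table row; the declared Guid
// property tells Json.NET what to parse the string back into.
public class Record
{
    public Guid ID { get; set; }
    public string Name { get; set; }
}

public static class TypedRoundTrip
{
    public static void Main()
    {
        Record original = new Record { ID = Guid.NewGuid(), Name = "Name1" };

        string json = JsonConvert.SerializeObject(original);
        Record restored = JsonConvert.DeserializeObject<Record>(json);

        // The Guid survives because the property type drives the
        // conversion -- unlike an untyped DataTable column.
        Console.WriteLine(restored.ID.GetType().FullName); // System.Guid
        Console.WriteLine(restored.ID == original.ID);     // True
    }
}
```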

Answer 1 (score: 1):

As @Craig H pointed out, JSON has a very limited type system: it can distinguish numbers, booleans, strings, objects and arrays, plus the null value. If you need anything more specific, you have to embed the type information in the JSON itself as metadata.

Json.NET has a TypeNameHandling setting that can be used to write type information into the JSON. Unfortunately, the DataTableConverter that ships with Json.NET does not appear to respect this setting. To work around it, you can use a custom JsonConverter for DataTables, such as the one from How to include column metadata in JSON for an empty DataTable.

Use it like this:

JsonSerializerSettings settings = new JsonSerializerSettings();
settings.Converters.Add(new CustomDataTableConverter());

string mySerializedString = JsonConvert.SerializeObject(myData, settings);
myData = JsonConvert.DeserializeObject<DataTable>(mySerializedString, settings);

Sample fiddle here: https://dotnetfiddle.net/wXNy9o

Answer 2 (score: 0):

Thanks everyone for the tips. In dealing with JSON and .NET Datatables, I have had to overcome numerous different challenges.

  1. Tables that return 0 rows don't serialize, so you end up with a DataTable with 0 columns.
  2. JSON loses the DataType of the columns.
  3. JSON loses the RowState of the records.
  4. JSON loses the Original\Proposed values of a Modified row, so it won't save.
  5. JSON converts the DataType Byte[] to a Base64 string, but doesn't convert it back.
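Point 5 can also be reproduced in isolation; a minimal sketch (assuming only that Newtonsoft.Json is referenced):

```csharp
using System;
using System.Data;
using Newtonsoft.Json;

public static class ByteArrayDemo
{
    public static void Main()
    {
        DataTable table = new DataTable("T");
        table.Columns.Add("Blob", typeof(byte[]));
        table.Rows.Add(new object[] { new byte[] { 1, 2, 3 } });

        string json = JsonConvert.SerializeObject(table);
        DataTable back = JsonConvert.DeserializeObject<DataTable>(json);

        // The column type is inferred from the JSON token, so the Base64
        // text stays a string and has to be decoded back manually.
        Console.WriteLine(back.Columns["Blob"].DataType.FullName); // System.String
        byte[] restored = Convert.FromBase64String((string)back.Rows[0]["Blob"]);
        Console.WriteLine(restored.Length); // 3
    }
}
```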

As a result, I have found a solution that seems to overcome all of these in one shot. To summarize the solution:

  • Use the built-in XML feature of the DataTable to restore the schema
  • Save off the original RowState in order to restore it later
  • Save off the original values of Modified records in order to restore them later

Since we all like code...

The class that is going to be serialized (there are more properties, but these are the focus):

    public System.Data.DataTable Data { get; set; }

    [Newtonsoft.Json.JsonProperty]
    private string _DataTableSchema { get; set; }

    [Newtonsoft.Json.JsonProperty]
    private List<DeconstructedDataRow> _DeconstructedDataRows { get; set; }

    private class DeconstructedDataRow
    {
        public System.Data.DataRowState RowState { get; set; }
        public List<object> RowOriginalValues { get; set; }

        public DeconstructedDataRow()
        {
            RowState = System.Data.DataRowState.Unchanged;
            RowOriginalValues = new List<object>();
        }
    }

On the sending side, BEFORE serializing the table, you call .Data_Deconstruct....

    public void Data_Deconstruct()
    {  
        //Couple of Pre-Checks
        _DataTableSchema = String.Empty;
        _DeconstructedDataRows.Clear(); 
        if (this.Data == null || this.Data.Columns.Count == 0)
        {
            return;
        }

        //We need to mess around a bit so instead of tampering with their original table, we work with a copy
        System.Data.DataTable myWorkingData = this.Data.Copy();


        //In order to serialize to XML, the table MUST have a name
        if (String.IsNullOrWhiteSpace(myWorkingData.TableName) == true)
        {
            myWorkingData.TableName = System.Guid.NewGuid().ToString();
        }

        //JSON doesn't carry over the schema of the table, so we lose the specific 
        //DataTypes of each column. So we use the built-in method of the DataTable object 
        //to create an XML\string version of its schema 
        System.IO.StringWriter myStringWriter = new System.IO.StringWriter();
        myWorkingData.WriteXmlSchema(myStringWriter, true);
        _DataTableSchema = myStringWriter.ToString();


        //JSON and the process of serializing and deserializing doesn't carry over 
        //the proper RowState for each record. In addition, for records that
        //have been Modified, we lose the Original values, and records that 
        //have been Deleted can't be serialized at all. So this is a KLUDGE that
        //seems to sort all this out.
        for (Int32 intRowIndex = 0; intRowIndex < myWorkingData.Rows.Count; intRowIndex++)
        {
            DeconstructedDataRow myDeconstructedDataRow = new DeconstructedDataRow();

            //So start by saving off the current RowState
            myDeconstructedDataRow.RowState = myWorkingData.Rows[intRowIndex].RowState;

            //If the RowState is DELETED, the ORIGINAL record will not serialize, 
            //so we reject the changes for now and restore the Deleted state later
            if (myDeconstructedDataRow.RowState == System.Data.DataRowState.Deleted)
            {
                this.Data.Rows[intRowIndex].RejectChanges();
            }

            //If the RowState is MODIFIED, then we have to restore the ORIGINAL values
            //when we restore this record. Without the Original Values, the record won't 
            //update and even if we force it, it will error out because of 'concurrency' errors.
            if (myDeconstructedDataRow.RowState == System.Data.DataRowState.Modified)
            {
                myWorkingData.Rows[intRowIndex].RejectChanges();      
                myDeconstructedDataRow.RowOriginalValues.AddRange(myWorkingData.Rows[intRowIndex].ItemArray);
            }

            //And don't forget to add it to our list
            this._DeconstructedDataRows.Add(myDeconstructedDataRow);
        }

        //Clean up our Clone.
        myWorkingData.Dispose();
        myWorkingData = null;
    }

On the receiving side, AFTER deserializing, you call .Data_Reconstruct....

    public void Data_Reconstruct()
    {
        //Couple of Pre-Checks
        if (this.Data == null || String.IsNullOrWhiteSpace(_DataTableSchema) == true)
        {
            return;
        }


        //So first we build a new DataTable with the correct Schema
        System.Data.DataTable myWorkingData = new System.Data.DataTable();
        System.IO.StringReader myStringReader = new System.IO.StringReader(_DataTableSchema);
        myWorkingData.ReadXmlSchema(myStringReader);


        //Now we transfer all the data that was serialized 'as-is' from the existing table to the new one
        foreach (System.Data.DataRow myRow in this.Data.Rows)
        {
            //myWorkingData.ImportRow(myRow);  //Should have been this easy BUT ...

            // JSON converts some data types to a different format, but when it deserializes
            // it doesn't convert them back (not sure why). So we have to account for that,
            // at a performance cost.
            System.Data.DataRow myNewRecord = myWorkingData.NewRow();  //Create a New row from the table with the Proper Schema
            foreach (System.Data.DataColumn myField in myWorkingData.Columns)
            {
                if (myField.DataType.Equals(typeof(System.Byte[])))
                {
                    myNewRecord[myField.ColumnName] = Convert.FromBase64String(Convert.ToString(myRow[myField.ColumnName]));
                }
                else
                {
                    myNewRecord[myField.ColumnName] = myRow[myField.ColumnName];
                }
            }
            myWorkingData.Rows.Add(myNewRecord);

        }

        //We have to accept the changes because all rows are currently marked as "Added" (via JSON as well)
        myWorkingData.AcceptChanges();


        //Now restore their Row States 
        for (Int32 intRowIndex = 0; intRowIndex < myWorkingData.Rows.Count; intRowIndex++)
        {
            switch (_DeconstructedDataRows[intRowIndex].RowState)
            {
                case System.Data.DataRowState.Added:
                    myWorkingData.Rows[intRowIndex].SetAdded();
                    break;

                case System.Data.DataRowState.Deleted:
                    myWorkingData.Rows[intRowIndex].Delete();
                    break;

                case System.Data.DataRowState.Modified:
                    //For Modified, we have to do some kludge stuff or else the UPDATE will not trigger
                    //We start by saving off the Values that are in the Record NOW (aka the New values)
                    object[] objNewValues = myWorkingData.Rows[intRowIndex].ItemArray;

                    //Now we replace those values with the ORIGINAL values we saved off before transporting
                    for (Int32 intFieldIndex = 0; intFieldIndex < this._DeconstructedDataRows[intRowIndex].RowOriginalValues.Count; intFieldIndex++)
                    {
                        if (this._DeconstructedDataRows[intRowIndex].RowOriginalValues[intFieldIndex] == null)
                        {
                            this._DeconstructedDataRows[intRowIndex].RowOriginalValues[intFieldIndex] = DBNull.Value;
                        }
                    }
                    myWorkingData.Rows[intRowIndex].ItemArray = this._DeconstructedDataRows[intRowIndex].RowOriginalValues.ToArray();
                    myWorkingData.Rows[intRowIndex].AcceptChanges();

                    //and Last we replace those Original values with the New Values, which not only
                    //correctly sets the Original\Proposed values, but also changes the RowState to MODIFIED
                    myWorkingData.Rows[intRowIndex].ItemArray = objNewValues; 
                    break;

                default:
                    //These would be the Unchanged
                    break;
            }
        }

        //And finally, we replace the existing Table with our fixed one.
        this.Data = myWorkingData;

    }

To test this, I created 2 records in my database:

Rec1_Delete
Rec2_Modify

Then I loaded the table using the methods above, flagged the delete record for deletion, modified a field in the modify record, and added a brand-new record.

I then sent the DataTable back to the API, and in the database the deleted record was deleted, the modified record was modified, and the new record was added.

As a side note, I expose the actual DataTable because, if you happen to call this from something other than .NET (like JavaScript in an HTML file), you can still use the DataTable directly to read and display data. The whole Deconstruct and Reconstruct was for .NET clients, while still allowing limited access for the others.
