How to wrap an XML body in a SOAP envelope with the WSO2 ESB Enrich mediator

Date: 2017-07-30 09:03:26

Tags: wso2 wso2esb

I need to set a SOAP envelope around an XML body using the Enrich mediator. I created the XML body with a PayloadFactory mediator and captured it in a property:

<property expression="$body/*[1]" name="INPUT_MESSAGE" scope="default" type="OM" xmlns:ns="http://org.apache.synapse/xsd"/>

The body looks like this:

<bas:setMOAttributes xmlns:bas="http://www.3gpp.org/ftp/Specs/archive/32_series/32607/schema/32607-700/BasicCMIRPData">
<queryXpathExp>
    <soap:baseObjectInstance xmlns:soap="http:   //Here is only few lines

Now I need to add the SOAP envelope. I used the Enrich mediator after setting the INPUT_MESSAGE property:

<enrich>
        <source clone="true" type="inline">
            <soapenv:Envelope xmlns:bas="http://www.3gpp.org/ftp/Specs/archive/32_series/32607/schema/32607-700/BasicCMIRPData" xmlns:soap="http://www.alcatel-lucent.com/soap_cm" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
                <soapenv:Header/>
                <soapenv:Body>
                    <list xmlns=""/>
                </soapenv:Body>
            </soapenv:Envelope>
        </source>
        <target type="envelope"/>
    </enrich>
    <enrich>
        <source clone="true" property="INPUT_MESSAGE" type="property"/>
        <target type="body"/>
    </enrich>

But I cannot get the XML body wrapped in the SOAP envelope. How can this be done?

3 Answers:

Answer 0 (score: 3)

Try this, it will work.

Send a request body like this:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
   <soapenv:Header/>
   <soapenv:Body>
      <list><![CDATA[<bas:setMOAttributes xmlns:bas="http://www.3gpp.org/ftp/Specs/archive/32_series/32607/schema/32607-700/BasicCMIRPData">
       <queryXpathExp>
         <soap:baseObjectInstance xmlns:soap="http://www.alcatel-lucent.com/soap_cm">
          hello
         </soap:baseObjectInstance>
       </queryXpathExp>
     </bas:setMOAttributes>]]></list>
   </soapenv:Body>
</soapenv:Envelope>

Use an Enrich mediator in a proxy like this:
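The answer's original proxy configuration did not survive; a minimal sketch of a sequence consistent with the CDATA request above might look like the following. The `$body/list` expression and the `respond` step are assumptions, not the answerer's exact config:

```xml
<inSequence>
    <!-- Capture the <list> element that carries the CDATA payload
         (property name and XPath are illustrative) -->
    <property name="INPUT_MESSAGE" expression="$body/list"
              scope="default" type="OM"/>
    <!-- Replace the current body with the captured content -->
    <enrich>
        <source clone="true" property="INPUT_MESSAGE" type="property"/>
        <target type="body"/>
    </enrich>
    <respond/>
</inSequence>
```

Wrapping the inner message in CDATA keeps the ESB from parsing it on the way in, so it can be re-emitted as the body intact.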


The output looks like this:
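The answer's original output sample was lost; reconstructed from the request above, the expected result would be the inner message standing as the SOAP body (a sketch, assuming the CDATA content is emitted verbatim):

```xml
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
   <soapenv:Body>
      <bas:setMOAttributes xmlns:bas="http://www.3gpp.org/ftp/Specs/archive/32_series/32607/schema/32607-700/BasicCMIRPData">
         <queryXpathExp>
            <soap:baseObjectInstance xmlns:soap="http://www.alcatel-lucent.com/soap_cm">
             hello
            </soap:baseObjectInstance>
         </queryXpathExp>
      </bas:setMOAttributes>
   </soapenv:Body>
</soapenv:Envelope>
```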


Answer 1 (score: 1)

How about using a PayloadFactory mediator for this?

<payloadFactory description="Add Soap Envelop" media-type="xml">
    <format>
        <soapenv:Envelope xmlns:bas="http://www.3gpp.org/ftp/Specs/archive/32_series/32607/schema/32607-700/BasicCMIRPData" xmlns:soap="http://www.alcatel-lucent.com/soap_cm" xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
            <soapenv:Header/>
            <soapenv:Body>
                $1
            </soapenv:Body>
        </soapenv:Envelope>
    </format>
    <args>
        <arg evaluator="xml" expression="get-property('INPUT_MESSAGE')"/>
    </args>
</payloadFactory>

Please have a look at example 5 in the payload factory documentation.

Answer 2 (score: 0)

Using a PayloadFactory will give you a clean solution. If you want to use the Enrich mediator to set the envelope, you first need to save the body in a property and then use that property in the Enrich mediator. Hope that helps.
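The save-then-enrich ordering described above can be sketched as follows. The property name `ORIGINAL_BODY` is illustrative, not from the original answer:

```xml
<!-- 1. Save the current body in a property before touching the envelope -->
<property name="ORIGINAL_BODY" expression="$body/*[1]"
          scope="default" type="OM"/>
<!-- 2. Replace the envelope with the desired one -->
<enrich>
    <source clone="true" type="inline">
        <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
            <soapenv:Header/>
            <soapenv:Body/>
        </soapenv:Envelope>
    </source>
    <target type="envelope"/>
</enrich>
<!-- 3. Restore the saved body inside the new envelope -->
<enrich>
    <source clone="true" property="ORIGINAL_BODY" type="property"/>
    <target type="body"/>
</enrich>
```

Setting the envelope first and the body second matters: enriching the envelope discards whatever body was there, which is why the property must be captured beforehand.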