spring-integration: failed to look up MessageChannel with name 'errorChannel'

Asked: 2016-06-29 13:17:39

Tags: spring-integration

I have a processing chain like this:

 <int:chain input-channel="filesInChannel" output-channel="outJsonMap" >
    <int:transformer id="confCombiner" ref="serviceCombiner" method="addConfig"/>
    <int:header-enricher>
        <int:header name="fileMatchConf" expression="payload.get('matchConf')"/>
        <int:header name="fileName" expression="payload.get('file').getName()"/>
        <int:header name="fileTimeProp" expression="payload.get('fileTimeProp')"/>
    </int:header-enricher>

    <int:transformer expression="payload.get('file')"/>
    <int:transformer id="sasTransformer" ref="serviceTransformer" method="process" />
</int:chain>
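For the SpEL expressions in the header-enricher (`payload.get('matchConf')`, `payload.get('file')`, `payload.get('fileTimeProp')`) to resolve, the upstream `addConfig` transformer must return a map-like payload carrying those keys. The following is only a hypothetical sketch of such a method (the key names come from the config above; the value types are assumptions):

```java
import java.io.File;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of what serviceCombiner.addConfig might return so that
// the header-enricher's SpEL expressions payload.get('matchConf'),
// payload.get('file') and payload.get('fileTimeProp') all resolve.
class ServiceCombinerSketch {

    public Map<String, Object> addConfig(File file) {
        Map<String, Object> payload = new HashMap<>();
        // In the real application this would be an Adr configuration object.
        payload.put("matchConf", "adr-config-placeholder");
        payload.put("file", file);
        payload.put("fileTimeProp", System.currentTimeMillis());
        return payload;
    }

    public static void main(String[] args) {
        Map<String, Object> p =
                new ServiceCombinerSketch().addConfig(new File("sample.sas7bdat"));
        System.out.println(((File) p.get("file")).getName());
    }
}
```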

The last step transforms a SAS file into a POJO. The application works on a large number of files; in our tests we use 1000+ files. I have noticed strange behavior: sometimes the application processes all files smoothly, but sometimes it suddenly stops with an exception like this:

2016-06-29 09:47:05 44591 [Thread-1014] DEBUG org.springframework.jdbc.datasource.DataSourceUtils  - Fetching JDBC Connection from DataSource
2016-06-29 09:47:05 44592 [executor-86] DEBUG org.springframework.beans.factory.support.DefaultListableBeanFactory - Returning cached instance of singleton bean 'integrationHeaderChannelRegistry'
Exception in thread "executor-86" Exception in thread "Thread-1014"
2016-06-29 09:47:05 44112 [executor-31] DEBUG com.epam.parso.impl.SasFileParser  - Column format: $
2016-06-29 09:47:05 44592 [executor-31] DEBUG com.epam.parso.impl.SasFileParser  - Subheader process function name: FORMAT_AND_LABEL_SUBHEADER_INDEX
org.springframework.messaging.core.DestinationResolutionException: failed to look up MessageChannel with name 'errorChannel' in the BeanFactory.; nested exception is org.springframework.beans.factory.BeanCreationNotAllowedException: Error creating bean with name 'errorChannel': Singleton bean creation not allowed while the singletons of this factory are in destruction (Do not request a bean from a BeanFactory in a destroy method implementation!)
at org.springframework.integration.support.channel.BeanFactoryChannelResolver.resolveDestination(BeanFactoryChannelResolver.java:112)
at org.springframework.integration.support.channel.BeanFactoryChannelResolver.resolveDestination(BeanFactoryChannelResolver.java:45)
at org.springframework.integration.channel.MessagePublishingErrorHandler.resolveErrorChannel(MessagePublishingErrorHandler.java:117)
at org.springframework.integration.channel.MessagePublishingErrorHandler.handleError(MessagePublishingErrorHandler.java:80)
at org.springframework.integration.util.ErrorHandlingTaskExecutor$1.run(ErrorHandlingTaskExecutor.java:58)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
2016-06-29 09:47:05 44592 [executor-31] DEBUG com.epam.parso.impl.SasFileParser  - Column format: $
at java.lang.Thread.run(Thread.java:745)
Caused by: org.springframework.beans.factory.BeanCreationNotAllowedException: Error creating bean with name 'errorChannel': Singleton bean creation not allowed while the singletons of this factory are in destruction (Do not request a bean from a BeanFactory in a destroy method implementation!)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:216)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:302)
2016-06-29 09:47:05 44593 [executor-31] DEBUG com.epam.parso.impl.SasFileParser  - Subheader process function name:

In that trace, SasFileParser is being invoked inside serviceTransformer's "process" method.

Here is the implementation of the transformer:

 public JobMessage process(@Payload File file, @Headers MessageHeaders headers) throws IOException {

        JobMessage jobMessage = new JobMessage();
        logger.info("headers:"+headers.get("fileMatchConf"));
        FileInputStream is = null;
        String timestamp_datameerJob = String.valueOf(System.currentTimeMillis());

        try {

            // create message parser, same as in SuccessUnwrapper
            Adr adr = (Adr) headers.get("fileMatchConf");

            is = new FileInputStream(file);
            com.epam.parso.SasFileReader sasFileReader = new SasFileReaderImpl(is);

            String fileName = file.getName();
            String absolutePath = file.getAbsolutePath();
            String dirCode = adr.getDirectoryCode();
            String scenario = "";
            String tableName = "";
            String loadType = adr.getLoadType();

            Path fileP = Paths.get(file.getAbsolutePath());
            BasicFileAttributes attr = Files.readAttributes(fileP, BasicFileAttributes.class);
            Long timestamp_createdDate = attr.creationTime().toMillis();



            // get tableName
            // check load_type (M,R) or (H,S)
            if (loadType.equalsIgnoreCase("m") || loadType.equalsIgnoreCase("r")) {
                tableName = adr.getTargetTableName();
            } else{
                // check which scenario used
                scenario = FileUtil.filePatternScenarioSelector(fileName);
                logger.info("[SCENARIO] " + scenario);

                if(scenario.equals("scenario2")){
                    SimpleDateFormat formatter = new SimpleDateFormat("yyyyMMdd");
                    String createdDate = formatter.format(new java.util.Date (timestamp_createdDate));
                    String mainFileName = fileName.split("\\.")[0];

                    tableName = mainFileName+"_"+createdDate+"_"+ dirCode;
                } else if(scenario.equals("scenario1_3")){
                    tableName = NamingFunctionMapUtil.getInstance().getScenarioOneThree(fileName, dirCode).get("appendDirID");
                }
            }


            // import job
            ImportConstructor imc = new ImportConstructor(sasFileReader.getColumns(), fileName);
            imc.constructImport(absolutePath, dirCode, timestamp_datameerJob); // real environment
            ImportJob importJob = imc.getImportJob();
            logger.info("[IMPORT_JOB] " + importJob);


            // workbook job
            WorkBookConstructor wrk = new WorkBookConstructor(importJob, fileName);
            wrk.constructWorkbook(dirCode, timestamp_datameerJob);
            id.lsa.scb.mappers.workbook.WorkBook workBook = wrk.getWorkBook();
            logger.info("[WORKBOOK_JOB]" + workBook);


            // export job
            ExportConstructor exc = new ExportConstructor(sasFileReader.getColumns(), fileName, workBook);
            exc.constructExportJob(adr.getTargetDatabase(), tableName, dirCode, timestamp_datameerJob);
            id.lsa.scb.mappers.exportjob.ExportJob exportJob = exc.getExportJob();
            logger.info("[EXPORT_JOB]" + exportJob);

            jobMessage.setFileName(fileName);
            jobMessage.setTableName(tableName);
            jobMessage.setCountryCode(adr.getCountryCode());
            jobMessage.setDirectoryPath(adr.getDirectoryPath());
            jobMessage.setDirectoryCode(adr.getDirectoryCode());
            jobMessage.setFilePatternUsed(adr.getFilePattern());
            jobMessage.setTargetDatabase(adr.getTargetDatabase());
            jobMessage.setLoadType(adr.getLoadType());

            jobMessage.setImportJob(importJob);
            jobMessage.setWorkBook(workBook);
            jobMessage.setExportJob(exportJob);

            jobMessage.setFileCredentials(constructCredentialFiles(file));


        } catch (FileNotFoundException e) {
            logger.error("File not found: " + file.getAbsolutePath(), e);
        } catch (IOException ioex) {
            String message = ioex.getMessage();
            logger.error(message);
            logToDb(headers, "failed", "file-convert", message);
        } catch (Exception ex) {
            String message = ex.getMessage();
            logger.error(message, ex);
            logToDb(headers, "failed", "file-convert", message);
        } finally {
            if (is != null) {
                is.close();
            }
        }

        return jobMessage;
    }

In "confCombiner" I also read the SAS file's attributes:

Path fileP = Paths.get(file.getAbsolutePath());
BasicFileAttributes attr = null;
try {
    attr = Files.readAttributes(fileP, BasicFileAttributes.class);
} catch (IOException e) {
    e.printStackTrace();
}
Long fileCreatedTime = attr.creationTime().toMillis();
Long fileModifiedTime = attr.lastModifiedTime().toMillis();
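Note that in the snippet above, if `readAttributes` throws, `attr` stays `null` and the subsequent `attr.creationTime()` call raises a `NullPointerException`. A small sketch of a safer shape (handling the failure inside the same method instead of continuing with a null reference; the `-1` fallback is an assumption, not the app's real behavior):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.attribute.BasicFileAttributes;

// Sketch: keep the attribute read and its use in one guarded scope, so a
// failed read cannot leak a null reference into later code.
class FileAttrSketch {

    static long creationTimeMillis(Path p) {
        try {
            BasicFileAttributes attr = Files.readAttributes(p, BasicFileAttributes.class);
            return attr.creationTime().toMillis();
        } catch (IOException e) {
            // fall back to a sentinel instead of NPE-ing later
            return -1L;
        }
    }

    public static void main(String[] args) {
        // A path that does not exist: readAttributes throws, we get the fallback.
        System.out.println(creationTimeMillis(Paths.get("definitely-missing-file-xyz")));
    }
}
```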

Has anyone run into this exception before? How do you handle it? Can anyone suggest another pattern that fits this flow, or at least point to the root of the problem? It looks strange to me because I use the same sample files for every test run. When this exception occurs, I need the application to keep running so it can process the next files. Thanks.

1 Answer:

Answer 0 (score: 1)

com.epam.parso.impl.SasFileParser  - Column format: $
2016-06-29 09:47:05 44592 [executor-31] DEBUG com.epam.parso.impl.SasFileParser  - Subheader process function name: FORMAT_AND_LABEL_SUBHEADER_INDEX

That part is your code and your own issue; I'm afraid we can't help you there without any further context.

An error like:

Error creating bean with name 'errorChannel': Singleton bean creation not allowed while the singletons of this factory are in destruction (Do not request a bean from a BeanFactory in a destroy method implementation!)

means you have some unexpected destroy() call, which causes the in-flight flow to finish in a bad way.

So, please reconsider how you manage the flow or the application as a whole, especially the shutdown or exit part.
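The exception says a worker thread tried to resolve 'errorChannel' while the BeanFactory's singletons were already being destroyed, i.e. tasks were still in flight when the context began shutting down. The ordering that avoids this is: stop accepting new work first, drain in-flight tasks, and only then tear the container down (with Spring Integration, the analogous move is stopping the inbound adapters before `ConfigurableApplicationContext.close()` runs). A minimal plain-Java sketch of that ordering with a bare `ExecutorService` (the Spring wiring is omitted; this is an illustration of the principle, not the poster's configuration):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of graceful shutdown ordering: stop intake, drain workers,
// then (and only then) destroy the container.
class GracefulShutdownSketch {

    public static void main(String[] args) throws InterruptedException {
        ExecutorService executor = Executors.newFixedThreadPool(4);
        AtomicInteger processed = new AtomicInteger();

        // Simulate 100 files being handed to worker threads.
        for (int i = 0; i < 100; i++) {
            executor.submit(processed::incrementAndGet);
        }

        executor.shutdown();                              // 1. stop accepting new files
        executor.awaitTermination(10, TimeUnit.SECONDS);  // 2. drain in-flight work
        // 3. only now would the application context be closed, so no task
        //    can ask the dying BeanFactory for 'errorChannel'

        System.out.println(processed.get()); // prints 100
    }
}
```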