Setting AutoCompleteMode to "Suggest" has no effect

Date: 2018-11-11 12:48:17

Tags: winforms combobox autocomplete entity-framework-6 bindingsource

I have a few ComboBoxes. I initialize their DataSource through a BindingSource (I'm using EF6). My tasks ComboBox is set up as follows:

[combobox tasks image]

So when I set the AutoCompleteMode property to "Suggest", auto-completion does not work. The strange thing is that I have two other ComboBoxes that are set up exactly the same way (as far as I can tell), and auto-completion works for them.

How can I debug this?

1 Answer:

Answer 0 (score: 1)

To set up a ComboBox that auto-completes the user's input string, we need to configure 3 different properties. From the documentation:

    "Use the AutoCompleteCustomSource, AutoCompleteMode, and AutoCompleteSource properties to create a ComboBox that automatically completes input strings by comparing the prefix being entered to the prefixes of all strings in a maintained source."

AutoCompleteCustomSource: A specialized collection of items that provides the source of the auto-complete strings.
AutoCompleteMode: Defines how the input auto-completion is performed.
AutoCompleteSource: Specifies the source of the completion feature.

The source can be a list of Files or Directories from a defined path, the FileSystem, a HistoryList, RecentlyUsedList URLs or, in this case, AutoCompleteSource.CustomSource, which indicates that the source list of items to complete is provided by an AutoCompleteStringCollection. That collection is assigned to the AutoCompleteCustomSource property and can be filled from a List or another compatible source (an IEnumerable, for example, since this collection implements the IList interface).
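Putting the three properties together, a minimal sketch might look like this (the control name comboBoxTasks and the task names are illustrative, not taken from the question):

```csharp
// Build the collection that supplies the suggestion strings.
// The names here are placeholders for the real task names.
var source = new AutoCompleteStringCollection();
source.AddRange(new[] { "Task A", "Task B", "Task C" });

// Configure the ComboBox to suggest matches as the user types.
comboBoxTasks.AutoCompleteMode = AutoCompleteMode.Suggest;
comboBoxTasks.AutoCompleteSource = AutoCompleteSource.CustomSource;
comboBoxTasks.AutoCompleteCustomSource = source;
```

When the ComboBox is data-bound (as with a BindingSource over EF6 entities), the collection can be filled from the bound list instead, e.g. source.AddRange(tasks.Select(t => t.Name).ToArray()). Also worth checking when one ComboBox completes and an apparently identical one does not: auto-completion requires the user to be able to type in the control, so DropDownStyle must be DropDown rather than DropDownList.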