Writing a math table using a multidimensional array

Time: 2016-08-29 18:47:37

Tags: php html arrays multidimensional-array

I am trying to use for loops to write the math tables starting from 1 into an HTML table and store them in a multidimensional array, but I am not able to do so. Whenever I execute the script, I get the following errors:

    Notice: Undefined offset: 0 in C:\xampp\htdocs\Examples\PHP_Object.php
    Notice: Array to string conversion in C:\xampp\htdocs\Examples\PHP_Object.php

Here is my code:

<?php
$tbl=array();
echo "<table >";
$x=0;
$y=0;
for ($tb = 1 ; $tb <= 10; $tb++) {
    echo"<tr>";
    $tbl[$x] = array();
    for($no = 1;$no <=10; $no++ ) {
        $z = $tb * $no ;
        $tbl[$x][$y];
        echo "<th> $tbl[$x][$y] = $z </th>";
        $y = $y+1;
    }
    echo "</tr>";
    $x = $x+1;
}
echo "</table>";
?>

2 Answers:

Answer 0 (score: 0)

Try this:

The following lines in your code are wrong:

    $tbl[$x][$y];
    echo "<th> $tbl[$x][$y] = $z </th>";

Here is the corrected version:

<?php
$tbl = array();
echo "<table >";
$x = 0;
$y = 0;
for ($tb = 1; $tb <= 10; $tb++) {
    echo "<tr>";
    $tbl[$x] = array();
    for ($no = 1; $no <= 10; $no++) {
        $z = $tb * $no;
        $tbl[$x][$y] = $z;
        echo "<th> " . $tbl[$x][$y] . "</th>";
        $y = $y + 1;
    }
    echo "</tr>";
    $x = $x + 1;
}
echo "</table>";
?>

When you put $tbl[$x][$y] = $z inside double quotes, PHP does not assign anything to the variable. Instead it tries to echo the values of the variables it finds in the string, and the = symbol is treated as a literal string that is printed as-is.
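
For illustration, here is a minimal self-contained sketch of that behaviour (the $grid variable and the values in it are made up for this example):

    <?php
    // Made-up demo of the interpolation pitfall described above.
    $grid = array();
    $grid[0] = array();
    $grid[0][1] = 42;

    // Inside double quotes PHP's simple syntax only parses $grid[0]; the
    // trailing [1] stays literal, so this prints "Array[1] = 42" and
    // typically raises an "Array to string conversion" notice.
    echo "$grid[0][1] = 42\n";

    // Curly braces make PHP evaluate the full index chain: prints "42 = 42".
    echo "{$grid[0][1]} = 42\n";
    ?>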

Answer 1 (score: 0)

This line does nothing:

    $tbl[$x][$y];

You need an assignment to store the value into the array:

    $tbl[$x][$y] = $z;

This line is what produces the notice messages:

    echo "<th> $tbl[$x][$y] = $z </th>";

To interpolate a multidimensional array element into a string, with its indexes evaluated, you have to wrap it in curly braces:

    echo "<th> {$tbl[$x][$y]} = $z </th>";

Since you did not do that, PHP tries to print the array $tbl[$x] as part of the string, and you cannot print an array with echo.
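
As a side note, if you just want to inspect the finished array while debugging, print_r (or var_dump) can display a nested array, whereas echo cannot. A small sketch with made-up values:

    <?php
    // echo cannot display an array, but print_r can.
    $tbl = array(
        array(1, 2, 3),
        array(2, 4, 6),
    );

    // echo $tbl;   // would only print the word "Array" and raise a notice
    print_r($tbl);  // prints the full nested structure with keys and values
    ?>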

But I do not see why you need to put both $tbl[$x][$y] and $z in the table cell, since they hold the same value, so you should just write:

    echo "<th> $z </th>";