This post walks through the SemanticKITTI point cloud segmentation dataset with Open3D-ML: downloading the data, reading and visualizing it, and stepping through the loader source code (point clouds, labels, configuration, test, and train).

Contents

1. Download the dataset

2. Read and visualize

3. Source code walkthrough

3.1 Reading point cloud data (.bin format)

3.2 Reading label data (.label files)

3.3 Reading the configuration

3.4 Test

3.5 Train


1. Download the dataset

We use SemanticKITTI as the example. Download link: http://semantic-kitti.org/dataset.html#download

Download the three archives from that page (velodyne point clouds, labels, and calibration files).

Unzip them in the same directory:

unzip data_odometry_labels.zip
unzip data_odometry_velodyne.zip
unzip data_odometry_calib.zip

Directory layout after extraction:
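The exact tree depends on the archives, but a typical SemanticKITTI layout looks roughly like this (a sketch; sequence 00 shown, the remaining sequences follow the same pattern, and the root below is assumed to be the dataset_path used later in this post):

kitti/                      # dataset_path passed to Open3D-ML below
└── dataset/
    └── sequences/
        ├── 00/
        │   ├── velodyne/   # *.bin point cloud scans
        │   ├── labels/     # *.label per-point annotations
        │   ├── calib.txt
        │   └── times.txt
        ├── 01/
        └── ...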

2. Read and visualize

import open3d.ml.torch as ml3d  # or open3d.ml.tf as ml3d

# construct a dataset by specifying dataset_path
dataset = ml3d.datasets.SemanticKITTI(dataset_path='/home/zxq/data/kitti')

# get the 'all' split that combines training, validation and test set
all_split = dataset.get_split('all')

# print the attributes of the first datum
print(all_split.get_attr(0))

# print the shape of the first point cloud
print(all_split.get_data(0)['point'].shape)

# show the first 100 frames using the visualizer
vis = ml3d.vis.Visualizer()
vis.visualize_dataset(dataset, 'all', indices=range(100))

Point cloud segmentation dataset: SemanticKITTI

3. Source code walkthrough

3.1 Reading point cloud data (.bin format)

Both the point cloud scans and the label data in SemanticKITTI are stored as binary files.

datasets/utils/dataprocessing.py

    @staticmethod
    def load_pc_kitti(pc_path):  # "./000000.bin"
        scan = np.fromfile(pc_path, dtype=np.float32)  # (num_pt * 4,)
        scan = scan.reshape((-1, 4))  # (num_pt, 4)
        # points = scan[:, 0:3]  # get xyz
        points = scan
        return points
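Each scan stores one float32 quadruple per point (x, y, z, remission/intensity), so a frame can also be inspected directly with NumPy. A minimal sketch; the file path below is illustrative:

import numpy as np

pc_path = "/home/zxq/data/kitti/dataset/sequences/00/velodyne/000000.bin"  # illustrative path
scan = np.fromfile(pc_path, dtype=np.float32).reshape(-1, 4)  # columns: x, y, z, intensity
print(scan.shape)  # (num_points, 4)
print(scan[:3])    # first three points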

3.2 Reading label data (.label files)

    def load_label_kitti(label_path, remap_lut):
        label = np.fromfile(label_path, dtype=np.uint32)
        label = label.reshape((-1))
        sem_label = label & 0xFFFF  # semantic label in lower half
        inst_label = label >> 16  # instance id in upper half
        assert ((sem_label + (inst_label << 16) == label).all())
        sem_label = remap_lut[sem_label]
        return sem_label.astype(np.int32)
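Each .label entry is a single uint32 that packs the instance id into the upper 16 bits and the semantic class id into the lower 16 bits; remap_lut then maps the raw SemanticKITTI class ids onto contiguous training ids. A minimal sketch of the bit layout (the packed value is illustrative):

import numpy as np

label = np.array([(5 << 16) | 48], dtype=np.uint32)  # illustrative: instance 5, raw class 48
sem_label = label & 0xFFFF   # -> array([48]): semantic class from the lower 16 bits
inst_label = label >> 16     # -> array([5]): instance id from the upper 16 bits
assert (sem_label + (inst_label << 16) == label).all()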

3.3 Reading the configuration

The model, dataset, and pipeline configurations are all stored in ml3d/configs/*.yml files. They are loaded as follows:

import open3d.ml as _ml3d
import open3d.ml.torch as ml3d  # or open3d.ml.tf as ml3d

framework = "torch"  # or "tf"
cfg_file = "ml3d/configs/randlanet_semantickitti.yml"
cfg = _ml3d.utils.Config.load_from_file(cfg_file)

# fetch the classes by name
Pipeline = _ml3d.utils.get_module("pipeline", cfg.pipeline.name, framework)
Model = _ml3d.utils.get_module("model", cfg.model.name, framework)
Dataset = _ml3d.utils.get_module("dataset", cfg.dataset.name)

# use the arguments in the config file to construct the instances
cfg.dataset['dataset_path'] = "/home/zxq/data/kitti"
dataset = Dataset(cfg.dataset.pop('dataset_path', None), **cfg.dataset)
model = Model(**cfg.model)
pipeline = Pipeline(model, dataset, **cfg.pipeline)
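The loaded cfg exposes its groups both as attributes (cfg.pipeline.name above) and as dictionaries (cfg.dataset['dataset_path'] above), so individual fields can be inspected or overridden before the objects are constructed. A small sketch; the override value is illustrative:

print(cfg.model.name)    # "RandLANet"
print(cfg.dataset.name)  # "SemanticKITTI"

cfg.pipeline['max_epoch'] = 10  # illustrative override; set before Pipeline(...) is constructed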

3.4 Test

import os
import open3d.ml as _ml3d
import open3d.ml.torch as ml3d

cfg_file = "ml3d/configs/randlanet_semantickitti.yml"
cfg = _ml3d.utils.Config.load_from_file(cfg_file)

model = ml3d.models.RandLANet(**cfg.model)
cfg.dataset['dataset_path'] = "/home/zxq/data/kitti"
dataset = ml3d.datasets.SemanticKITTI(cfg.dataset.pop('dataset_path', None), **cfg.dataset)
pipeline = ml3d.pipelines.SemanticSegmentation(model, dataset=dataset, device="gpu", **cfg.pipeline)

# download the weights.
ckpt_folder = "./logs/"
os.makedirs(ckpt_folder, exist_ok=True)
ckpt_path = ckpt_folder + "randlanet_semantickitti_202201071330utc.pth"
randlanet_url = "https://storage.googleapis.com/open3d-releases/model-zoo/randlanet_semantickitti_202201071330utc.pth"
if not os.path.exists(ckpt_path):
    cmd = "wget {} -O {}".format(randlanet_url, ckpt_path)
    os.system(cmd)

# load the parameters.
pipeline.load_ckpt(ckpt_path=ckpt_path)

test_split = dataset.get_split("test")

# run inference on a single example.
# returns dict with 'predict_labels' and 'predict_scores'.
data = test_split.get_data(0)
result = pipeline.run_inference(data)

# evaluate performance on the test set; this will write logs to './logs'.
pipeline.run_test()
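The weight download above shells out to wget; if wget is not available, the standard library can fetch the same file. A sketch using the same URL and target path as above:

import urllib.request

if not os.path.exists(ckpt_path):
    urllib.request.urlretrieve(randlanet_url, ckpt_path)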

3.5 Train

import open3d.ml.torch as ml3d
from ml3d.torch import RandLANet, SemanticSegmentation

# use a cache for storing the results of the preprocessing (default path is './logs/cache')
dataset = ml3d.datasets.SemanticKITTI(dataset_path='/home/zxq/data/kitti/', use_cache=True)

# create the model with random initialization.
model = RandLANet()

pipeline = SemanticSegmentation(model=model, dataset=dataset, max_epoch=100)

# prints training progress in the console.
pipeline.run_train()
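run_train() writes logs and checkpoints under './logs' by default, and a saved checkpoint can later be restored and evaluated exactly as in section 3.4. A sketch with a hypothetical checkpoint path:

pipeline.load_ckpt(ckpt_path='./logs/RandLANet_SemanticKITTI_torch/checkpoint/ckpt_00100.pth')  # hypothetical path
pipeline.run_test()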

