Generative model
From Wikipedia, the free encyclopedia

In statistical classification, two main approaches are called the generative approach and the discriminative approach. These compute classifiers by different approaches, differing in the degree of statistical modelling. Terminology is inconsistent,[a] but three major types can be distinguished:[1]

  1. A generative model is a statistical model of the joint probability distribution P(X, Y) on a given observable variable X and target variable Y.[2] A generative model can be used to "generate" random instances (outcomes) of an observation x.[3]
  2. A discriminative model is a model of the conditional probability P(Y | X = x) of the target Y, given an observation x. It can be used to "discriminate" the value of the target variable Y, given an observation x.[4]
  3. Classifiers computed without using a probability model are also referred to loosely as "discriminative".

The distinction between these last two classes is not consistently made;[5] Jebara (2004) refers to these three classes as generative learning, conditional learning, and discriminative learning, but Ng & Jordan (2002) only distinguish two classes, calling them generative classifiers (joint distribution) and discriminative classifiers (conditional distribution or no distribution), not distinguishing between the latter two classes.[6] Analogously, a classifier based on a generative model is a generative classifier, while a classifier based on a discriminative model is a discriminative classifier, though this term also refers to classifiers that are not based on a model.

Standard examples of each, all of which are linear classifiers, are:

  • generative classifiers: the naive Bayes classifier and linear discriminant analysis
  • discriminative model: logistic regression

In application to classification, one wishes to go from an observation x to a label y (or probability distribution on labels). One can compute this directly, without using a probability distribution (distribution-free classifier); one can estimate the conditional probability of a label given an observation, P(Y | X = x) (discriminative model), and base classification on that; or one can estimate the joint distribution P(X, Y) (generative model), from that compute the conditional probability P(Y | X = x), and then base classification on that. These are increasingly indirect, but increasingly probabilistic, allowing more domain knowledge and probability theory to be applied. In practice different approaches are used, depending on the particular problem, and hybrids can combine the strengths of multiple approaches.
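A minimal sketch of the two probabilistic routes on hypothetical toy data (the generative route estimates a joint distribution and then conditions on the observation; the discriminative route estimates the conditional directly):

```python
from collections import Counter

# Hypothetical toy dataset of (observation, label) pairs.
data = [("red", "apple"), ("red", "apple"), ("green", "apple"), ("green", "pear")]
n = len(data)

# Generative route: estimate the joint P(X, Y) from frequencies, then condition on x.
joint = {pair: c / n for pair, c in Counter(data).items()}

def via_joint(y, x):
    # P(Y = y | X = x) = P(x, y) / P(x), with P(x) obtained by marginalizing the joint.
    p_x = sum(p for (xi, _), p in joint.items() if xi == x)
    return joint.get((x, y), 0.0) / p_x

# Discriminative route: estimate P(Y = y | X = x) directly from the labels seen at x.
def direct(y, x):
    seen = [yi for xi, yi in data if xi == x]
    return seen.count(y) / len(seen)

print(via_joint("pear", "green"), direct("pear", "green"))  # both 0.5
```

On this fully observed discrete dataset the two routes agree; they differ in what is modelled (and hence in what can be done besides classification, such as generating new samples from the joint).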

Definition


An alternative division defines these symmetrically as:

  • a generative model is a model of the conditional probability of the observable X, given a target y, symbolically P(X | Y = y);[3]
  • a discriminative model is a model of the conditional probability of the target Y, given an observation x, symbolically P(Y | X = x).[4]

Regardless of precise definition, the terminology is apt because a generative model can be used to "generate" random instances (outcomes), either of an observation and target (x, y), or of an observation x given a target value y,[3] while a discriminative model or discriminative classifier (without a model) can be used to "discriminate" the value of the target variable Y, given an observation x.[4] The difference between "discriminate" (distinguish) and "classify" is subtle, and the two are not consistently distinguished. (The term "discriminative classifier" becomes a pleonasm when "discrimination" is equivalent to "classification".)

The term "generative model" is also used to describe models that generate instances of output variables in a way that has no clear relationship to probability distributions over potential samples of input variables. Generative adversarial networks are examples of this class of generative models, and are judged primarily by the similarity of particular outputs to potential inputs. Such models are not classifiers.

Relationships between models


In application to classification, the observable X is frequently a continuous variable, the target Y is generally a discrete variable consisting of a finite set of labels, and the conditional probability P(Y | X) can also be interpreted as a (non-deterministic) target function f: X → Y, considering X as inputs and Y as outputs.

Given a finite set of labels, the two definitions of "generative model" are closely related. A model of the conditional distribution P(X | Y = y) is a model of the distribution of each label, and a model of the joint distribution is equivalent to a model of the distribution of the label values P(Y), together with the distribution of observations given a label, P(X | Y); symbolically, P(X, Y) = P(X | Y) P(Y). Thus, while a model of the joint probability distribution is more informative than a model of the class-conditional distributions alone (which omit the labels' relative frequencies), converting one into the other is a relatively small step, hence these are not always distinguished.

Given a model of the joint distribution, P(X, Y), the distribution of the individual variables can be computed as the marginal distributions P(X) = Σ_y P(X, Y = y) and P(Y) = ∫_x P(Y, X = x) dx (considering X as continuous, hence integrating over it, and Y as discrete, hence summing over it), and either conditional distribution can be computed from the definition of conditional probability: P(X | Y) = P(X, Y) / P(Y) and P(Y | X) = P(X, Y) / P(X).
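These marginalization and conditioning steps can be sketched for a fully discrete joint (the joint table here is hypothetical; for discrete Y the integral above becomes a sum as well):

```python
# Hypothetical joint distribution P(X, Y) over discrete X and Y; values sum to 1.
joint = {("sunny", "play"): 0.30, ("sunny", "stay"): 0.10,
         ("rainy", "play"): 0.15, ("rainy", "stay"): 0.45}

xs = {x for x, _ in joint}
ys = {y for _, y in joint}

# Marginals: sum the joint over the other variable.
p_x = {x: sum(joint[(x, y)] for y in ys) for x in xs}
p_y = {y: sum(joint[(x, y)] for x in xs) for y in ys}

# Conditionals, from the definition of conditional probability.
p_y_given_x = {(y, x): joint[(x, y)] / p_x[x] for x in xs for y in ys}
p_x_given_y = {(x, y): joint[(x, y)] / p_y[y] for x in xs for y in ys}

print(p_y_given_x[("play", "sunny")])  # 0.30 / 0.40 ≈ 0.75
```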

Given a model of one conditional probability, and estimated probability distributions for the variables X and Y, denoted P(X) and P(Y), one can estimate the opposite conditional probability using Bayes' rule:

  P(X | Y) P(Y) = P(Y | X) P(X).

For example, given a generative model for P(X | Y), one can estimate:

  P(Y | X) = P(X | Y) P(Y) / P(X),

and given a discriminative model for P(Y | X), one can estimate:

  P(X | Y) = P(Y | X) P(X) / P(Y).
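As an illustration of this Bayes'-rule conversion, with a hypothetical generative model (the priors and class-conditionals below are made-up numbers, not from the source):

```python
# Hypothetical generative model: class priors P(Y) and class-conditionals P(X | Y).
p_y = {"spam": 0.25, "ham": 0.75}
p_x_given_y = {"spam": {"offer": 0.8, "hello": 0.2},
               "ham": {"offer": 0.1, "hello": 0.9}}

def p_y_given_x(y, x):
    # Bayes' rule: P(Y|X) = P(X|Y) P(Y) / P(X),
    # with the evidence P(X) = sum over y of P(X | Y = y) P(Y = y).
    p_x = sum(p_x_given_y[yy][x] * p_y[yy] for yy in p_y)
    return p_x_given_y[y][x] * p_y[y] / p_x

print(p_y_given_x("spam", "offer"))  # 0.2 / 0.275 ≈ 0.727
```

A generative classifier would simply pick the label y maximizing this posterior (equivalently, maximizing P(X | Y = y) P(Y = y), since the evidence P(X) is the same for all labels).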

Note that Bayes' rule (computing one conditional probability in terms of the other) and the definition of conditional probability (computing conditional probability in terms of the joint distribution) are frequently conflated as well.

Contrast with discriminative classifiers


A generative algorithm models how the data was generated in order to categorize a signal. It asks the question: based on my generation assumptions, which category is most likely to have generated this signal? A discriminative algorithm does not care about how the data was generated; it simply categorizes a given signal. Thus, discriminative algorithms try to learn P(Y | X) directly from the data and then classify the data. On the other hand, generative algorithms try to learn P(X, Y), which can be transformed into P(Y | X) later to classify the data. One advantage of generative algorithms is that P(X, Y) can be used to generate new data similar to the existing data. On the other hand, some discriminative algorithms have been shown to give better performance than some generative algorithms on classification tasks.[7]

Although discriminative models do not need to model the distribution of the observed variables, they cannot generally express complex relationships between the observed and target variables. Nor do they necessarily perform better than generative models at classification and regression tasks. The two classes are seen as complementary or as different views of the same procedure.[8]

Deep generative models


With the rise of deep learning, a new family of methods called deep generative models (DGMs)[9][10] has formed through the combination of generative models and deep neural networks. An increase in the scale of the neural networks is typically accompanied by an increase in the scale of the training data, both of which are required for good performance.[11]

Popular DGMs include variational autoencoders (VAEs), generative adversarial networks (GANs), and auto-regressive models. Recently, there has been a trend to build very large deep generative models.[9] For example, GPT-3 and its precursor GPT-2[12] are auto-regressive neural language models containing billions of parameters; BigGAN[13] and VQ-VAE[14], used for image generation, can have hundreds of millions of parameters; and Jukebox is a very large generative model for musical audio containing billions of parameters.[15]

Types


Generative models


Types of generative models include the naive Bayes classifier, Gaussian mixture models, hidden Markov models, Bayesian networks, variational autoencoders, and generative adversarial networks.

If the observed data are truly sampled from the generative model, then fitting the parameters of the generative model to maximize the data likelihood is a common method. However, since most statistical models are only approximations to the true distribution, if the model's application is to infer about a subset of variables conditional on known values of others, then it can be argued that the approximation makes more assumptions than are necessary to solve the problem at hand. In such cases, it can be more accurate to model the conditional density functions directly using a discriminative model (see below), although application-specific details will ultimately dictate which approach is most suitable in any particular case.
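The maximum-likelihood fitting described above can be sketched for a very simple generative family (Bernoulli class-conditionals with a categorical class prior; all numbers are hypothetical), where the likelihood-maximizing parameters reduce to empirical frequencies:

```python
# Maximum-likelihood fit of a simple generative model: class prior P(Y) and a
# per-class Bernoulli P(X = 1 | Y). For this family, the parameters that
# maximize the data likelihood are just the empirical frequencies. Toy data.
data = [(1, "a"), (0, "a"), (1, "a"), (1, "b"), (0, "b"), (0, "b")]

labels = {y for _, y in data}
prior = {y: sum(1 for _, yy in data if yy == y) / len(data) for y in labels}
p_x1_given_y = {
    y: sum(x for x, yy in data if yy == y) / sum(1 for _, yy in data if yy == y)
    for y in labels
}
# prior == {"a": 0.5, "b": 0.5}; p_x1_given_y["a"] == 2/3, p_x1_given_y["b"] == 1/3
```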

Discriminative models

Types of discriminative models include logistic regression, support vector machines, conditional random fields, and decision trees.

Examples


Simple example


Suppose the input data is x ∈ {1, 2}, the set of labels for x is y ∈ {0, 1}, and there are the following 4 data points:

  (x, y) ∈ {(1, 0), (1, 0), (2, 0), (2, 1)}

For the above data, estimating the joint probability distribution p(x, y) from the empirical measure will be the following:

  p(1, 0) = 1/2,  p(1, 1) = 0,  p(2, 0) = 1/4,  p(2, 1) = 1/4,

while p(y | x) will be the following:

  p(0 | 1) = 1,  p(1 | 1) = 0,  p(0 | 2) = 1/2,  p(1 | 2) = 1/2.
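This empirical estimation can be sketched in a few lines (assuming the four data points (1, 0), (1, 0), (2, 0), (2, 1)):

```python
from collections import Counter

# The four data points assumed above (toy data).
data = [(1, 0), (1, 0), (2, 0), (2, 1)]
n = len(data)
counts = Counter(data)

# Empirical joint distribution p(x, y).
joint = {(x, y): counts[(x, y)] / n for x in (1, 2) for y in (0, 1)}

# Marginal p(x), then conditional p(y | x) = p(x, y) / p(x).
p_x = {x: joint[(x, 0)] + joint[(x, 1)] for x in (1, 2)}
cond = {(y, x): joint[(x, y)] / p_x[x] for x in (1, 2) for y in (0, 1)}

print(joint)  # {(1, 0): 0.5, (1, 1): 0.0, (2, 0): 0.25, (2, 1): 0.25}
print(cond)   # {(0, 1): 1.0, (1, 1): 0.0, (0, 2): 0.5, (1, 2): 0.5}
```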

Text generation


Shannon (1948) gives an example in which a table of frequencies of English word pairs is used to generate a sentence beginning with "representing and speedily is an good"; which is not proper English but which will increasingly approximate it as the table is moved from word pairs to word triplets etc.
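Shannon's word-pair construction can be sketched as a simple bigram sampler (the corpus here is a toy stand-in, not Shannon's actual frequency table):

```python
import random
from collections import defaultdict

# Toy corpus standing in for a table of English word-pair frequencies.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

# Record, for each word, the words observed to follow it (bigram table).
followers = defaultdict(list)
for a, b in zip(corpus, corpus[1:]):
    followers[a].append(b)

# Generate by repeatedly sampling a successor of the current word;
# fall back to a uniform corpus draw if a word has no recorded successor.
random.seed(0)
word = "the"
out = [word]
for _ in range(8):
    word = random.choice(followers[word]) if followers[word] else random.choice(corpus)
    out.append(word)
print(" ".join(out))
```

As with Shannon's examples, the output is locally plausible word pairs rather than proper English; moving from pairs to triplets (trigrams) would tighten the approximation.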

See also


Notes

  1. ^ Three leading sources, Ng & Jordan 2002, Jebara 2004, and Mitchell 2015, give different divisions and definitions.

References

  1. ^ Jebara, Tony (2004). Machine Learning: Discriminative and Generative. The Springer International Series in Engineering and Computer Science. Kluwer Academic (Springer). ISBN 978-1-4020-7647-3.
  2. ^ Ng & Jordan (2002): "Generative classifiers learn a model of the joint probability, p(x, y), of the inputs x and the label y, and make their predictions by using Bayes rules to calculate p(y | x), and then picking the most likely label y."
  3. ^ a b c Mitchell 2015: "We can use Bayes rule as the basis for designing learning algorithms (function approximators), as follows: Given that we wish to learn some target function f: X → Y, or equivalently, P(Y | X), we use the training data to learn estimates of P(X | Y) and P(Y). New X examples can then be classified using these estimated probability distributions, plus Bayes rule. This type of classifier is called a generative classifier, because we can view the distribution P(X | Y) as describing how to generate random instances X conditioned on the target attribute Y."
  4. ^ a b c Mitchell 2015: "Logistic Regression is a function approximation algorithm that uses training data to directly estimate P(Y | X), in contrast to Naive Bayes. In this sense, Logistic Regression is often referred to as a discriminative classifier because we can view the distribution P(Y | X) as directly discriminating the value of the target value Y for any given instance X."
  5. ^ Jebara 2004, 2.4 Discriminative Learning: "This distinction between conditional learning and discriminative learning is not currently a well-established convention in the field."
  6. ^ Ng & Jordan 2002: "Discriminative classifiers model the posterior p(y | x) directly, or learn a direct map from inputs x to the class labels."
  7. ^ Ng & Jordan 2002
  8. ^ Bishop, C. M.; Lasserre, J. (24 September 2007), "Generative or Discriminative? getting the best of both worlds", in Bernardo, J. M. (ed.), Bayesian statistics 8: proceedings of the eighth Valencia International Meeting, June 2-6, 2006, Oxford University Press, pp. 3–23, ISBN 978-0-19-921465-5
  9. ^ a b "Scaling up—researchers advance large-scale deep generative models". Microsoft. April 9, 2020.
  10. ^ "Generative Models". OpenAI. June 16, 2016.
  11. ^ Kaplan, Jared; McCandlish, Sam; Henighan, Tom; Brown, Tom B.; Chess, Benjamin; Child, Rewon; Gray, Scott; Radford, Alec; Wu, Jeffrey; Amodei, Dario (2020). "Scaling Laws for Neural Language Models". arXiv:2001.08361 [stat.ML].
  12. ^ "Better Language Models and Their Implications". OpenAI. February 14, 2019.
  13. ^ Brock, Andrew; Donahue, Jeff; Simonyan, Karen (2018). "Large Scale GAN Training for High Fidelity Natural Image Synthesis". arXiv:1809.11096 [cs.LG].
  14. ^ Razavi, Ali; van den Oord, Aaron; Vinyals, Oriol (2019). "Generating Diverse High-Fidelity Images with VQ-VAE-2". arXiv:1906.00446 [cs.LG].
  15. ^ "Jukebox". OpenAI. April 30, 2020.