
Topological deep learning (TDL)[1][2][3][4][5][6] is a research field that extends deep learning to handle complex, non-Euclidean data structures. Traditional deep learning models, such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs), excel at processing data on regular grids and sequences. However, scientific and real-world data often live on more intricate domains, including point clouds, meshes, time series, scalar fields, graphs, or general topological spaces like simplicial complexes and CW complexes.[7] TDL addresses this by incorporating topological concepts to process data with higher-order relationships, such as interactions among multiple entities and complex hierarchies. This approach leverages structures like simplicial complexes and hypergraphs to capture global dependencies and qualitative spatial properties, offering a more nuanced representation of data. TDL also encompasses methods from computational and algebraic topology that permit studying properties of neural networks and their training process, such as their predictive performance or generalization properties.[8][9][10][11][12][13][14] The mathematical foundations of TDL are algebraic topology, differential topology, and geometric topology. TDL can therefore be generalized to data supported on differentiable manifolds, knots, links, tangles, and curves.

History and motivation


Traditional techniques from deep learning often operate under the assumption that a dataset resides in a highly structured space (like images, where convolutional neural networks exhibit outstanding performance over alternative methods) or a Euclidean space. The prevalence of new types of data, in particular graphs, meshes, and molecules, resulted in the development of new techniques, culminating in the field of geometric deep learning, which originally proposed a signal-processing perspective for treating such data types.[15] While originally confined to graphs, where connectivity is defined based on nodes and edges, follow-up work extended these concepts to a larger variety of data types, including simplicial complexes[16][3] and CW complexes,[8][17] with recent work proposing a unified perspective of message passing on general combinatorial complexes.[1]

An independent perspective on different types of data originated from topological data analysis, which proposed a new framework for describing structural information of data, i.e., their "shape," that is inherently aware of multiple scales in data, ranging from local to global information.[18] While at first restricted to smaller datasets, subsequent work developed new descriptors that efficiently summarized topological information of datasets to make them available for traditional machine-learning techniques, such as support vector machines or random forests. Such descriptors ranged from new feature-engineering techniques and new ways of providing suitable coordinates for topological descriptors[19][20][21] to more efficient dissimilarity measures.[22][23][24][25]

Contemporary research in this field is largely concerned with either integrating information about the underlying data topology into existing deep-learning models or obtaining novel ways of training on topological domains.

Learning on topological spaces

Learning tasks on topological domains can be broadly classified into three categories: cell classification, cell prediction, and complex classification.[1]

Focusing on topology in the sense of point set topology, an active branch of TDL is concerned with learning on topological spaces, that is, on different topological domains.

An introduction to topological domains


One of the core concepts in topological deep learning is the domain upon which data is defined and supported. In the case of Euclidean data, such as images, this domain is a grid, upon which the pixel values of the image are supported. In a more general setting, this domain might be a topological domain. Next, we introduce the most common topological domains encountered in a deep learning setting. These domains include, but are not limited to, graphs, simplicial complexes, cell complexes, combinatorial complexes, and hypergraphs.

Given a finite set S of abstract entities, a neighborhood function on S is an assignment that attaches to every point in S a subset of S or a relation. Such a function can be induced by equipping S with an auxiliary structure. Edges provide one way of defining relations among the entities of S. More specifically, edges in a graph allow one to define the notion of neighborhood using, for instance, the one-hop neighborhood notion. Edges, however, are limited in their modeling capacity, as they can only model binary relations among entities of S, since every edge typically connects two entities. In many applications, it is desirable to permit relations that incorporate more than two entities. The idea of using relations that involve more than two entities is central to topological domains. Such higher-order relations allow a broader range of neighborhood functions to be defined on S, capturing multi-way interactions among entities of S.
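
As an illustration, the following minimal sketch (not drawn from the cited literature; the entity labels and container choices are purely illustrative) contrasts the one-hop neighborhood induced by edges with a neighborhood induced by hyperedges, which may involve more than two entities:

```python
# A minimal sketch of neighborhood functions on a finite set S of entities.
S = {0, 1, 2, 3, 4}

# Binary relations: edges of a graph on S.
edges = {(0, 1), (1, 2), (2, 3)}

def one_hop_neighborhood(v):
    """One-hop neighborhood induced by the edge set."""
    return {u for e in edges if v in e for u in e if u != v}

# Higher-order relations: hyperedges may involve more than two entities.
hyperedges = [{0, 1, 2}, {2, 3, 4}]

def hyperedge_neighborhood(v):
    """Neighborhood induced by all hyperedges containing v."""
    return {u for h in hyperedges if v in h for u in h if u != v}

print(one_hop_neighborhood(2))    # {1, 3}
print(hyperedge_neighborhood(2))  # {0, 1, 3, 4}
```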

Next, we review the main properties, advantages, and disadvantages of some commonly studied topological domains in the context of deep learning, including (abstract) simplicial complexes, regular cell complexes, hypergraphs, and combinatorial complexes.

(a): A set S is made up of basic parts (vertices) without any connections. (b): A graph represents simple connections between its parts (vertices), which are elements of S. (c): A simplicial complex shows how parts (relations) are connected to each other, but with strict rules about how they are connected. (d): Like simplicial complexes, a cell complex shows how parts (relations) are connected, but it is more flexible in how they are shaped (like "cells"). (e): A hypergraph shows any kind of connections between parts of S, but these connections are not organized in any particular order. (f): A combinatorial complex (CC) mixes elements from cell complexes (connections with order) and hypergraphs (varied connections), covering both kinds of setups.[1]

Comparisons among topological domains


Each of the enumerated topological domains has its own characteristics, advantages, and limitations:

  • Simplicial complexes
    • Simplest form of higher-order domains.
    • Extensions of graph-based models.
    • Admit hierarchical structures, making them suitable for various applications.
    • Hodge theory can be naturally defined on simplicial complexes.
    • Require every subset of a relation to also be a relation, imposing constraints on the structure.
  • Cell complexes
    • Generalize simplicial complexes.
    • Provide more flexibility in defining higher-order relations.
    • Each cell in a cell complex is homeomorphic to an open ball, attached together via attaching maps.
    • Boundary cells of each cell in a cell complex are also cells in the complex.
    • Represented combinatorially via incidence matrices (see the sketch after this list).
  • Hypergraphs
    • Allow arbitrary set-type relations among entities.
    • Relations are not imposed by other relations, providing more flexibility.
    • Do not explicitly encode the dimension of cells or relations.
    • Useful when relations in the data do not adhere to constraints imposed by other models like simplicial and cell complexes.
  • Combinatorial complexes[1]:
    • Generalize and bridge the gaps between simplicial complexes, cell complexes, and hypergraphs.
    • Allow for hierarchical structures and set-type relations.
    • Combine features of other complexes while providing more flexibility in modeling relations.
    • Can be represented combinatorially, similar to cell complexes.
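
As a concrete illustration of the combinatorial representation mentioned above, the following sketch (a generic textbook construction, not a specific library API) builds the vertex-edge incidence matrix of a filled triangle and checks the standard identity satisfied by signed boundary matrices:

```python
import numpy as np

# Vertex-edge incidence matrix B1 of a filled triangle:
# entry (i, j) is 1 iff vertex i lies on the boundary of edge j.
vertices = [0, 1, 2]
edges = [(0, 1), (0, 2), (1, 2)]
B1 = np.zeros((len(vertices), len(edges)), dtype=int)
for j, (u, v) in enumerate(edges):
    B1[u, j] = B1[v, j] = 1
print(B1)  # [[1 1 0], [1 0 1], [0 1 1]]

# With a choice of orientation, the signed boundary matrices compose to
# zero (B1 @ B2 = 0), the identity underlying Hodge-theoretic constructions.
B1_signed = np.array([[-1, -1,  0],
                      [ 1,  0, -1],
                      [ 0,  1,  1]])
B2_signed = np.array([[1], [-1], [1]])  # boundary of the 2-cell {0, 1, 2}
assert not np.any(B1_signed @ B2_signed)
```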

Hierarchical structure and set-type relations


The properties of simplicial complexes, cell complexes, and hypergraphs give rise to two main features of relations on higher-order domains, namely hierarchies of relations and set-type relations.[1]

Rank function

A rank function on a higher-order domain X is an order-preserving function rk: X → Z, where rk(x) attaches a non-negative integer value to each relation x in X, preserving set inclusion in X. Cell and simplicial complexes are common examples of higher-order domains equipped with rank functions and therefore with hierarchies of relations.[1]
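
For instance, on an abstract simplicial complex the dimension map rk(x) = |x| − 1 is a rank function. The following minimal sketch (representing simplices as frozensets, an illustrative choice) verifies that set inclusion never decreases rank:

```python
# All faces of a filled triangle, as an abstract simplicial complex.
complex_ = [frozenset(s) for s in
            [{0}, {1}, {2}, {0, 1}, {0, 2}, {1, 2}, {0, 1, 2}]]

def rk(x):
    """Rank of a simplex: its dimension, i.e. cardinality minus one."""
    return len(x) - 1

# Order preservation: x being a subset of y implies rk(x) <= rk(y).
assert all(rk(x) <= rk(y) for x in complex_ for y in complex_ if x <= y)
print({tuple(sorted(x)): rk(x) for x in complex_})
```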

Set-type relations

Relations in a higher-order domain are called set-type relations if the existence of a relation is not implied by another relation in the domain. Hypergraphs constitute examples of higher-order domains equipped with set-type relations. Given the modeling limitations of simplicial complexes, cell complexes, and hypergraphs, the combinatorial complex was developed as a higher-order domain that features both hierarchies of relations and set-type relations.[1]
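
The distinction can be made concrete with a small sketch: a simplicial complex must be downward closed (every nonempty subset of a relation is itself a relation), while set-type relations carry no such requirement. The helper below is illustrative, not taken from a published implementation:

```python
from itertools import combinations

def is_downward_closed(relations):
    """True iff every nonempty proper subset of a relation is a relation."""
    rels = {frozenset(r) for r in relations}
    return all(frozenset(s) in rels
               for r in rels if len(r) > 1
               for k in range(1, len(r))
               for s in combinations(r, k))

simplicial = [{0}, {1}, {2}, {0, 1}, {0, 2}, {1, 2}, {0, 1, 2}]
hypergraph = [{0, 1, 2}, {2, 3, 4}]  # e.g. {0, 1} is absent

print(is_downward_closed(simplicial))  # True
print(is_downward_closed(hypergraph))  # False: {0,1,2} does not imply {0,1}
```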

The learning tasks in TDL can be broadly classified into three categories:[1]

  • Cell classification: Predict targets for each cell in a complex. Examples include triangular mesh segmentation, where the task is to predict the class of each face or edge in a given mesh.
  • Complex classification: Predict targets for an entire complex. For example, predict the class of each input mesh.
  • Cell prediction: Predict properties of cell-cell interactions in a complex, and in some cases, predict whether a cell exists in the complex. An example is the prediction of linkages among entities in hyperedges of a hypergraph.

In practice, to perform the aforementioned tasks, deep learning models designed for specific topological spaces must be constructed and implemented. These models, known as topological neural networks, are tailored to operate effectively within these spaces.

Topological neural networks


Central to TDL are topological neural networks (TNNs), specialized architectures designed to operate on data structured in topological domains.[2][1] Unlike traditional neural networks tailored for grid-like structures, TNNs are adept at handling more intricate data representations, such as graphs, simplicial complexes, and cell complexes. By harnessing the inherent topology of the data, TNNs can capture both local and global relationships, enabling nuanced analysis and interpretation.

Message passing topological neural networks


In a general topological domain, higher-order message passing involves exchanging messages among entities and cells using a set of neighborhood functions.

Definition: Higher-Order Message Passing on a General Topological Domain

Higher-order message passing is a deep learning model defined on a topological domain that relies on exchanging messages among entities in the underlying domain in order to perform a learning task.[1]

Let $\mathcal{X}$ be a topological domain. We define a set of neighborhood functions $\mathcal{N} = \{\mathcal{N}_1, \ldots, \mathcal{N}_n\}$ on $\mathcal{X}$. Consider a cell $x$ and let $y \in \mathcal{N}_k(x)$ for some $\mathcal{N}_k \in \mathcal{N}$. A message $m_{x,y}$ between cells $x$ and $y$ is a computation dependent on these two cells or the data supported on them. Denote by $\mathcal{N}(x)$ the multi-set $\{\!\!\{\mathcal{N}_1(x), \ldots, \mathcal{N}_n(x)\}\!\!\}$, and let $\mathbf{h}_x^{(l)}$ represent some data supported on cell $x$ at layer $l$. Higher-order message passing on $\mathcal{X}$,[1][8] induced by $\mathcal{N}$, is defined by the following four update rules:

  1. $m_{x,y} = \alpha_{\mathcal{N}_k}(\mathbf{h}_x^{(l)}, \mathbf{h}_y^{(l)})$, for all $y \in \mathcal{N}_k(x)$.
  2. $m_x^k = \bigoplus_{y \in \mathcal{N}_k(x)} m_{x,y}$, where $\bigoplus$ is the intra-neighborhood aggregation function.
  3. $m_x = \bigotimes_{\mathcal{N}_k \in \mathcal{N}(x)} m_x^k$, where $\bigotimes$ is the inter-neighborhood aggregation function.
  4. $\mathbf{h}_x^{(l+1)} = \beta(\mathbf{h}_x^{(l)}, m_x)$, where $\alpha_{\mathcal{N}_k}$ and $\beta$ are differentiable functions.

Some remarks on the definition above are as follows.

First, Equation 1 describes how messages are computed between cells $x$ and $y$. The message $m_{x,y}$ is influenced by both the data $\mathbf{h}_x^{(l)}$ and $\mathbf{h}_y^{(l)}$ associated with cells $x$ and $y$, respectively. Additionally, it incorporates characteristics specific to the cells themselves, such as orientation in the case of cell complexes. This allows for a richer representation of spatial relationships compared to traditional graph-based message-passing frameworks.

Second, Equation 2 defines how messages from neighboring cells are aggregated within each neighborhood. The function $\bigoplus$ aggregates these messages, allowing information to be exchanged effectively between adjacent cells within the same neighborhood.

Third, Equation 3 outlines the process of combining messages from different neighborhoods. The function $\bigotimes$ aggregates messages across various neighborhoods, facilitating communication between cells that may not be directly connected but share common neighborhood relationships.

Fourth, Equation 4 specifies how the aggregated messages influence the state of a cell in the next layer. Here, the function $\beta$ updates the state of cell $x$ based on its current state $\mathbf{h}_x^{(l)}$ and the aggregated message $m_x$ obtained from neighboring cells.
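
A minimal NumPy sketch of these four rules follows; the sum aggregations, linear message functions, and ReLU update are illustrative stand-ins for the generic $\bigoplus$, $\bigotimes$, $\alpha_{\mathcal{N}_k}$, and $\beta$, and do not correspond to any specific published architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, d = 5, 4
h = rng.normal(size=(n_cells, d))  # h_x^(l): features on each cell

# Two neighborhood functions (e.g. adjacency and coboundary), each given
# as {cell: list of neighboring cells}.
neighborhoods = [
    {0: [1, 2], 1: [0], 2: [0], 3: [4], 4: [3]},
    {0: [3], 1: [2, 4], 2: [1], 3: [0], 4: [1]},
]
W = [rng.normal(size=(d, d)) for _ in neighborhoods]  # alpha parameters

h_next = np.empty_like(h)
for x in range(n_cells):
    per_neighborhood = []
    for N_k, W_k in zip(neighborhoods, W):
        # Rule 1: messages m_{x,y}; here a linear function of h_y.
        msgs = [h[y] @ W_k for y in N_k[x]]
        # Rule 2: intra-neighborhood aggregation (sum over y in N_k(x)).
        per_neighborhood.append(np.sum(msgs, axis=0) if msgs else np.zeros(d))
    # Rule 3: inter-neighborhood aggregation (sum over N_k in N(x)).
    m_x = np.sum(per_neighborhood, axis=0)
    # Rule 4: update h_x^(l+1) = beta(h_x^(l), m_x); here ReLU of their sum.
    h_next[x] = np.maximum(h[x] + m_x, 0.0)

print(h_next.shape)  # (5, 4)
```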

Non-message passing topological neural networks


While the majority of TNNs follow the message-passing paradigm from graph learning, several models have been suggested that do not follow this approach. For instance, Maggs et al.[26] leverage geometric information from embedded simplicial complexes, i.e., simplicial complexes with high-dimensional features attached to their vertices. This offers interpretability and geometric consistency without relying on message passing. Furthermore, Ramamurthy et al.[27] suggested a contrastive loss-based method to learn the simplicial representation.

Learning on topological descriptors


Motivated by the modular nature of deep neural networks, initial work in TDL drew inspiration from topological data analysis and aimed to make the resulting descriptors amenable to integration into deep-learning models. This led to work defining new layers for deep neural networks. Pioneering work by Hofer et al.,[28] for instance, introduced a layer that permitted topological descriptors like persistence diagrams or persistence barcodes to be integrated into a deep neural network. This was achieved by means of end-to-end-trainable projection functions, permitting topological features to be used to solve shape classification tasks, for instance. Follow-up work expanded on the theoretical properties of such descriptors and integrated them into the field of representation learning.[29] Other such topological layers include layers based on extended persistent homology descriptors,[30] persistence landscapes,[31] or coordinate functions.[32] In parallel, persistent homology also found applications in graph-learning tasks. Noteworthy examples include new algorithms for learning task-specific filtration functions for graph classification or node classification tasks.[33][34][35]
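
The following sketch conveys the general idea behind such trainable vectorizations; it is not the exact layer of Hofer et al., and the class name and parameterization are illustrative. Each point of a persistence diagram is passed through learnable Gaussian-like structure elements, and the responses are summed, yielding a fixed-size, permutation-invariant, end-to-end differentiable embedding:

```python
import torch

class PersistenceEmbedding(torch.nn.Module):
    """Maps a variable-size persistence diagram to a fixed-size vector."""

    def __init__(self, n_elements=8):
        super().__init__()
        self.centers = torch.nn.Parameter(torch.randn(n_elements, 2))
        self.log_scales = torch.nn.Parameter(torch.zeros(n_elements))

    def forward(self, diagram):  # diagram: (n_points, 2) of (birth, death)
        # Squared distance of every diagram point to every learnable center.
        diff = diagram.unsqueeze(1) - self.centers.unsqueeze(0)  # (n, k, 2)
        d2 = (diff ** 2).sum(-1)                                 # (n, k)
        # Gaussian-like response of each structure element, summed over points.
        return torch.exp(-d2 * torch.exp(self.log_scales)).sum(0)  # (k,)

layer = PersistenceEmbedding()
diagram = torch.tensor([[0.0, 1.0], [0.2, 0.9], [0.5, 0.6]])
vec = layer(diagram)   # fixed-size embedding; feed it to a classifier head
vec.sum().backward()   # gradients flow to the layer's parameters
print(vec.shape)       # torch.Size([8])
```

Because the final summation runs over diagram points, the embedding is invariant to their order, which matches the set-like nature of persistence diagrams.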

Applications


TDL is rapidly finding new applications across different domains, including data compression,[36] enhancing the expressivity and predictive performance of graph neural networks,[16][17][33] action recognition,[37] and trajectory prediction.[38]

References

  1. ^ a b c d e f g h i j k l Hajij, M.; Zamzmi, G.; Papamarkou, T.; Miolane, N.; Guzmán-Sáenz, A.; Ramamurthy, K. N.; Schaub, M. T. (2022), Topological deep learning: Going beyond graph data, arXiv:2206.00606
  2. ^ a b Papillon, M.; Sanborn, S.; Hajij, M.; Miolane, N. (2023). "Architectures of topological deep learning: A survey on topological neural networks". arXiv:2304.10031 [cs.LG].
  3. ^ a b Ebli, S.; Defferrard, M.; Spreemann, G. (2020), Simplicial neural networks, arXiv:2010.03633
  4. ^ Battiloro, C.; Testa, L.; Giusti, L.; Sardellitti, S.; Di Lorenzo, P.; Barbarossa, S. (2023), Generalized simplicial attention neural networks, arXiv:2309.02138
  5. ^ Yang, M.; Isufi, E. (2023), Convolutional learning on simplicial complexes, arXiv:2301.11163
  6. ^ Chen, Y.; Gel, Y. R.; Poor, H. V. (2022), "BScNets: Block Simplicial Complex Neural Networks", Proceedings of the AAAI Conference on Artificial Intelligence, 36 (6): 6333–6341, arXiv:2112.06826, doi:10.1609/aaai.v36i6.20583
  7. ^ Uray, Martin; Giunti, Barbara; Kerber, Michael; Huber, Stefan (2024). "Topological Data Analysis in smart manufacturing: State of the art and future directions". Journal of Manufacturing Systems. 76: 75–91. arXiv:2310.09319. doi:10.1016/j.jmsy.2024.07.006. ISSN 0278-6125.
  8. ^ a b c Hajij, M.; Istvan, K.; Zamzmi, G. (2020), Cell complex neural networks, arXiv:2010.00743
  9. ^ Bianchini, Monica; Scarselli, Franco (2014). "On the Complexity of Neural Network Classifiers: A Comparison Between Shallow and Deep Architectures". IEEE Transactions on Neural Networks and Learning Systems. 25 (8): 1553–1565. doi:10.1109/TNNLS.2013.2293637. ISSN 2162-237X. PMID 25050951.
  10. ^ Naitzat, Gregory; Zhitnikov, Andrey; Lim, Lek-Heng (2020). "Topology of Deep Neural Networks" (PDF). Journal of Machine Learning Research. 21 (1): 184:7503–184:7542. ISSN 1532-4435.
  11. ^ Birdal, Tolga; Lou, Aaron; Guibas, Leonidas J; Simsekli, Umut (2021). "Intrinsic Dimension, Persistent Homology and Generalization in Neural Networks". Advances in Neural Information Processing Systems. 34. Curran Associates, Inc.: 6776–6789. arXiv:2111.13171.
  12. ^ Ballester, Rubén; Clemente, Xavier Arnal; Casacuberta, Carles; Madadi, Meysam; Corneanu, Ciprian A.; Escalera, Sergio (2024). "Predicting the generalization gap in neural networks using topological data analysis". Neurocomputing. 596: 127787. arXiv:2203.12330. doi:10.1016/j.neucom.2024.127787.
  13. ^ Rieck, Bastian; Togninalli, Matteo; Bock, Christian; Moor, Michael; Horn, Max; Gumbsch, Thomas; Borgwardt, Karsten (2019). "Neural Persistence: A Complexity Measure for Deep Neural Networks Using Algebraic Topology". International Conference on Learning Representations. arXiv:1812.09764. doi:10.3929/ethz-b-000327207.
  14. ^ Dupuis, Benjamin; Deligiannidis, George; Simsekli, Umut (2023). "Generalization Bounds using Data-Dependent Fractal Dimensions". Proceedings of the 40th International Conference on Machine Learning. PMLR: 8922–8968.
  15. ^ Bronstein, Michael M.; Bruna, Joan; LeCun, Yann; Szlam, Arthur; Vandergheynst, Pierre (2017). "Geometric Deep Learning: Going beyond Euclidean data". IEEE Signal Processing Magazine. 34 (4): 18–42. arXiv:1611.08097. Bibcode:2017ISPM...34...18B. doi:10.1109/MSP.2017.2693418. ISSN 1053-5888.
  16. ^ a b Bodnar, Cristian; Frasca, Fabrizio; Wang, Yuguang; Otter, Nina; Montufar, Guido F.; Lió, Pietro; Bronstein, Michael (2021). "Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks". Proceedings of the 38th International Conference on Machine Learning. PMLR: 1026–1037. arXiv:2103.03212.
  17. ^ a b Bodnar, Cristian; Frasca, Fabrizio; Otter, Nina; Wang, Yuguang; Liò, Pietro; Montufar, Guido F; Bronstein, Michael (2021). "Weisfeiler and Lehman Go Cellular: CW Networks". Advances in Neural Information Processing Systems. 34. Curran Associates, Inc.: 2625–2640. arXiv:2106.12575.
  18. ^ Carlsson, Gunnar (2009). "Topology and data". Bulletin of the American Mathematical Society. 46 (2): 255–308. doi:10.1090/S0273-0979-09-01249-X. ISSN 0273-0979.
  19. ^ Adcock, Aaron; Carlsson, Erik; Carlsson, Gunnar (2016). "The ring of algebraic functions on persistence bar codes". Homology, Homotopy and Applications. 18 (1): 381–402. arXiv:1304.0530. doi:10.4310/HHA.2016.v18.n1.a21.
  20. ^ Adams, Henry; Emerson, Tegan; Kirby, Michael; Neville, Rachel; Peterson, Chris; Shipman, Patrick; Chepushtanova, Sofya; Hanson, Eric; Motta, Francis; Ziegelmeier, Lori (2017). "Persistence Images: A Stable Vector Representation of Persistent Homology". Journal of Machine Learning Research. 18 (8): 1–35. ISSN 1533-7928.
  21. ^ Bubenik, Peter (2015). "Statistical Topological Data Analysis using Persistence Landscapes". Journal of Machine Learning Research. 16 (3): 77–102. ISSN 1533-7928.
  22. ^ Kwitt, Roland; Huber, Stefan; Niethammer, Marc; Lin, Weili; Bauer, Ulrich (2015). "Statistical Topological Data Analysis - A Kernel Perspective". Advances in Neural Information Processing Systems. 28. Curran Associates, Inc.
  23. ^ Carrière, Mathieu; Cuturi, Marco; Oudot, Steve (2017). "Sliced Wasserstein Kernel for Persistence Diagrams". Proceedings of the 34th International Conference on Machine Learning. PMLR: 664–673. arXiv:1706.03358.
  24. ^ Kusano, Genki; Fukumizu, Kenji; Hiraoka, Yasuaki (2018). "Kernel Method for Persistence Diagrams via Kernel Embedding and Weight Factor". Journal of Machine Learning Research. 18 (189): 1–41. arXiv:1706.03472. ISSN 1533-7928.
  25. ^ Le, Tam; Yamada, Makoto (2018). "Persistence Fisher Kernel: A Riemannian Manifold Kernel for Persistence Diagrams". Advances in Neural Information Processing Systems. 31. Curran Associates, Inc. arXiv:1802.03569.
  26. ^ Maggs, Kelly; Hacker, Celia; Rieck, Bastian (2024). "Simplicial Representation Learning with Neural k-Forms". International Conference on Learning Representations. arXiv:2312.08515.
  27. ^ Ramamurthy, K. N.; Guzmán-Sáenz, A.; Hajij, M. (2023), Topo-mlp: A simplicial network without message passing, pp. 1–5
  28. ^ Hofer, Christoph; Kwitt, Roland; Niethammer, Marc; Uhl, Andreas (2017). "Deep Learning with Topological Signatures". Advances in Neural Information Processing Systems. 30. Curran Associates, Inc. arXiv:1707.04041.
  29. ^ Hofer, Christoph D.; Kwitt, Roland; Niethammer, Marc (2019). "Learning Representations of Persistence Barcodes". Journal of Machine Learning Research. 20 (126): 1–45. ISSN 1533-7928.
  30. ^ Carriere, Mathieu; Chazal, Frederic; Ike, Yuichi; Lacombe, Theo; Royer, Martin; Umeda, Yuhei (2020). "PersLay: A Neural Network Layer for Persistence Diagrams and New Graph Topological Signatures". Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics. PMLR: 2786–2796. arXiv:1904.09378.
  31. ^ Kim, Kwangho; Kim, Jisu; Zaheer, Manzil; Kim, Joon; Chazal, Frederic; Wasserman, Larry (2020). "PLLay: Efficient Topological Layer based on Persistent Landscapes". Advances in Neural Information Processing Systems. 33. Curran Associates, Inc.: 15965–15977. arXiv:2002.02778.
  32. ^ Gabrielsson, Rickard Brüel; Nelson, Bradley J.; Dwaraknath, Anjan; Skraba, Primoz (2020). "A Topology Layer for Machine Learning". Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics. PMLR: 1553–1563.
  33. ^ a b Horn, Max; Brouwer, Edward De; Moor, Michael; Moreau, Yves; Rieck, Bastian; Borgwardt, Karsten (2022). "Topological Graph Neural Networks". International Conference on Learning Representations.
  34. ^ Hofer, Christoph; Graf, Florian; Rieck, Bastian; Niethammer, Marc; Kwitt, Roland (2020). "Graph Filtration Learning". Proceedings of the 37th International Conference on Machine Learning. PMLR: 4314–4323. arXiv:1905.10996.
  35. ^ Immonen, Johanna; Souza, Amauri; Garg, Vikas (2023). "Going beyond persistent homology using persistent homology". Advances in Neural Information Processing Systems. 36: 63150–63173. arXiv:2311.06152.
  36. ^ Battiloro, C.; Di Lorenzo, P.; Ribeiro, A. (September 2023), Parametric dictionary learning for topological signal representation, IEEE, pp. 1958–1962
  37. ^ Wang, C.; Ma, N.; Wu, Z.; Zhang, J.; Yao, Y. (August 2022), Survey of Hypergraph Neural Networks and Its Application to Action Recognition, Springer Nature Switzerland, pp. 387–398
  38. ^ Roddenberry, T. M.; Glaze, N.; Segarra, S. (July 2021), Principled simplicial neural networks for trajectory prediction, PMLR, pp. 9020–9029, arXiv:2102.10058