From Wikipedia, the free encyclopedia

Hebbian theory is a neuropsychological theory claiming that an increase in synaptic efficacy arises from a presynaptic cell's repeated and persistent stimulation of a postsynaptic cell. It is an attempt to explain synaptic plasticity, the adaptation of neurons during the learning process. Hebbian theory was introduced by Donald Hebb in his 1949 book The Organization of Behavior.[1] The theory is also called Hebb's rule, Hebb's postulate, and cell assembly theory. Hebb states it as follows:

Let us assume that the persistence or repetition of a reverberatory activity (or "trace") tends to induce lasting cellular changes that add to its stability. ... When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A’s efficiency, as one of the cells firing B, is increased.[1]: 62 

The theory is often summarized as "Neurons that fire together, wire together."[2] However, Hebb emphasized that cell A needs to "take part in firing" cell B, and such causality can occur only if cell A fires just before, not at the same time as, cell B. This aspect of causation in Hebb's work foreshadowed what is now known about spike-timing-dependent plasticity, which requires temporal precedence.[3]

Hebbian theory attempts to explain associative or Hebbian learning, in which simultaneous activation of cells leads to pronounced increases in synaptic strength between those cells. It also provides a biological basis for errorless learning methods for education and memory rehabilitation. In the study of neural networks in cognitive function, it is often regarded as the neuronal basis of unsupervised learning.[4]

Engrams, cell assembly theory, and learning


Hebbian theory provides an explanation for how neurons might connect to become engrams, which may be stored in overlapping cell assemblies, or groups of neurons that encode specific information.[5] Initially created as a way to explain recurrent activity in specific groups of cortical neurons, Hebb's theories on the form and function of cell assemblies can be understood from the following:[1]: 70 

The general idea is an old one, that any two cells or systems of cells that are repeatedly active at the same time will tend to become 'associated' so that activity in one facilitates activity in the other.

Hebb also wrote:[1]

When one cell repeatedly assists in firing another, the axon of the first cell develops synaptic knobs (or enlarges them if they already exist) in contact with the soma of the second cell.

D. Alan Allport posits additional ideas regarding cell assembly theory and its role in forming engrams using the concept of auto-association, or the brain's ability to retrieve information based on a partial cue, described as follows:

If the inputs to a system cause the same pattern of activity to occur repeatedly, the set of active elements constituting that pattern will become increasingly strongly inter-associated. That is, each element will tend to turn on every other element and (with negative weights) to turn off the elements that do not form part of the pattern. To put it another way, the pattern as a whole will become 'auto-associated'. We may call a learned (auto-associated) pattern an engram.[6]
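
To make this idea of auto-association concrete, the following sketch stores a few activity patterns with a Hebbian outer-product rule and then retrieves one of them from a partial cue, in the style of a Hopfield network. The pattern count, size, and update scheme are illustrative assumptions, not part of Hebb's or Allport's proposals.

```python
import numpy as np

# Minimal Hebbian auto-associator (Hopfield-style): activity patterns are
# stored with an outer-product rule and one is retrieved from a partial cue.
# Pattern count, size, and the +/-1 coding are illustrative assumptions.

rng = np.random.default_rng(0)
n = 64                                          # number of units
patterns = rng.choice([-1, 1], size=(3, n))     # three patterns to store

# Hebbian outer-product learning: co-active units get positive weights,
# anti-correlated units get negative weights.
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)                          # no reflexive (self) connections

# Partial cue: corrupt 40% of the first stored pattern.
cue = patterns[0].copy()
flipped = rng.choice(n, size=int(0.4 * n), replace=False)
cue[flipped] *= -1

# Recurrent retrieval: each unit is repeatedly driven by the weighted sum of
# the others until the state settles on the stored ("auto-associated") pattern.
state = cue
for _ in range(10):
    state = np.where(W @ state >= 0, 1, -1)

print("fraction of units matching the stored pattern:",
      np.mean(state == patterns[0]))
```

With only a few stored patterns, the corrupted cue typically settles back onto the complete stored pattern, which is the sense in which the pattern has become "auto-associated".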

Research conducted in the laboratory of Nobel laureate Eric Kandel has provided evidence supporting the role of Hebbian learning mechanisms at synapses in the marine gastropod Aplysia californica.[7] Because synapses in the peripheral nervous system of marine invertebrates are much easier to control in experiments, Kandel's research found that Hebbian long-term potentiation along with activity-dependent presynaptic facilitation are both necessary for synaptic plasticity and classical conditioning in Aplysia californica.[8]

While research on invertebrates has established fundamental mechanisms of learning and memory, much of the work on long-lasting synaptic changes between vertebrate neurons involves the use of non-physiological experimental stimulation of brain cells. However, some of the physiologically relevant synapse modification mechanisms that have been studied in vertebrate brains do seem to be examples of Hebbian processes. One such review indicates that long-lasting changes in synaptic strengths can be induced by physiologically relevant synaptic activity using both Hebbian and non-Hebbian mechanisms.[9]

Principles


In artificial neurons and artificial neural networks, Hebb's principle can be described as a method of determining how to alter the weights between model neurons. The weight between two neurons increases if the two neurons activate simultaneously, and reduces if they activate separately. Nodes that tend to be either both positive or both negative at the same time have strong positive weights, while those that tend to be opposite have strong negative weights.

The following is a formulaic description of Hebbian learning (many other descriptions are possible):

$$w_{ij} = x_i x_j,$$

where $w_{ij}$ is the weight of the connection from neuron $j$ to neuron $i$, and $x_i$ is the input for neuron $i$. This is an example of pattern learning, where weights are updated after every training example. In a Hopfield network, connections $w_{ij}$ are set to zero if $i = j$ (no reflexive connections allowed). With binary neurons (activations either 0 or 1), connections would be set to 1 if the connected neurons have the same activation for a pattern.[citation needed]
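
As an illustration of pattern learning with this rule, the sketch below applies the outer-product update after each training example; the learning rate and toy patterns are arbitrary choices, not taken from the text.

```python
import numpy as np

# Pattern learning with the plain Hebb rule: after each training example x,
# every weight is incremented by the product of the two activities,
# w_ij += eta * x_i * x_j. Learning rate and data are illustrative.

def hebb_step(W, x, eta=0.1):
    """One Hebbian update for a single training example x."""
    return W + eta * np.outer(x, x)

patterns = [np.array([1.0, 0.0, 1.0]),
            np.array([0.0, 1.0, 1.0])]

W = np.zeros((3, 3))
for x in patterns:               # weights updated after every example
    W = hebb_step(W, x)

np.fill_diagonal(W, 0)           # Hopfield convention: no self-connections
print(W)
```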

When several training patterns are used, the expression becomes an average over the individual patterns:

$$w_{ij} = \frac{1}{p} \sum_{k=1}^{p} x_i^k x_j^k,$$

where $w_{ij}$ is the weight of the connection from neuron $j$ to neuron $i$, $p$ is the number of training patterns, and $x_i^k$ is the $k$-th input for neuron $i$. This is learning by epoch, with weights updated after all the training examples are presented; the averaged form is applicable to both discrete and continuous training sets. Again, in a Hopfield network, connections $w_{ij}$ are set to zero if $i = j$ (no reflexive connections).
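
A corresponding epoch-based sketch, again with arbitrary toy patterns, computes the same averaged weights in a single step as a matrix product.

```python
import numpy as np

# Learning by epoch: the weight matrix is the average of the outer products
# over all p training patterns, W = (1/p) * sum_k x^k (x^k)^T.
# The binary patterns below are illustrative assumptions.

X = np.array([[1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 1, 1, 0]], dtype=float)   # p = 3 patterns, 4 neurons

p = X.shape[0]
W = (X.T @ X) / p                # equals the average of the p outer products
np.fill_diagonal(W, 0)           # Hopfield convention: w_ii = 0
print(W)
```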

A variation of Hebbian learning that takes into account phenomena such as blocking and other neural learning phenomena is the mathematical model of Harry Klopf. Klopf's model assumes that parts of a system with simple adaptive mechanisms can underlie more complex systems with more advanced adaptive behavior, such as neural networks.[10]

Relationship to unsupervised learning, stability, and generalization


Because of the simple nature of Hebbian learning, based only on the coincidence of pre- and post-synaptic activity, it may not be intuitively clear why this form of plasticity leads to meaningful learning. However, it can be shown that Hebbian plasticity does pick up the statistical properties of the input in a way that can be categorized as unsupervised learning.

This can be shown mathematically in a simplified example. Let us work under the simplifying assumption of a single rate-based neuron of rate $y(t)$, whose inputs have rates $x_1(t), \ldots, x_N(t)$. The response of the neuron is usually described as a linear combination of its inputs, $\sum_i w_i x_i$, followed by a response function $f$:

$$y = f\left(\sum_{i=1}^{N} w_i x_i\right).$$

As defined in the previous sections, Hebbian plasticity describes the evolution in time of the synaptic weight $w_i$:

$$\frac{dw_i}{dt} = \eta\, x_i y,$$

where $\eta$ is the learning rate.

Assuming, for simplicity, an identity response function $f(a) = a$, we can write

$$\frac{dw_i}{dt} = \eta\, x_i \sum_{j=1}^{N} w_j x_j,$$

or in matrix form:

$$\frac{d\mathbf{w}}{dt} = \eta\, \mathbf{x} \mathbf{x}^{\mathsf{T}} \mathbf{w}.$$

As in the previous section, if training by epoch is done, an average over the discrete or continuous (time) training set of $\mathbf{x}$ can be taken:

$$\frac{d\mathbf{w}}{dt} = \eta\, \langle \mathbf{x} \mathbf{x}^{\mathsf{T}} \rangle\, \mathbf{w} = \eta\, C\, \mathbf{w},$$

where $C = \langle \mathbf{x} \mathbf{x}^{\mathsf{T}} \rangle$ is the correlation matrix of the input, under the additional assumption that $\langle \mathbf{x} \rangle = 0$ (i.e. the average of the inputs is zero). This is a system of coupled linear differential equations. Since $C$ is symmetric, it is also diagonalizable, and the solution can be found, by working in its eigenvector basis, to be of the form

$$\mathbf{w}(t) = k_1 e^{\eta \alpha_1 t} \mathbf{c}_1 + k_2 e^{\eta \alpha_2 t} \mathbf{c}_2 + \cdots + k_N e^{\eta \alpha_N t} \mathbf{c}_N,$$

where the $k_i$ are arbitrary constants, the $\mathbf{c}_i$ are the eigenvectors of $C$, and the $\alpha_i$ are the corresponding eigenvalues. Since a correlation matrix is always a positive-definite matrix, the eigenvalues are all positive, and one can easily see how the above solution is always exponentially divergent in time. This is an intrinsic problem: this version of Hebb's rule is unstable, because in any network with a dominant signal the synaptic weights will increase or decrease exponentially. Intuitively, this is because whenever the presynaptic neuron excites the postsynaptic neuron, the weight between them is reinforced, causing an even stronger excitation in the future, and so forth, in a self-reinforcing way. One may think a solution is to limit the firing rate of the postsynaptic neuron by adding a non-linear, saturating response function $f$, but in fact, it can be shown that for any neuron model, Hebb's rule is unstable.[11] Therefore, network models of neurons usually employ other learning theories such as BCM theory, Oja's rule,[12] or the generalized Hebbian algorithm.
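
The instability, and the way a normalizing rule such as Oja's keeps the weights bounded, can be seen in a small simulation; the input statistics, learning rate, and number of steps below are illustrative assumptions.

```python
import numpy as np

# Plain Hebbian updates let the weight norm grow without bound, whereas Oja's
# rule (Hebb plus a decay term proportional to y^2 * w) keeps it bounded.
# Input correlations, learning rate, and sample count are illustrative.

rng = np.random.default_rng(1)
C = np.array([[3.0, 1.0],
              [1.0, 2.0]])                        # input correlation structure
X = rng.multivariate_normal([0.0, 0.0], C, size=5000)

eta = 0.001
w_hebb = np.array([0.1, 0.1])
w_oja = np.array([0.1, 0.1])

for x in X:
    y = w_hebb @ x
    w_hebb = w_hebb + eta * y * x                 # dw = eta * y * x (diverges)

    y = w_oja @ x
    w_oja = w_oja + eta * y * (x - y * w_oja)     # Oja's rule (stays bounded)

print("plain Hebb weight norm:", np.linalg.norm(w_hebb))
print("Oja's rule weight norm:", np.linalg.norm(w_oja))   # approaches 1
```

The plain Hebbian weight vector grows by orders of magnitude, while the Oja weight vector converges to unit length along the dominant eigenvector of the input correlation matrix.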

Regardless, even for the unstable solution above, one can see that, when sufficient time has passed, one of the terms dominates over the others, and

$$\mathbf{w}(t) \approx k^{*} e^{\eta \alpha^{*} t} \mathbf{c}^{*},$$

where $\alpha^{*}$ is the largest eigenvalue of $C$, $\mathbf{c}^{*}$ the corresponding eigenvector, and $k^{*}$ the associated constant. At this time, the postsynaptic neuron performs the following operation:

$$y \approx k^{*} e^{\eta \alpha^{*} t}\, \mathbf{c}^{*} \cdot \mathbf{x}.$$

Because, again, $\mathbf{c}^{*}$ is the eigenvector corresponding to the largest eigenvalue of the correlation matrix of the $x_i$, this corresponds exactly to computing the first principal component of the input.

This mechanism can be extended to performing a full PCA (principal component analysis) of the input by adding further postsynaptic neurons, provided the postsynaptic neurons are prevented from all picking up the same principal component, for example by adding lateral inhibition in the postsynaptic layer. We have thus connected Hebbian learning to PCA, which is an elementary form of unsupervised learning, in the sense that the network can pick up useful statistical aspects of the input, and "describe" them in a distilled way in its output.[13]
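
A sketch of this extension, using Sanger's generalized Hebbian algorithm as a concrete stand-in for the lateral-inhibition mechanism described above, is shown below; the synthetic input distribution, learning rate, and number of output neurons are illustrative assumptions.

```python
import numpy as np

# Generalized Hebbian algorithm (Sanger's rule): a Hebbian term plus a
# deflation term (playing the role of lateral inhibition) lets successive
# output neurons learn successive principal components of the input.
# Synthetic data, learning rate, and sizes are illustrative assumptions.

rng = np.random.default_rng(42)
cov = np.array([[4.0, 1.5, 0.0],
                [1.5, 2.0, 0.5],
                [0.0, 0.5, 1.0]])
X = rng.multivariate_normal(np.zeros(3), cov, size=20000)    # zero-mean input

n_out, n_in = 2, 3
W = 0.01 * rng.standard_normal((n_out, n_in))
eta = 0.0005

for x in X:
    y = W @ x
    # dW = eta * ( y x^T - lower_triangular(y y^T) W )
    W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

# Compare the learned weight vectors with the leading eigenvectors of the
# input covariance (i.e. the principal components).
eigvals, eigvecs = np.linalg.eigh(np.cov(X.T))
leading = eigvecs[:, ::-1][:, :n_out].T
for i in range(n_out):
    cos = abs(W[i] @ leading[i]) / (np.linalg.norm(W[i]) * np.linalg.norm(leading[i]))
    print(f"alignment of output neuron {i} with principal component {i + 1}: {cos:.3f}")
```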

Hebbian learning and mirror neurons


Hebbian learning and spike-timing-dependent plasticity have been used in an influential theory of how mirror neurons emerge.[14][15] Mirror neurons are neurons that fire both when an individual performs an action and when the individual sees or hears another perform a similar action.[16][17] The discovery of these neurons has been very influential in explaining how individuals make sense of the actions of others: when a person perceives another's action, the motor programs the perceiver would use to perform a similar action are activated, adding information to the perception and helping to predict what the actor will do next on the basis of the perceiver's own motor program. One limitation of this account of mirror neuron function is that it does not explain how individuals come to have neurons that respond both while performing an action and while hearing or seeing another perform a similar action.

Neuroscientist Christian Keysers and psychologist David Perrett suggested that observing or hearing an individual perform an action activates brain regions as if performing the action oneself.[15][18] These re-afferent sensory signals trigger activity in neurons responding to the sight, sound, and feel of the action. Because the activity of these sensory neurons will consistently overlap in time with those of the motor neurons that caused the action, Hebbian learning predicts that the synapses connecting neurons responding to the sight, sound, and feel of an action and those of the neurons triggering the action should be potentiated. The same is true while people look at themselves in the mirror, hear themselves babble, or are imitated by others. After repeated occurrences of this re-afference, the synapses connecting the sensory and motor representations of an action are so strong that the motor neurons start firing to the sound or the vision of the action, and a mirror neuron is created.[19]

Numerous experiments provide evidence for the idea that Hebbian learning is crucial to the formation of mirror neurons. Evidence reveals that motor programs can be triggered by novel auditory or visual stimuli after repeated pairing of the stimulus with the execution of the motor program.[20] For instance, people who have never played the piano do not activate brain regions involved in playing the piano when listening to piano music. Five hours of piano lessons, in which the participant is exposed to the sound of the piano each time they press a key, have been shown to be sufficient to trigger activity in motor regions of the brain when piano music is heard at a later time.[20] Consistent with the fact that spike-timing-dependent plasticity occurs only if the presynaptic neuron's firing predicts the postsynaptic neuron's firing,[21] the link between sensory stimuli and motor programs also seems to be potentiated only if the stimulus is contingent on the motor program.

Hebbian theory and cognitive neuroscience


Hebbian learning is linked to cognitive processes such as decision-making and social learning. The field of cognitive neuroscience has started to explore the intersection of Hebbian theory with brain regions responsible for reward processing and social cognition, such as the striatum and prefrontal cortex.[22][23] In particular, striatal projection synapses exhibit Hebbian-like long-term potentiation and long-term depression in vivo.[24] Additionally, the mixed selectivity of prefrontal cortex responses to stimuli is not entirely explained by random connectivity alone, but when a Hebbian learning paradigm is incorporated into a random network model, the observed levels of mixed selectivity can be reproduced.[25] It has been hypothesized (e.g., by Peter Putnam and Robert W. Fuller) that Hebbian plasticity in these areas may underlie behaviors such as habit formation, reinforcement learning, and even the development of social bonds.[26][27]

Limitations


Despite the common use of Hebbian models for long-term potentiation, Hebbian theory does not cover all forms of long-term synaptic plasticity. Hebb did not propose any rules for inhibitory synapses or predictions for anti-causal spike sequences (where the presynaptic neuron fires after the postsynaptic neuron). Synaptic modification may not simply occur only between activated neurons A and B, but at neighboring synapses as well.[28] Therefore, all forms of heterosynaptic plasticity and homeostatic plasticity are considered non-Hebbian. One example is retrograde signaling to presynaptic terminals.[29] The compound most frequently recognized as a retrograde transmitter is nitric oxide, which, due to its high solubility and diffusivity, often exerts effects on nearby neurons.[30] This type of diffuse synaptic modification, known as volume learning, is not included in the traditional Hebbian model.[31]

Contemporary developments, artificial intelligence, and computational advancements


Modern research has expanded upon Hebb's original ideas. Spike-timing-dependent plasticity (STDP), for example, refines Hebbian principles by making the sign and magnitude of synaptic change depend on the precise relative timing of pre- and postsynaptic spikes. Experimental advancements have also linked Hebbian learning to complex behaviors, such as decision-making and emotional regulation.[13] Current studies in artificial intelligence (AI) and quantum computing continue to leverage Hebbian concepts for developing adaptive algorithms and improving machine learning models.[32]
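
A commonly used simplified form of the pair-based STDP rule mentioned above is an exponential timing window; the amplitudes and time constants in this sketch are illustrative assumptions rather than fits to any particular experiment.

```python
import math

# Pair-based STDP window: potentiation when the presynaptic spike precedes
# the postsynaptic spike (dt = t_post - t_pre > 0), depression when it
# follows. Amplitudes and time constants are illustrative assumptions.

A_PLUS, A_MINUS = 0.010, 0.012     # LTP / LTD amplitudes (assumed)
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # time constants in milliseconds (assumed)

def stdp_dw(dt_ms: float) -> float:
    """Weight change for a single pre/post spike pair separated by dt_ms."""
    if dt_ms > 0:                                   # pre before post: potentiate
        return A_PLUS * math.exp(-dt_ms / TAU_PLUS)
    if dt_ms < 0:                                   # post before pre: depress
        return -A_MINUS * math.exp(dt_ms / TAU_MINUS)
    return 0.0

for dt in (-40, -10, 10, 40):
    print(f"dt = {dt:+d} ms -> dw = {stdp_dw(dt):+.4f}")
```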

In AI, Hebbian learning has seen applications beyond traditional neural networks. One significant advancement is in reinforcement learning algorithms, where Hebbian-like learning is used to update the weights based on the timing and strength of stimuli during training phases. Some researchers have adapted Hebbian principles to develop more biologically plausible models for learning in artificial systems, which may improve model efficiency and convergence in AI applications.[33][34]

A growing area of interest is the application of Hebbian learning in quantum computing. While classical neural networks are the primary area of application for Hebbian theory, recent studies have begun exploring the potential for quantum-inspired algorithms. These algorithms leverage the principles of quantum superposition and entanglement to enhance learning processes in quantum systems.[35] Current research is exploring how Hebbian principles could inform the development of more efficient quantum machine learning models.[3]

New computational models have emerged that refine or extend Hebbian learning. For example, some models now account for the precise timing of neural spikes (as in spike-timing-dependent plasticity), while others have integrated aspects of neuromodulation to account for how neurotransmitters like dopamine affect the strength of synaptic connections. These advanced models provide a more nuanced understanding of how Hebbian learning operates in the brain and are contributing to the development of more realistic computational models.[36][37]
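
One common way such neuromodulatory gating is formalized is a "three-factor" rule, in which coincident pre- and postsynaptic activity builds an eligibility trace and a global dopamine-like reward signal determines whether the trace is written into the weights. The sketch below is a generic illustration with assumed constants, not a specific published model.

```python
import numpy as np

# Reward-modulated ("three-factor") Hebbian sketch: Hebbian coincidences are
# accumulated in an eligibility trace, and a global dopamine-like signal
# gates when that trace actually changes the weights.
# Trace decay, learning rate, and the toy signals are illustrative.

rng = np.random.default_rng(7)
steps, n_in = 200, 5
w = np.zeros(n_in)
trace = np.zeros(n_in)
eta, decay = 0.05, 0.9

for t in range(steps):
    x = rng.random(n_in)                     # presynaptic activity
    y = float(w @ x) + rng.normal(0.0, 0.1)  # noisy postsynaptic response
    trace = decay * trace + x * y            # Hebbian eligibility trace
    reward = 1.0 if t % 20 == 0 else 0.0     # sparse dopamine-like signal
    w += eta * reward * trace                # plasticity gated by reward

print("final weights:", np.round(w, 3))
```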

Recent research on Hebbian learning has focused on the role of inhibitory neurons, which are often overlooked in traditional Hebbian models. While classic Hebbian theory primarily focuses on excitatory neurons, more comprehensive models of neural learning now consider the balanced interaction between excitatory and inhibitory synapses. Studies suggest that inhibitory neurons can provide critical regulation for maintaining stability in neural circuits and might prevent runaway positive feedback in Hebbian learning.[38][39]


References

  1. ^ a b c d Hebb, D.O. (1949). The Organization of Behavior. New York: Wiley & Sons.
  2. ^ Siegrid Löwel, Göttingen University; The exact sentence is: "neurons wire together if they fire together" (Löwel, S. and Singer, W. (1992) Science 255, published January 10, 1992). Löwel, Siegrid; Singer, Wolf (1992). "Selection of Intrinsic Horizontal Connections in the Visual Cortex by Correlated Neuronal Activity". Science. 255 (5041). United States: American Association for the Advancement of Science: 209–212. Bibcode:1992Sci...255..209L. doi:10.1126/science.1372754. ISSN 0036-8075. PMID 1372754.
  3. ^ a b Caporale, Natalia; Dan, Yang (2008). "Spike timing-dependent plasticity: a Hebbian learning rule". Annual Review of Neuroscience. 31: 25–46. doi:10.1146/annurev.neuro.31.060407.125639. PMID 18275283.
  4. ^ Sanger, Terence D. (1989). "Optimal unsupervised learning in a single-layer linear feedforward neural network". Neural Networks. 2 (6): 459–473. doi:10.1016/0893-6080(89)90044-0. ISSN 0893-6080.
  5. ^ Sejnowski, Terrence J. (1999). "The Book of Hebb". Neuron. 24 (4): 773–776. doi:10.1016/S0896-6273(00)81025-9. ISSN 0896-6273. PMID 10624941.
  6. ^ Allport, D.A. (1985). "Distributed memory, modular systems and dysphasia". In Newman, S.K.; Epstein R. (eds.). Current Perspectives in Dysphasia. Edinburgh: Churchill Livingstone. ISBN 978-0-443-03039-0.
  7. ^ Castellucci, Vincent; Pinsker, Harold; Kupfermann, Irving; Kandel, Eric R. (1970). "Neuronal Mechanisms of Habituation and Dishabituation of the Gill-Withdrawal Reflex in Aplysia". Science. 167 (3926): 1745–1748. Bibcode:1970Sci...167.1745C. doi:10.1126/science.167.3926.1745. PMID 5416543.
  8. ^ Antonov, Igor; Antonova, Irina; Kandel, Eric R.; Hawkins, Robert D. (2003). "Activity-Dependent Presynaptic Facilitation and Hebbian LTP Are Both Required and Interact during Classical Conditioning in Aplysia". Neuron. 37 (1): 135–147. doi:10.1016/S0896-6273(02)01129-7. ISSN 0896-6273. PMID 12526779.
  9. ^ Paulsen, O; Sejnowski, T (1 April 2000). "Natural patterns of activity and long-term synaptic plasticity". Current Opinion in Neurobiology. 10 (2): 172–180. doi:10.1016/s0959-4388(00)00076-3. PMC 2900254. PMID 10753798.
  10. ^ Klopf, A. H. (1972). Brain function and adaptive systems—A heterostatic theory. Technical Report AFCRL-72-0164, Air Force Cambridge Research Laboratories, Bedford, MA.
  11. ^ Euliano, Neil R. (2025-08-05). "Neural and Adaptive Systems: Fundamentals Through Simulations" (PDF). Wiley. Archived from the original (PDF) on 2025-08-05. Retrieved 2025-08-05.
  12. ^ Shouval, Harel (2025-08-05). "The Physics of the Brain". The Synaptic basis for Learning and Memory: A theoretical approach. The University of Texas Health Science Center at Houston. Archived from the original on 2025-08-05. Retrieved 2025-08-05.
  13. ^ a b Kistler, Werner M.; Gerstner, Wulfram, eds. (2002), "Hebbian models", Spiking Neuron Models: Single Neurons, Populations, Plasticity, Cambridge: Cambridge University Press, pp. 351–386, doi:10.1017/CBO9780511815706.011, ISBN 978-0-521-89079-3, retrieved 2025-08-05
  14. ^ Keysers C; Perrett DI (2004). "Demystifying social cognition: a Hebbian perspective". Trends in Cognitive Sciences. 8 (11): 501–507. doi:10.1016/j.tics.2004.09.005. PMID 15491904. S2CID 8039741.
  15. ^ a b Keysers, C. (2011). The Empathic Brain.
  16. ^ Gallese V; Fadiga L; Fogassi L; Rizzolatti G (1996). "Action recognition in the premotor cortex". Brain. 119 (Pt 2): 593–609. doi:10.1093/brain/119.2.593. PMID 8800951.
  17. ^ Keysers C; Kohler E; Umilta MA; Nanetti L; Fogassi L; Gallese V (2003). "Audiovisual mirror neurons and action recognition". Exp Brain Res. 153 (4): 628–636. CiteSeerX 10.1.1.387.3307. doi:10.1007/s00221-003-1603-5. PMID 12937876. S2CID 7704309.
  18. ^ Kohler, Evelyne; Keysers, Christian; Umiltà, M. Alessandra; Fogassi, Leonardo; Gallese, Vittorio; Rizzolatti, Giacomo (2002). "Hearing Sounds, Understanding Actions: Action Representation in Mirror Neurons". Science. 297 (5582): 846–848. Bibcode:2002Sci...297..846K. doi:10.1126/science.1070311. PMID 12161656.
  19. ^ "Hebbian learning and predictive mirror neurons for actions, sensations and emotions". ResearchGate. Archived from the original on 2025-08-05. Retrieved 2025-08-05.
  20. ^ a b Lahav A; Saltzman E; Schlaug G (2007). "Action representation of sound: audiomotor recognition network while listening to newly acquired actions". J Neurosci. 27 (2): 308–314. doi:10.1523/jneurosci.4822-06.2007. PMC 6672064. PMID 17215391.
  21. ^ Bauer EP; LeDoux JE; Nader K (2001). "Fear conditioning and LTP in the lateral amygdala are sensitive to the same stimulus contingencies". Nat Neurosci. 4 (7): 687–688. doi:10.1038/89465. PMID 11426221. S2CID 33130204.
  22. ^ Balleine, B. W.; O'Doherty, J. P. (2010). "Human and rodent homologies in action control: Corticostriatal mechanisms of reward and decision making". Neuroscience. 164 (1): 131–140.
  23. ^ O'Doherty, J. P.; et al. (2004). "Dissociable roles of ventral and dorsal striatum in instrumental conditioning". Science. 304 (5668): 452–454.
  24. ^ Perrin, Elodie; Venance, Laurent (2019). "Bridging the gap between striatal plasticity and learning". Current Opinion in Neurobiology. Neurobiology of Learning and Plasticity. 54: 104–112. doi:10.1016/j.conb.2018.09.007. ISSN 0959-4388. PMID 30321866.
  25. ^ Lindsay, Grace W.; Rigotti, Mattia; Warden, Melissa R.; Miller, Earl K.; Fusi, Stefano (2017). "Hebbian Learning in a Random Network Captures Selectivity Properties of the Prefrontal Cortex". Journal of Neuroscience. 37 (45): 11021–11036. doi:10.1523/JNEUROSCI.1222-17.2017. ISSN 0270-6474. PMC 5678026. PMID 28986463.
  26. ^ Gefter, Amanda (2025-08-05). "Finding Peter Putnam". Nautilus. Retrieved 2025-08-05.
  27. ^ Putnam, Peter; Fuller, Robert (2025-08-05). "Outline of a Functional Model of the Nervous System, Putnam/Fuller 1964". The Peter Putnam Papers. Retrieved 2025-08-05.
  28. ^ Horgan, John (May 1994). "Neural eavesdropping". Scientific American. 270 (5): 16. Bibcode:1994SciAm.270e..16H. doi:10.1038/scientificamerican0594-16. PMID 8197441.
  29. ^ Fitzsimonds, Reiko; Mu-Ming Poo (January 1998). "Retrograde Signaling in the Development and Modification of Synapses". Physiological Reviews. 78 (1): 143–170. doi:10.1152/physrev.1998.78.1.143. PMID 9457171. S2CID 11604896.
  30. ^ López, P; C.P. Araujo (2009). "A computational study of the diffuse neighbourhoods in biological and artificial neural networks" (PDF). International Joint Conference on Computational Intelligence.
  31. ^ Mitchison, G; N. Swindale (October 1999). "Can Hebbian Volume Learning Explain Discontinuities in Cortical Maps?". Neural Computation. 11 (7): 1519–1526. doi:10.1162/089976699300016115. PMID 10490935. S2CID 2325474.
  32. ^ Bi, G.-Q.; Poo, M.-M. (1998). "Synaptic modifications in cultured hippocampal neurons: Dependence on spike timing, synaptic strength, and postsynaptic cell type". Journal of Neuroscience. 18 (24): 10464–10472.
  33. ^ Oja, E. (1982). "A simplified neuron model as a principal component analyzer". Journal of Mathematical Biology. 15 (3): 267–273.
  34. ^ Rumelhart, D. E.; McClelland, J. L. (1986). Parallel Distributed Processing: Explorations in the Microstructure of Cognition. MIT Press.
  35. ^ Huang, H.; Li, Y. (2019). "A Quantum-Inspired Hebbian Learning Algorithm for Neural Networks". Journal of Quantum Information Science. 9 (2): 111–124.
  36. ^ Miller, P.; Conver, A. (2012). "Computational models of synaptic plasticity and learning". Current Opinion in Neurobiology. 22 (5): 648–655.
  37. ^ Béïque, J. C.; Andrade, R. (2012). "Neuromodulation of synaptic plasticity in the hippocampus: Implications for learning and memory". Frontiers in Synaptic Neuroscience. 4: 15.
  38. ^ Markram, H.; Lübke, J.; Frotscher, M.; Sakmann, B. (1997). "Regulation of synaptic efficacy by coincidence of postsynaptic APs and EPSPs". Science. 275 (5297): 213–215.
  39. ^ Cohen, M. R.; Kohn, A. (2011). "Measuring and interpreting neuronal correlations". Nature Neuroscience. 14 (7): 811–819.
