<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
		<id>https://www.scipedia.com/wd/index.php?action=history&amp;feed=atom&amp;title=Ying_et_al_2019a</id>
		<title>Ying et al 2019a - Revision history</title>
		<link rel="self" type="application/atom+xml" href="https://www.scipedia.com/wd/index.php?action=history&amp;feed=atom&amp;title=Ying_et_al_2019a"/>
		<link rel="alternate" type="text/html" href="https://www.scipedia.com/wd/index.php?title=Ying_et_al_2019a&amp;action=history"/>
		<updated>2026-04-24T10:05:05Z</updated>
		<subtitle>Revision history for this page on the wiki</subtitle>
		<generator>MediaWiki 1.27.0-wmf.10</generator>

	<entry>
		<id>https://www.scipedia.com/wd/index.php?title=Ying_et_al_2019a&amp;diff=194483&amp;oldid=prev</id>
		<title>Scipediacontent: Scipediacontent moved page Draft Content 127246759 to Ying et al 2019a</title>
		<link rel="alternate" type="text/html" href="https://www.scipedia.com/wd/index.php?title=Ying_et_al_2019a&amp;diff=194483&amp;oldid=prev"/>
				<updated>2021-01-28T21:04:33Z</updated>
		
		<summary type="html">&lt;p&gt;Scipediacontent moved page &lt;a href=&quot;/public/Draft_Content_127246759&quot; class=&quot;mw-redirect&quot; title=&quot;Draft Content 127246759&quot;&gt;Draft Content 127246759&lt;/a&gt; to &lt;a href=&quot;/public/Ying_et_al_2019a&quot; title=&quot;Ying et al 2019a&quot;&gt;Ying et al 2019a&lt;/a&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;tr style='vertical-align: top;' lang='en'&gt;
				&lt;td colspan='1' style=&quot;background-color: white; color:black; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan='1' style=&quot;background-color: white; color:black; text-align: center;&quot;&gt;Revision as of 21:04, 28 January 2021&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan='2' style='text-align: center;' lang='en'&gt;&lt;div class=&quot;mw-diff-empty&quot;&gt;(No difference)&lt;/div&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/table&gt;</summary>
		<author><name>Scipediacontent</name></author>	</entry>

	<entry>
		<id>https://www.scipedia.com/wd/index.php?title=Ying_et_al_2019a&amp;diff=194482&amp;oldid=prev</id>
		<title>Scipediacontent: Created page with &quot; == Abstract ==  Online learning algorithms update models via one sample per iteration, thus efficient to process large-scale datasets and useful to detect malicious events fo...&quot;</title>
		<link rel="alternate" type="text/html" href="https://www.scipedia.com/wd/index.php?title=Ying_et_al_2019a&amp;diff=194482&amp;oldid=prev"/>
				<updated>2021-01-28T21:04:27Z</updated>
		
		<summary type="html">&lt;p&gt;Created page with &amp;quot; == Abstract ==  Online learning algorithms update models via one sample per iteration, thus efficient to process large-scale datasets and useful to detect malicious events fo...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;&lt;br /&gt;
== Abstract ==&lt;br /&gt;
&lt;br /&gt;
Online learning algorithms update models with one sample per iteration, making them efficient at processing large-scale datasets and useful for detecting malicious events, such as disease outbreaks and traffic congestion, on the fly for social benefit. However, existing algorithms for graph-structured models focus on the offline setting and the least-squares loss, and so cannot handle the online setting, while methods designed for the online setting cannot be directly applied to complex (usually non-convex) graph-structured sparsity models. To address these limitations, in this paper we propose a new algorithm for graph-structured sparsity constrained problems in the online setting, which we call GraphDA. The key step in GraphDA is to project both the averaged gradient (in the dual space) and the primal variables (in the primal space) onto lower-dimensional subspaces, thereby capturing graph-structured sparsity effectively. Furthermore, the objective functions assumed here are general convex functions, so that different losses can be handled in online learning settings. To the best of our knowledge, GraphDA is the first online learning algorithm for graph-structure constrained optimization problems. To validate our method, we conduct extensive experiments on both benchmark graphs and real-world graph datasets. Our experimental results show that, compared with baseline methods, GraphDA not only improves classification performance but also captures graph-structured features more effectively, and hence offers stronger interpretability.&lt;br /&gt;
&lt;br /&gt;
Comment: 11 pages, 14 figures&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
== Original document ==&lt;br /&gt;
&lt;br /&gt;
The different versions of the original document can be found in:&lt;br /&gt;
&lt;br /&gt;
* [http://arxiv.org/abs/1905.10714 http://arxiv.org/abs/1905.10714]&lt;br /&gt;
&lt;br /&gt;
* [https://dl.acm.org/doi/pdf/10.1145/3292500.3330915 https://dl.acm.org/doi/pdf/10.1145/3292500.3330915]&lt;br /&gt;
&lt;br /&gt;
* [https://dblp.uni-trier.de/db/conf/kdd/kdd2019.html#Zhou0Y19 https://dblp.uni-trier.de/db/conf/kdd/kdd2019.html#Zhou0Y19],&lt;br /&gt;
: [https://arxiv.org/pdf/1905.10714 https://arxiv.org/pdf/1905.10714],&lt;br /&gt;
: [https://arxiv.org/pdf/1905.10714v1 https://arxiv.org/pdf/1905.10714v1],&lt;br /&gt;
: [https://dl.acm.org/doi/pdf/10.1145/3292500.3330915 https://dl.acm.org/doi/pdf/10.1145/3292500.3330915],&lt;br /&gt;
: [https://academic.microsoft.com/#/detail/2946515652 https://academic.microsoft.com/#/detail/2946515652]&lt;br /&gt;
&lt;br /&gt;
* [https://dl.acm.org/doi/pdf/10.1145/3292500.3330915 https://dl.acm.org/doi/pdf/10.1145/3292500.3330915],&lt;br /&gt;
: [http://dx.doi.org/10.1145/3292500.3330915 http://dx.doi.org/10.1145/3292500.3330915] under the license http://www.acm.org/publications/policies/copyright_policy#Background&lt;/div&gt;</summary>
		<author><name>Scipediacontent</name></author>	</entry>

	</feed>