Neural network as a nonlinear system?












I defined a very simple neural network with $2$ inputs, $1$ hidden layer with $2$ nodes, and one output node.
For each input pattern $\vec{x} \in \mathbb{R} \times \mathbb{R}$ and associated output $o \in \mathbb{R}$, the resulting nonlinear equation is:

$wo_{0}\,\sigma(x_0\,Wi_{00} + x_1\,Wi_{10}) + wo_{1}\,\sigma(x_0\,Wi_{01} + x_1\,Wi_{11}) = o$

where $Wi$ is the $2 \times 2$ weight matrix of the input connections, with each element $Wi_{jk} \in \mathbb{R}$, $\sigma(x) = \frac{1}{1+\exp(-x)}$ is the logistic sigmoid, and $\vec{wo}$, with $wo_{i} \in \mathbb{R}$, is the weight vector of the two connections feeding the output node.



Given a dataset of $n$ (pattern, output) examples, there will be $n$ nonlinear equations.
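
Collecting the six unknowns into one vector (the symbols $\theta$ and $F$ below are introduced here only for clarity; they are not part of the original formulation), the system can be written as a root-finding problem:

$\theta = (Wi_{00}, Wi_{10}, Wi_{01}, Wi_{11}, wo_{0}, wo_{1}) \in \mathbb{R}^{6}, \qquad F_i(\theta) = wo_{0}\,\sigma(x_{i,0}\,Wi_{00} + x_{i,1}\,Wi_{10}) + wo_{1}\,\sigma(x_{i,0}\,Wi_{01} + x_{i,1}\,Wi_{11}) - o_i = 0, \quad i = 1, \dots, n,$

so solving the learning problem exactly means finding a root of $F : \mathbb{R}^{6} \to \mathbb{R}^{n}$; when $n > 6$ an exact root need not exist, and one would instead minimize $\lVert F(\theta) \rVert^{2}$.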



I'm asking how to find the solutions of such a nonlinear system, as an alternative way to solve the learning problem without backpropagation. I've implemented an optimizer for the stated problem; if someone is interested, I can provide the corresponding C sources (email: fportera2@gmail.com).
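
For concreteness, here is a minimal sketch (this is not the optimizer mentioned above; the weight ordering in `w`, the function names, and the toy dataset in `main()` are illustrative choices) of how the residuals of the system could be evaluated in C, so that any root-finding or nonlinear least-squares routine can then be applied to them:

```c
/* Illustrative sketch: evaluate the residuals
 *   F_i = wo0*sigma(x_i0*Wi00 + x_i1*Wi10)
 *       + wo1*sigma(x_i0*Wi01 + x_i1*Wi11) - o_i
 * for a dataset of n (pattern, output) pairs. */
#include <math.h>
#include <stdio.h>

static double sigma(double v) { return 1.0 / (1.0 + exp(-v)); }

/* w = [Wi00, Wi10, Wi01, Wi11, wo0, wo1] : the six unknowns */
static void residuals(const double w[6], double x[][2], const double o[],
                      int n, double F[])
{
    for (int i = 0; i < n; i++) {
        double h0 = sigma(x[i][0] * w[0] + x[i][1] * w[1]);
        double h1 = sigma(x[i][0] * w[2] + x[i][1] * w[3]);
        F[i] = w[4] * h0 + w[5] * h1 - o[i];
    }
}

int main(void)
{
    /* tiny made-up dataset, only to exercise the function */
    double x[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
    double o[4]    = {0.1, 0.9, 0.9, 0.1};
    double w[6]    = {1, -1, -1, 1, 0.5, 0.5};   /* arbitrary initial guess */
    double F[4], ss = 0.0;

    residuals(w, x, o, 4, F);
    for (int i = 0; i < 4; i++) ss += F[i] * F[i];
    printf("sum of squared residuals: %g\n", ss);
    return 0;
}
```

Compiled with `gcc file.c -lm`, this only prints the sum of squared residuals for an arbitrary guess; a solver would then adjust `w` to drive the residuals toward zero.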







Tag: nonlinear-system






edited Nov 30 at 7:55
asked Nov 29 at 21:27
Filippo Portera

  • Genetic algorithms?
    – N74
    Nov 29 at 21:46










  • I would like to implement a GA method. Are you sure it would perform better than the gradient method, if I may ask?
    – Filippo Portera
    Dec 2 at 18:49










  • I am sure that, for the simple problem you are facing, backpropagation is the best way. But you asked for an alternative method, and genetic algorithms can explore larger parameter spaces and find global optima instead of getting stuck in a local one.
    – N74
    Dec 2 at 22:28
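
As a rough illustration of the genetic-algorithm idea raised in the comments, below is a minimal mutation-and-selection sketch in C: elitist selection, uniform random mutations, no crossover, on a toy 4-pattern dataset. The population size, mutation scale, generation count, and dataset values are arbitrary choices for illustration, not anything proposed in the thread.

```c
/* Minimal evolutionary search over the 6 weights: keep the best individual
 * each generation and replace the rest with mutated copies of it.
 * Fitness = sum of squared residuals of the n = 4 nonlinear equations. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

#define NW   6      /* Wi00, Wi10, Wi01, Wi11, wo0, wo1 */
#define POP  50
#define GENS 2000

static double sigma(double v) { return 1.0 / (1.0 + exp(-v)); }

static double x[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};   /* toy patterns */
static double o[4]    = {0.1, 0.9, 0.9, 0.1};               /* toy outputs  */

static double loss(const double w[NW])
{
    double ss = 0.0;
    for (int i = 0; i < 4; i++) {
        double y = w[4] * sigma(x[i][0] * w[0] + x[i][1] * w[1])
                 + w[5] * sigma(x[i][0] * w[2] + x[i][1] * w[3]);
        ss += (y - o[i]) * (y - o[i]);
    }
    return ss;
}

static double randu(void) { return (double)rand() / RAND_MAX; }

int main(void)
{
    double pop[POP][NW];

    /* random initial population in [-2, 2] */
    for (int p = 0; p < POP; p++)
        for (int k = 0; k < NW; k++)
            pop[p][k] = 4.0 * randu() - 2.0;

    for (int g = 0; g < GENS; g++) {
        /* elitist selection: find the best individual ...            */
        int best = 0;
        for (int p = 1; p < POP; p++)
            if (loss(pop[p]) < loss(pop[best])) best = p;
        /* ... and replace everyone else with a mutated copy of it    */
        for (int p = 0; p < POP; p++) {
            if (p == best) continue;
            for (int k = 0; k < NW; k++)
                pop[p][k] = pop[best][k] + 0.1 * (2.0 * randu() - 1.0);
        }
    }

    int best = 0;
    for (int p = 1; p < POP; p++)
        if (loss(pop[p]) < loss(pop[best])) best = p;
    printf("best sum of squared residuals: %g\n", loss(pop[best]));
    return 0;
}
```

A full genetic algorithm would add crossover and a less greedy selection scheme; this stripped-down version is closer to a (1+λ) evolution strategy, but it shows the basic search loop applied to the six weights.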

















