Please read this: a memory management problem

somedayhow wrote:

I allocated memory dynamically,

and confirmed that the data was stored there correctly.

Then, while running a loop that has nothing to do with memory allocation,

the allocated memory disappears (the address goes from e.g. 0x60301 to 0x38).

My computer's memory is not that small (it's 8GB), and I didn't allocate that much either.

Everything else is fine; among the many arrays, only this one acts up.

I'd be grateful if you could take a look.

I dynamically allocated Network->Layer[0].Neuron[i].Dendrite with the code below.

Network->Layer[0].Neuron = (struct Neurons *)malloc(sizeof(struct Neurons * ) * NeuronCount[0]);
	//for input output connection
	for( i = 0 ; i < NeuronCount[0] ; i++)
	{
		Network->Layer[0].Neuron[i].Dendrite = malloc(sizeof(struct Dendrites *) * NeuronCount[LayerCount-1]);
	}

The result shows:

7: Network->Layer[0].Neuron[6].Dendrite = (struct Dendrites *) 0x608e70
6: Network->Layer[0].Neuron[5].Dendrite = (struct Dendrites *) 0x608de0
5: Network->Layer[0].Neuron[4].Dendrite = (struct Dendrites *) 0x608d50
4: Network->Layer[0].Neuron[3].Dendrite = (struct Dendrites *) 0x608cc0
3: Network->Layer[0].Neuron[2].Dendrite = (struct Dendrites *) 0x608c30
2: Network->Layer[0].Neuron[1].Dendrite = (struct Dendrites *) 0x608ba0
1: Network->Layer[0].Neuron[0].Dendrite = (struct Dendrites *) 0x608b10

They are all stored correctly.
The loop runs to the end without a problem.

However,

in the k part of the next loop, the address of Network->Layer[0].Neuron[4].Dendrite suddenly changes.
It vanishes right in the middle of the k loop,

while everything else stays intact.

No matter how many times I run it, only this one suddenly gets deallocated. However long I debug, I can't find the cause with my skill level.

for(i=0 ; i < Network->LayerCount ; i++) //initialize all layers
	{
		for(j=0; j <  Network->Layer[i].NeuronCount ;j++)
		{
 
			//if dividing into multiple regions with a switch, use (10/3) => 0~2, 3~5, 6~8, 9~10
			if(i==0) 
			{
			//Do nothing, because Layer '1' (Layer[0]) is the input Layer,
			//but it still needs to be initialized.
			//This program assumes there is no bias at the input Layer.
			//This program uses Weight as both axon and dendrite, and it lives in the latter Neuron.
			//That means there is no interconnection between the
			//input Layer and the output Layer's Neurons,
			//and therefore the Weights in the first Layer's Neurons have no meaning.
			//Maybe we can use this to realize output-to-input feedback; maybe an easier way exists.
 
			Network->Layer[i].Neuron[j].Bias = 0;
				for( k =0 ; k < Network->Layer[Network->LayerCount-1].NeuronCount ; k++)
				{
					Network->Layer[i].Neuron[j].Dendrite[k].Weight = 0;            //while this loop runs, one of the allocated blocks gets freed
				}
			}

Everything is intact right up until entering the loop:
(gdb) display Network->Layer[0].Neuron[4].Dendrite
5: Network->Layer[0].Neuron[4].Dendrite = (struct Dendrites *) 0x608d50

240 for( k =0 ; k < Network->Layer[Network->LayerCount-1].NeuronCount ; k++)
(gdb) display Network->Layer[0].Neuron[0].Dendrite
1: Network->Layer[0].Neuron[0].Dendrite = (struct Dendrites *) 0x608b10
(gdb) display Network->Layer[0].Neuron[1].Dendrite
2: Network->Layer[0].Neuron[1].Dendrite = (struct Dendrites *) 0x608ba0
(gdb) display Network->Layer[0].Neuron[2].Dendrite
3: Network->Layer[0].Neuron[2].Dendrite = (struct Dendrites *) 0x608c30
(gdb) display Network->Layer[0].Neuron[3].Dendrite
4: Network->Layer[0].Neuron[3].Dendrite = (struct Dendrites *) 0x608cc0
(gdb) display Network->Layer[0].Neuron[4].Dendrite
5: Network->Layer[0].Neuron[4].Dendrite = (struct Dendrites *) 0x608d50
(gdb) display Network->Layer[0].Neuron[5].Dendrite
6: Network->Layer[0].Neuron[5].Dendrite = (struct Dendrites *) 0x608de0
(gdb) display Network->Layer[0].Neuron[6].Dendrite
7: Network->Layer[0].Neuron[6].Dendrite = (struct Dendrites *) 0x608e70
(gdb) display Network->Layer[0].Neuron[7].Dendrite
8: Network->Layer[0].Neuron[7].Dendrite = (struct Dendrites *) 0x608f00
(gdb) display Network->Layer[0].Neuron[8].Dendrite
9: Network->Layer[0].Neuron[8].Dendrite = (struct Dendrites *) 0x608f90
(gdb) display Network->Layer[0].Neuron[9].Dendrite
10: Network->Layer[0].Neuron[9].Dendrite = (struct Dendrites *) 0x609020
(gdb) display Network->Layer[0].Neuron[10].Dendrite
11: Network->Layer[0].Neuron[10].Dendrite = (struct Dendrites *) 0x6090b0
(gdb) display Network->Layer[0].Neuron[11].Dendrite
12: Network->Layer[0].Neuron[11].Dendrite = (struct Dendrites *) 0x609140
(gdb) display Network->Layer[0].Neuron[12].Dendrite
13: Network->Layer[0].Neuron[12].Dendrite = (struct Dendrites *) 0x6091d0
(gdb) display Network->Layer[0].Neuron[13].Dendrite
14: Network->Layer[0].Neuron[13].Dendrite = (struct Dendrites *) 0x609260
(gdb) display Network->Layer[0].Neuron[14].Dendrite
15: Network->Layer[0].Neuron[14].Dendrite = (struct Dendrites *) 0x6092f0
(gdb) display Network->Layer[0].Neuron[15].Dendrite
16: Network->Layer[0].Neuron[15].Dendrite = (struct Dendrites *) 0x609380
(gdb) display Network->Layer[0].Neuron[16].Dendrite

As display number 5 shows,
it suddenly changes:
5: Network->Layer[0].Neuron[4].Dendrite = (struct Dendrites *) 0x0

240 for( k =0 ; k < Network->Layer[Network->LayerCount-1].NeuronCount ; k++)
16: Network->Layer[0].Neuron[15].Dendrite = (struct Dendrites *) 0x609380
15: Network->Layer[0].Neuron[14].Dendrite = (struct Dendrites *) 0x6092f0
14: Network->Layer[0].Neuron[13].Dendrite = (struct Dendrites *) 0x609260
13: Network->Layer[0].Neuron[12].Dendrite = (struct Dendrites *) 0x6091d0
12: Network->Layer[0].Neuron[11].Dendrite = (struct Dendrites *) 0x609140
11: Network->Layer[0].Neuron[10].Dendrite = (struct Dendrites *) 0x6090b0
10: Network->Layer[0].Neuron[9].Dendrite = (struct Dendrites *) 0x609020
9: Network->Layer[0].Neuron[8].Dendrite = (struct Dendrites *) 0x608f90
8: Network->Layer[0].Neuron[7].Dendrite = (struct Dendrites *) 0x608f00
7: Network->Layer[0].Neuron[6].Dendrite = (struct Dendrites *) 0x608e70
6: Network->Layer[0].Neuron[5].Dendrite = (struct Dendrites *) 0x608de0
5: Network->Layer[0].Neuron[4].Dendrite = (struct Dendrites *) 0x0
4: Network->Layer[0].Neuron[3].Dendrite = (struct Dendrites *) 0x608cc0
3: Network->Layer[0].Neuron[2].Dendrite = (struct Dendrites *) 0x608c30
2: Network->Layer[0].Neuron[1].Dendrite = (struct Dendrites *) 0x608ba0
1: Network->Layer[0].Neuron[0].Dendrite = (struct Dendrites *) 0x608b10

The full source is below.

I wrote it through many days and nights with my modest skills, and I'm truly at a loss.

No matter how much I run it and inspect it, I see no reason for it to break in the k loop like that. What could the problem be?

I'd really appreciate your help.

#include <stdio.h>
#include <math.h>
#include <stdlib.h>
#include <sys/time.h>
#include <time.h>
#include <unistd.h>
 
#define e 2.71828 //used in the sigmoid function (the code actually calls exp())
#define randomize() srand((unsigned)time(NULL)) //initializing rand function
#define random(n) (rand() % (n)) // limited region rand function
struct Dendrites  // connects each neuron and pass the signal
{	
	double Weight; //connection strength
	//double back_Weight // it would be hard to write equations for a complex network, so using a back-connection weight might be better
	//struct Dendrite * Added_Dendrites // using this we could add neurons; it needs more malloc
};
 
struct Neurons
{
	struct Dendrites *  Dendrite;
	int DendriteCount; //number of dendrites
	double Bias;  
	double Value;  // the value to be passed to the next layer of neurons
	double Delta; // the delta of the neuron (used while learning)
	//struct Neuron * Added_Neuron // same purpose as Added_Dendrites; we need one more step to access this variable (spends more time),
	//so maybe there is a better way to add a neuron. We could use realloc(var, size),
	//but we need to study realloc more.
};
 
struct Layers //Layer containing number of neurons
{
	struct Neurons * Neuron; //neurons in the layer
	int NeuronCount;  // Number of neurons
	//struct Layer * Added_Layer // same purpose as Added_Dendrites
};
 
struct NeuralNetwork
{
	struct Layers * Layer;
	int LayerCount; 
	double LearningRate;
	//struct NeuralNetwork * Connected_NeuralNetwork //Connected would be better when we think the meaning
};
 
int Training_Count = 0;
double *OutTch;
struct timeval Ttime;
long TrainingTime=0;
//NeuralNetwork Network;
 
 
struct NeuralNetwork *  Initialize();
void CreateNet(struct NeuralNetwork *);
//void ShowOption (struct NeuralNetwork *);
void Layout(struct NeuralNetwork *);
void SupervisedTrain(struct NeuralNetwork * Network);
void Run(struct NeuralNetwork * Network);
double Activation(double );
void ShowResult (struct NeuralNetwork *);
void Free_Memory(struct NeuralNetwork *);
//void Option(struct NeuralNetwork * Network);
 
 
 
void main()
{
	struct NeuralNetwork * Network;
	Network = Initialize();
	CreateNet( Network );
	Layout( Network );
	//Option ( Network );
	SupervisedTrain( Network);
	Free_Memory (Network);
}
 
struct NeuralNetwork * Initialize()
{
	int i,j,k,LayerCount,NetworkCount=1,InputCount=0,OutTchCount=0;
	/* int NetworkCount; //When using MultiNetwork;
	   int *Layers_NeuronCount;*/
	int *NeuronCount;
	double *Input;
	long Check=0;
	FILE *fInput,*fOutTch;
 
	/* 
	   do{
	   printf("Define the Number of NetWork\n");
	   scanf("%d",NetworkCount);
	   }while( NetworkCount>1);*/ //when using MultiNetwork;
 
 
	for( i = 0 ; i <NetworkCount ; i++) //loop for MultiNetwork
	{
		do{
			printf("Define the Number of Layer(this number includes input and output layer)\n");
			scanf("%d",&LayerCount);
			/*printf("Define Network[%d]'s Number of Layer(this number includes input and output layer)\n",i);
			  scanf("%d",LayerCount[i]);*/ //when using MultiNetwork;
		}while( LayerCount<2); //while(LayerCount[i] >1);
	} 
	NeuronCount = (int *)malloc(sizeof(int) * LayerCount);
 
	fInput = fopen("./Input.txt","r");
	fseek(fInput,0L,SEEK_END);
	Check = ftell(fInput);
        //all the inputs should have the same precision
	Input = (double *)malloc(sizeof(double)* (   (Check-6)/5 )  );//input Count;
	fseek(fInput, 6L ,SEEK_SET); //Input\n = 6byte
	for( ;Check!=(ftell(fInput)+1) ; )
	{
		fscanf(fInput,"%lf",&Input[InputCount]);
		InputCount++;
	}
	NeuronCount[0]=InputCount;
 
	fOutTch = fopen("./OutTch.txt","r");
	fseek(fOutTch,0L,SEEK_END);
	Check = ftell(fOutTch);
        //all the inputs should have the same precision
	OutTch = (double *)malloc(sizeof(double)* (   (Check-7)/5 ));//Output Count;
	fseek(fOutTch, 7L ,SEEK_SET); //OutTch\n = 7byte
	for( ;Check!=(ftell(fOutTch)+1) ; )
	{
		fscanf(fOutTch,"%lf",&OutTch[OutTchCount]);
		OutTchCount++;
	}
 
	NeuronCount[LayerCount-1] = OutTchCount;
 
	//for( i = 0 ; i < NetworkCount ; i++)
	//for( j = 0 ; j < LayerCount[i] ;j++)//MultiNetwrok;
	for( i = 1 ; i <  LayerCount-1 ; i++) // for ( j = 0 ; j < LayerCount[i] ; j++) //MultiNetwork
	{
		do{
			printf("Define the number of Hidden Layer[%d]'s Neuron \n",i);
			scanf("%d", &NeuronCount[i]);
		}while(NeuronCount[i]<1);
 
	}
 
	//memory allocation
 
	struct NeuralNetwork * Network;
	Network = (struct NeuralNetwork *)malloc(sizeof(struct NeuralNetwork) * NetworkCount);
	Network->LayerCount = LayerCount;
 
	Network->Layer =(struct Layers *)malloc(sizeof(struct Layers *) * LayerCount); 
	/*for (i = 0 ; i < NetworkCount ; i++)
	  {
	  Network[i].Layers = (struct Layer *)malloc(sizeof(struct Layer *) * LayerCount);
	  }*/ //MultiNetwork
 
	Network->Layer[0].NeuronCount = NeuronCount[0];
	for ( i = 1 ; i < LayerCount ; i++)
	{
		Network->Layer[i].NeuronCount = NeuronCount[i];
		Network->Layer[i].Neuron = (struct Neurons *)malloc(sizeof(struct Neurons *) * NeuronCount[i]);
 
	}
 
 
	// not
	for ( i = 1 ; i < LayerCount ; i++)
	{
		// To connect input and output, malloc Layer[0].Neuron->Dendrite. Later I'll add the connection;
		for(j = 0 ; j < NeuronCount[i] ; j++)
		{
			Network->Layer[i].Neuron[j].Dendrite = malloc(sizeof(struct Dendrites *) * NeuronCount[i-1]);
		}
	} 	
	Network->Layer[0].Neuron = (struct Neurons *)malloc(sizeof(struct Neurons * ) * NeuronCount[0]);
	//for input output connection
	for( i = 0 ; i < NeuronCount[0] ; i++)
	{
		Network->Layer[0].Neuron[i].Dendrite = malloc(sizeof(struct Dendrites *) * NeuronCount[LayerCount-1]);
	}
 
	for(i = 0 ;i < NeuronCount[0] ; i++)
	{
		Network->Layer[0].Neuron[i].Value =Input[i] ;
	}
 
	free(NeuronCount);
 
	free(Input);
	return Network;
}
 
void CreateNet( struct NeuralNetwork * Network)
{
	int i,j,k;
 
	do{
		printf(" Determine LearningRate (0~1]\n");
		scanf("%lf",&(Network->LearningRate));
	}while( Network->LearningRate > 1 || Network->LearningRate <= 0 ); 
 
 
	for(i=0 ; i < Network->LayerCount ; i++) //initialize all layers
	{
		for(j=0; j <  Network->Layer[i].NeuronCount ;j++)
		{
 
			//if dividing into multiple regions with a switch, use (10/3) => 0~2, 3~5, 6~8, 9~10
			if(i==0) 
			{
			//Do nothing, because Layer '1' (Layer[0]) is the input Layer,
			//but it still needs to be initialized.
			//This program assumes there is no bias at the input Layer.
			//This program uses Weight as both axon and dendrite, and it lives in the latter Neuron.
			//That means there is no interconnection between the
			//input Layer and the output Layer's Neurons,
			//and therefore the Weights in the first Layer's Neurons have no meaning.
			//Maybe we can use this to realize output-to-input feedback; maybe an easier way exists.
 
			Network->Layer[i].Neuron[j].Bias = 0;
				for( k =0 ; k < Network->Layer[Network->LayerCount-1].NeuronCount ; k++)
				{
					Network->Layer[i].Neuron[j].Dendrite[k].Weight = 0;
				}
			}
			// I use random Weight // maybe it would be better
			else if( i == (Network->LayerCount -1)) 
			{
				Network->Layer[i].Neuron[j].Bias = random(100)/100.00;
				Network->Layer[i].Neuron[j].DendriteCount = Network->Layer[i-1].NeuronCount;
				Network->Layer[i].Neuron[j].Value = random(100)/100.00;
				Network->Layer[i].Neuron[j].Delta = random(100)/100.00;
				for( k =0 ; k < Network->Layer[i].Neuron[j].DendriteCount ; k++)
				{
				Network->Layer[i].Neuron[j].Dendrite[k].Weight = random(100)/100.00;
				}
			}
			else
			{
				Network->Layer[i].Neuron[j].Bias = random(100)/100.00;
				Network->Layer[i].Neuron[j].DendriteCount = Network->Layer[i-1].NeuronCount;
 
			//Initially the output layer's neurons need a value, so I use a random initial output;
			//but for testing the algorithm, a random initial output may not be good, because
			//by luck it could give a good result. So is a 0 initial output better?

			//Network.Layers[i].Neurons[j].Value = random(100)/100.00
			//Because there are naturally many connections between neurons,
			//the line above may be more practical?
				Network->Layer[i].Neuron[j].Value = random(100)/100.00;
				Network->Layer[i].Neuron[j].Delta = random(100)/100.00;
				for( k=0 ; k < Network->Layer[i].Neuron[j].DendriteCount ; k++)
				{
					Network->Layer[i].Neuron[j].Dendrite[k].Weight = random(100)/100.00;
				}
			}
		}
 
	}
 
}
 
//if any count is bigger than 1,000, the %3d formats should change to %4d or similar
void Layout(struct NeuralNetwork * Network)
{
	int i,j,k;
	FILE *fOutput,*fHiddenLayerValue,*fWeight,*fBias,*fDelta;
 
	//Output
	fOutput = fopen("Output.txt","w");
	fprintf(fOutput, "Output");
	for(i=0 ; i<Network->Layer[0].NeuronCount ; i++)
	{
		fprintf(fOutput, "     Input(%3d) ",i+1);
	}
	for(i=0 ; i<Network->Layer[0].NeuronCount ; i++)
	{
		fprintf(fOutput, "     %.2lf        ",Network->Layer[0].Neuron[i].Value);
	}
	fprintf(fOutput, "\n\nTraining_Count");
	fprintf(fOutput, " Time  ");
	for(i=1 ; i <= Network->Layer[Network->LayerCount-1].NeuronCount ; i++)
	{
		fprintf(fOutput, "OutTch(%3d) OutR(%3d) ",i,i);
		//fprintf(fOutput, " %.2lf    %.2lf ",OutTch,Network->Layer[Network->LayerCount-1].Neuron[i].Value);
	}
	fclose(fOutput);
 
	//HiddenLayerValue
	fHiddenLayerValue = fopen("HiddenLayerValue.txt","w");
	fprintf(fHiddenLayerValue, "Hidden Layer's NeuronValue\n");
	fprintf(fHiddenLayerValue, "Neuron(i,j)Value means Hidden Layer(i)'s Neuron(j)'s Value");
	fprintf(fHiddenLayerValue, "\n\nTraining_Count ");
	fprintf(fHiddenLayerValue, " Time  ");
	for( i = 1 ; i < Network->LayerCount -1 ; i++)
	{ 		
		for( j = 1 ; j <= Network->Layer[i].NeuronCount ; j++)
		{
			fprintf(fHiddenLayerValue, "Neuron(%3d,%3d)Value ",i,j);
		}
	}
	fclose(fHiddenLayerValue);
 
	//Weight
	fWeight = fopen("Weight.txt","w");
	fprintf(fWeight, "Connection Weight\n");
	fprintf(fWeight, "Weight(i,j,k) means Connectivity of Layer(i-1)'s Neuron(j) and Layer(i)'s Neuron(k)\n");
	fprintf(fWeight,"\n\nTraining_Count ");
	fprintf(fWeight," Time  ");
	for( i = 1 ; i <= Network->LayerCount - 1; i++)
	{
		for( j = 1 ; j <= Network->Layer[i].NeuronCount ; j++)
		{
			for( k = 1 ; k <= Network->Layer[i].Neuron[j-1].DendriteCount ; k++)
			{
				fprintf(fWeight,"Weight(%3d,%3d,%3d)",i+1,k,j);
			}
		}
	}
	fclose(fWeight);
 
	//Bias
	fBias = fopen("Bias","w");
	fprintf(fBias, "Bias\n");
	fprintf(fBias, "Neuron(i,j).Bias means Layer(i)'s Neuron(j)'s Bias");
	fprintf(fBias,"\n\nTraining_Count ");
	fprintf(fBias," Time  ");
	for( i = 1 ; i < Network->LayerCount ; i++)
	{
		for( j = 1 ; j <= Network->Layer[i].NeuronCount ; j++)
		{
			fprintf(fBias,"Bias(%3d,%3d)  ",i+1,j);
		}
	}
	fclose(fBias);
 
	//Delta
	fDelta = fopen("Delta","w");
	fprintf(fDelta, "Delta\n");
	fprintf(fDelta, "Neuron(i,j).Bias means Layer(i)'s Neuron(j)'s Bias");
	fprintf(fDelta,"\n\nTraining_Count ");
	fprintf(fDelta," Time  ");
	for( i = 1 ; i < Network->LayerCount ; i++)
	{
		for( j = 1 ; j <= Network->Layer[i].NeuronCount ; j++)
		{
			fprintf(fDelta,"Delta(%3d,%3d)  ",i+1,j);
		}
	}
	fclose(fDelta);
}	
 
 
 
 
 
void Run(struct NeuralNetwork * Network)
{
	int i,j,k;
 
	for(i=1 ; i < Network->LayerCount ; i++)
	{
		for(j=0 ; j<Network->Layer[i].NeuronCount ; j++)
		{
			//According to this code, at the first training the Neuron Values of Layers 2+ would not change. (X)
			//The connection Weight lives in the latter Neuron, so the input data is transferred to Layer 2's Neurons
			for( k=0 ; k < Network->Layer[i-1].NeuronCount;k++) // Transfer sig from input layer to last hidden layer
			{
				Network->Layer[i].Neuron[j].Value = Network->Layer[i].Neuron[j].Value + Network->Layer[i-1].Neuron[k].Value * Network->Layer[i].Neuron[j].Dendrite[k].Weight;
			}
 
			Network->Layer[i].Neuron[j].Value = Activation( Network->Layer[i].Neuron[j].Value + Network->Layer[i].Neuron[j].Bias );
		}
	}
}  
 
 
 
 
// If the input value is positive (as it usually should be), exp(-value) gets smaller
// as the value gets bigger, so the result of Activation approaches 1.
// Because of this the value is always in (0, 1).
double Activation(double Value)
{
	Value = exp(Value * (-1) );
	Value = 1 / (1+Value);
	return Value;
}
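
// Note: Activation computes the logistic sigmoid, 1 / (1 + exp(-Value)).
// Its derivative is sigmoid * (1 - sigmoid): the Value * (1 - Value)
// factor that appears in the Delta calculations in SupervisedTrain() below.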
 
 
//Calculate the difference between the output values desired and the output values produced.
//Using that difference, adjust the values of bias and weights accordingly.
 
void SupervisedTrain(struct NeuralNetwork * Network)
{
	int i,j,k,l,m,loop_Count = 0;
	long time1,time2;
	printf("Decide Training repeat Count\n");
	scanf("%d",&Training_Count);
 
	for ( ; ; )
	{
		gettimeofday(&Ttime,NULL);
		time1 = Ttime.tv_usec; 
		for ( l = loop_Count + 1 ; l <= Training_Count ; l++)
		{
			if ( l % 100 != 0 )
			{		
 
				Run( Network );
				for ( i=0 ; i < Network->Layer[Network->LayerCount-1].NeuronCount ; i++) // loop output layer's neuronCount times
				{
	// gain the new delta value of the Output Layer's Neurons
	// if the resulting output is 0 or 1, the delta value = 0,
	// but that never happens because of the Activation function
	// if the resulting output equals the desired data (what we put in), the delta value is also 0
	// the delta value drives the learning
					Network->Layer[Network->LayerCount -1 ].Neuron[i].Delta =  Network->Layer[Network->LayerCount-1].Neuron[i].Value * (1 - Network->Layer[Network->LayerCount - 1].Neuron[i].Value ) * (OutTch[i] - Network->Layer[Network->LayerCount-1].Neuron[i].Value) ;
					for(j = Network->LayerCount - 2 ; j > 0 ; j--) // from Last hidden Layer to First hidden Layer
					{
		//delta values of the hidden layers' Neurons
						for( k =0 ; k < Network->Layer[j].NeuronCount ; k++)  // loop each Layer's NeuronCount times
		// a hidden layer's delta is determined by the next layer's Delta, the connection Weight, and the Value
		// all the Deltas influence each other
						{
 
							Network->Layer[j].Neuron[k].Delta = Network->Layer[j].Neuron[k].Value * (1 - Network->Layer[j].Neuron[k].Value) * Network->Layer[j+1].Neuron[i].Dendrite[k].Weight * Network->Layer[j+1].Neuron[i].Delta;
 
						}				
					}
				}
		//i > 0 because the input layer has no bias
				for(i = Network->LayerCount - 1 ; i > 0 ; i--) //from the output layer to the first hidden layer
				{
					for(j = 0 ; j < Network->Layer[i].NeuronCount ; j++)
					{
 
						Network->Layer[i].Neuron[j].Bias = Network->Layer[i].Neuron[j].Bias + (Network->LearningRate * 1 * Network->Layer[i].Neuron[j].Delta);
					}				
				}
				for(i = Network->LayerCount - 2 ; i > 0 ; i--)
				{
	//adjusting connection weight
					for( j = 0 ; j < Network->Layer[i].NeuronCount ; j++)
					{
						for( k = 0 ; k < Network->Layer[i].Neuron[j].DendriteCount ; k++)
						{
							Network->Layer[i].Neuron[j].Dendrite[k].Weight = Network->Layer[i].Neuron[j].Dendrite[k].Weight + Network->LearningRate * Network->Layer[i-1].Neuron[k].Value*Network->Layer[i].Neuron[j].Delta;
						}	
					}
				}
			}
 
			else
			{
				gettimeofday(&Ttime,NULL);
				time2 = Ttime.tv_usec;
				TrainingTime += (time2 - time1);
				ShowResult(Network);
				//need to learn about void func pointer
			}
		}
 
		ShowResult(Network);
 
		printf("Do you want procceed more Training? If you want type ' 1 '\n");
		scanf("%d",&j);
		if(j == 1)
		{
			printf("How many times do you want repeat?\n");
			scanf("%d",&i);
			loop_Count=Training_Count;
			Training_Count+=i;
		}
		else
			break;
	}
 
}
 
void ShowResult(struct NeuralNetwork * Network)
{
	int i,j,k;
 
 
 
	FILE *fOutput,*fHiddenLayerValue,*fWeight,*fBias,*fDelta;
 
	fOutput=fopen("Output.txt","a");
	fprintf(fOutput,"\n%12d   %6.3lf",Training_Count,((double)TrainingTime)/1000);
	for(i = 0 ; i < Network->Layer[Network->LayerCount -1].NeuronCount ; i++)
	{
		fprintf(fOutput,"   %.2lf       %.2lf   ",OutTch[i],Network->Layer[Network->LayerCount-1].Neuron[i].Value);
	}
	fclose(fOutput);
 
	fHiddenLayerValue=fopen("HiddenLayerValue.txt","a");
	fprintf(fHiddenLayerValue,"\n%12d   %6.3lf",Training_Count,((double)TrainingTime)/1000 );
	for( i = 1 ; i < Network->LayerCount -1 ; i++)
	{ 		
		for( j = 1 ; j <= Network->Layer[i].NeuronCount ; j++)
		{
			fprintf(fHiddenLayerValue, "%18.2lf  ",Network->Layer[i].Neuron[j-1].Value);
		}
	}
	fclose(fHiddenLayerValue);
 
	fWeight = fopen("Weight.txt","a");
	fprintf(fWeight,"\n%12d   %6.3lf",Training_Count,((double)TrainingTime)/1000);
	for( i = 1 ; i <= Network->LayerCount -1 ; i++)
	{
		for( j = 0 ; j < Network->Layer[i].NeuronCount ; j++)
		{
			for( k = 0 ; k < Network->Layer[i].Neuron[j].DendriteCount ; k++)
			{
				fprintf(fWeight,"%15.2lf    ",Network->Layer[i].Neuron[j].Dendrite[k].Weight);
			}
		}
	}
	fclose(fWeight);
 
	fBias=fopen("Bias.txt","a");
	fprintf(fBias,"\n%12d   %6.3lf",Training_Count,((double)TrainingTime)/1000);
	for(i = 1 ; i < Network->LayerCount ; i++)
	{
		for( j = 0 ; j < Network->Layer[i].NeuronCount ; j++)
		{
			fprintf(fBias,"%11.2lf   ",Network->Layer[i].Neuron[j].Bias);
		}	
	}
	fclose(fBias);
 
	fDelta=fopen("Delta.txt","a");
	fprintf(fDelta,"\n%12d   %6.3lf",Training_Count,((double)TrainingTime)/1000);
	for(i = 1 ; i < Network->LayerCount-1 ; i++)
	{	
		for(j = 0 ; j < Network->Layer[i].NeuronCount ; j++)
		{
			fprintf(fDelta,"%12.2lf   ",Network->Layer[i].Neuron[j].Delta);
		}
	}
	fclose(fDelta);
 
	printf("number of repetion is %d\n",Training_Count);
	printf("Desired Output is\n");
	//the number of training data equals the number of Output Layer Neurons
	for( i = 0 ; i < Network->Layer[Network->LayerCount - 1].NeuronCount ; i++)
	{
		printf(" %.2lf  ",OutTch[i]);
	}
 
	printf("\n after %d times Output Result is\n",Training_Count);
	for( i = 0 ; i < Network->Layer[Network->LayerCount - 1].NeuronCount ; i++) 
	{
		printf(" %.2lf  ",Network->Layer[Network->LayerCount - 1].Neuron[i].Value);
	}
	printf("\n");
 
 
	printf(" after %d times  Hidden_Layers NeuronsValue\n",Training_Count);
	for(i = 1 ; i < Network->LayerCount - 1 ; i++)
	{
		printf("after %d times  Layer[%d].NeuronsValue\n",Training_Count,i);
 
		for( j =0 ;j < Network->Layer[i].NeuronCount ; j++)
		{
			printf(" %.2lf  ",Network->Layer[i].Neuron[j].Value);
		}
	}
 
	printf("after %d times  Neurons_Dendrite's Weight\n",Training_Count);
	//Connection Weight
	for(i = 1 ; i < Network->LayerCount ; i++)
	{
 
		//the Output Layer's Weight has no meaning, because the Output is not connected to the Input
		for( j =0 ;j < Network->Layer[i].NeuronCount - 1 ; j++)
		{
			printf("after %d times Layer[%d].Neuron[%d].Dendrites\n",Training_Count,i,j);
 
			for( k = 0 ; k < Network->Layer[i].Neuron[j].DendriteCount ; k++)
			{
				printf(" %.2lf  ",Network->Layer[i].Neuron[j].Dendrite[k].Weight);
			}
		}
	}
 
	printf("after %d times NeuronsBias\n",Training_Count);
	//Layer[0], input Layer has no Bias
	for(i = 1 ; i < Network->LayerCount ; i++)
	{
		printf("after %d times Layer[%d].NeuronsBias\n",Training_Count,i);
			for( j =0 ;j < Network->Layer[i].NeuronCount ; j++)
		{
			printf(" %.2lf  ",Network->Layer[i].Neuron[j].Bias);
		}
	}
	printf("after %d times NeuronsDelta\n",Training_Count);
	for(i = 1 ; i < Network->LayerCount ; i++)
	{
		printf("after %d times Layer[%d].NeuronsDelta\n",Training_Count,i);
 
			for( j =0 ;j < Network->Layer[i].NeuronCount ; j++)
		{
			printf(" %.2lf  ",Network->Layer[i].Neuron[j].Delta);
		}
	}
}
 
void Free_Memory(struct NeuralNetwork * Network)
{
	int i,j;
	for( i = 0 ; i < Network->LayerCount ; i++)
	{
		for(j = 0 ; j < Network->Layer[i].NeuronCount ; j++)
		{
 
			free(Network->Layer[i].Neuron[j].Dendrite);
		}
 
 
		free(Network->Layer[i].Neuron);
	}
 
	free(Network->Layer);
 
	free(Network);
 
	free(OutTch);
 
} 
Anonymous user wrote:

I can't be sure without the full code, but if the code shown above is everything, it looks like Dendrite[k] was never allocated.

	for( i = 0 ; i < NeuronCount[0] ; i++)
	{
		Network->Layer[0].Neuron[i].Dendrite = malloc(sizeof(struct Dendrites *) * NeuronCount[LayerCount-1]);

		for(j = 0; j < NeuronCount[LayerCount - 1]; j++)
		{
			Network->Layer[0].Neuron[i].Dendrite[j] = malloc(sizeof(struct Dendrites));
		}
	}
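
With the declaration in the posted code, struct Dendrites * Dendrite;, the expression Dendrite[j] is a struct rather than a pointer, so the per-element malloc above would only compile under a pointer-to-pointer layout, roughly:

	struct Dendrites ** Dendrite;	/* hypothetical alternative layout: one allocation per dendrite */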

Anonymous user wrote:

7: Network->Layer[0].Neuron[6].Dendrite = (struct Dendrites *) 0x608e70
6: Network->Layer[0].Neuron[5].Dendrite = (struct Dendrites *) 0x608de0
5: Network->Layer[0].Neuron[4].Dendrite = (struct Dendrites *) 0x608d50
4: Network->Layer[0].Neuron[3].Dendrite = (struct Dendrites *) 0x608cc0
3: Network->Layer[0].Neuron[2].Dendrite = (struct Dendrites *) 0x608c30
2: Network->Layer[0].Neuron[1].Dendrite = (struct Dendrites *) 0x608ba0
1: Network->Layer[0].Neuron[0].Dendrite = (struct Dendrites *) 0x608b10

With these allocations in place, execution returns to main and then goes straight to the loop where the problem occurs.

Anonymous user wrote:

(gdb) display Network->Layer[0].Neuron[4].Dendrite[3]
1: Network->Layer[0].Neuron[4].Dendrite[3] = {Weight = 0}
(gdb) display Network->Layer[0].Neuron[4].Dendrite[2]
2: Network->Layer[0].Neuron[4].Dendrite[2] = {Weight = 0}
(gdb) display Network->Layer[0].Neuron[4].Dendrite[1]
3: Network->Layer[0].Neuron[4].Dendrite[1] = {Weight = 0}
(gdb) display &Network->Layer[0].Neuron[4].Dendrite[1]
4: &Network->Layer[0].Neuron[4].Dendrite[1] = (struct Dendrites *) 0x605d18
(gdb) display &Network->Layer[0].Neuron[4].Dendrite[3]
5: &Network->Layer[0].Neuron[4].Dendrite[3] = (struct Dendrites *) 0x605d28
(gdb) display &Network->Layer[0].Neuron[4].Dendrite[4]
6: &Network->Layer[0].Neuron[4].Dendrite[4] = (struct Dendrites *) 0x605d30
(gdb) display &Network->Layer[0].Neuron[4].Dendrite[20]
7: &Network->Layer[0].Neuron[4].Dendrite[20] = (struct Dendrites *) 0x605db0

The memory seems to have been allocated normally, so I don't know what the problem is.

Anonymous user wrote:

It needs to be fixed like this:

	for( i = 0 ; i < NeuronCount[0] ; i++)
	{
		Network->Layer[0].Neuron[i].Dendrite = malloc(sizeof(struct Dendrites) * NeuronCount[LayerCount-1]);
	}
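
The size difference is the key: sizeof(struct Dendrites *) is the size of a pointer, while sizeof(struct Dendrites) is the size of the struct itself. The posted code makes the same mistake for the larger structs too (struct Neurons and struct Layers), so those arrays end up far too small, and perfectly ordinary writes such as Dendrite[k].Weight = 0 then land outside the malloc'ed block and overwrite neighboring heap data, such as another Neuron's Dendrite pointer. That matches the symptom in the gdb output above. A minimal standalone sketch of the pitfall (not the poster's code; sizes assume a typical 64-bit ABI):

	#include <stdio.h>
	#include <stdlib.h>

	struct Dendrites { double Weight; };

	struct Neurons {
		struct Dendrites * Dendrite;
		int DendriteCount;
		double Bias, Value, Delta;
	};

	int main(void)
	{
		/* On a typical 64-bit ABI: a pointer is 8 bytes, the struct about 40. */
		printf("sizeof(struct Neurons *) = %zu\n", sizeof(struct Neurons *));
		printf("sizeof(struct Neurons)   = %zu\n", sizeof(struct Neurons));

		/* Under-allocated: room for 7 pointers (~56 bytes), not 7 structs (~280).
		 * bad[1] already extends past the end of the block, so writing
		 * bad[1].Value silently corrupts whatever the heap placed next. */
		struct Neurons * bad = malloc(sizeof(struct Neurons *) * 7);

		/* Correct: room for 7 complete structs. */
		struct Neurons * good = malloc(sizeof(struct Neurons) * 7);

		free(bad);
		free(good);
		return 0;
	}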

Anonymous user wrote:

It seems the error ultimately came from allocating the wrong size in other places as well.
Thank you.

I had been allocating with the pointer size all over the place, haha.
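
For what it's worth, sizing each allocation from the object the pointer points to makes this class of mistake much harder to write; applied to the allocations quoted in this thread, it would look roughly like this (a sketch using the same identifiers as the posted code, error handling omitted):

	/* sizeof *p is the size of whatever p points to, so the element
	 * size can never silently become a pointer size. */
	Network->Layer = malloc(sizeof *Network->Layer * LayerCount);

	Network->Layer[0].Neuron = malloc(sizeof *Network->Layer[0].Neuron * NeuronCount[0]);

	for( i = 0 ; i < NeuronCount[0] ; i++)
	{
		Network->Layer[0].Neuron[i].Dendrite =
			malloc(sizeof *Network->Layer[0].Neuron[i].Dendrite * NeuronCount[LayerCount-1]);
	}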
