Re: Memory Leak?
Quinn Wildman (CodeGear Developer Support)
2006-12-13 01:36:18 PM
Re: Memory Leak?

CommitRetaining, by definition, keeps the transaction active. To close the transaction you must commit or roll back.
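A minimal, hypothetical sketch of the difference in C++Builder terms, assuming IBX's standard TIBTransaction component; the surrounding function and setup are not from the thread:

#include <IBDatabase.hpp>   // TIBDatabase, TIBTransaction (IBX)

void __fastcall UpdateAndCommit(TIBTransaction *Tx)
{
    Tx->StartTransaction();
    // ... execute INSERT/UPDATE statements in this transaction context ...

    Tx->CommitRetaining();   // changes become permanent, but the transaction
                             // (and any datasets opened in it) stays active

    // ... more work ...

    Tx->Commit();            // changes become permanent AND the transaction ends;
                             // Tx->Active is now false (Rollback() would also end it)
}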
Craig Stuntz [TeamB]
2006-12-13 07:56:31 PM
Re: Memory Leak?
Quinn Wildman (CodeGear Developer Support) writes:
> CommitRetaining by definition keeps the transaction active.

Under READ COMMITTED, what actually happens is that the server commits the work and starts a new transaction in place of the old one. (Since the context of the old one was READ COMMITTED and you just committed your changes, the practical effect of this is simply that the server has a new transaction, but to the client it looks like the old one is still running.) Without READ COMMITTED it is as Quinn describes.

With all that said, however, I think the "leak" is on the client side in this case.

--
Craig Stuntz [TeamB] · Vertex Systems Corp. · Columbus, OH
Delphi/InterBase Weblog: blogs.teamb.com/craigstuntz
Everything You Need to Know About InterBase Character Sets: blogs.teamb.com/craigstuntz/articles/403.aspx
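For context, a hypothetical sketch (not from the post) of a READ COMMITTED IBX transaction, assuming the standard TIBTransaction parameter strings:

#include <IBDatabase.hpp>

void __fastcall ConfigureReadCommitted(TIBTransaction *Tx)
{
    Tx->Params->Clear();
    Tx->Params->Add("read_committed");  // READ COMMITTED isolation
    Tx->Params->Add("rec_version");
    Tx->Params->Add("nowait");

    Tx->StartTransaction();
    // ... do work ...
    Tx->CommitRetaining();  // the client still sees an "active" transaction;
                            // per the post, the server commits the work and
                            // quietly starts a new transaction behind the scenes
}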
Bill Todd
2006-12-15 04:57:34 AM
Re: Memory Leak?
Paul writes:
> So, what's the difference between CommitRetaining and doing nothing?

With CommitRetaining your changes are committed while the transaction continues. CommitRetaining was added to provide a way to commit a transaction without closing all of the datasets that were opened within the transaction context. If faced with this problem today, no one would even consider adding CommitRetaining to InterBase, because the right way to solve this problem is to use ClientDataSets for editing.

--
Bill Todd (TeamB)
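An illustrative sketch of the ClientDataSet pattern Bill refers to, shown with the standard TClientDataSet component in C++Builder; the function and wiring are assumptions, not code from the thread:

#include <DBClient.hpp>   // TClientDataSet

void __fastcall SaveEdits(TClientDataSet *cds)
{
    // The user edits rows in the ClientDataSet while no long-running server
    // transaction is held open. ApplyUpdates() sends the buffered changes
    // through the dataset's provider, which posts them in one short
    // transaction; the argument 0 means tolerate no update errors.
    if (cds->ChangeCount > 0)
        cds->ApplyUpdates(0);
}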
Paul
2006-12-15 05:34:10 AM
Re: Memory Leak?
"Quinn Wildman (CodeGear Developer Support)"
<XXXX@XXXXX.COM> writes:

> CommitRetaining is not a good choice. You probably want commit instead.

--
plinehan __at__ yahoo __dot__ __com__
XP Pro, SP 2; Oracle 9.2.0.1.0 (Enterprise Ed.); Interbase 6.0.1.0

When asking database-related questions, please give other posters some clues, like operating system, version of the db being used, and DDL. The exact text and/or number of error messages is useful (!= "it didn't work!"). Thanks. Furthermore, as a courtesy to those who spend time analysing and attempting to help, please do not top post.
Roger
2007-04-11 04:13:39 AM
Re: Memory Leak?
I have found this code and was wondering if there is a memory leak associated with szErrText? Memory seems to be allocated for it every time the function is called, but it is only freed in the default case.

void __fastcall GetErrorString(int nError, char* &szErrText)
{
    szErrText = (char*)malloc(255);

    switch (nError)
    {
        case LINEERR_ALLOCATED:
            strcpy(szErrText, "LINEERR_ALLOCATED");
            break;

        case LINEERR_BADDEVICEID:
            strcpy(szErrText, "Line Error - Bad Device ID");
            break;

        // ...

        default:
            free(szErrText);
            DWORD dwReturn = FormatMessage(
                FORMAT_MESSAGE_ALLOCATE_BUFFER      // source and processing options
                | FORMAT_MESSAGE_FROM_SYSTEM
                | FORMAT_MESSAGE_IGNORE_INSERTS,
                NULL,                  // pointer to message source
                nError,                // requested message identifier
                0,                     // language identifier for requested message
                (LPTSTR)&szErrText,    // pointer to message buffer
                0,                     // minimum size of message buffer
                NULL);                 // address of array of message inserts
    } // end of switch
}

Roger
Ed Mulroy
2007-04-11 04:43:53 AM
Re: Memory Leak?
It is not a memory leak. The problem is more insidious.
The calling argument of the function is a reference to a char*. Presumably the variable is meant to receive the memory which is allocated to hold the error message.

If the error matches one of the case statements, then the function returns with szErrText containing a pointer to a 255-character block allocated by a call to the C++ library function 'malloc' (or NULL if the call to 'malloc' failed).

If the error does not match, then the default statement is executed, the memory allocated with 'malloc' is freed, and 'FormatMessage' calls 'LocalAlloc' to allocate an array of characters for the error message.

Because a reference to char* is the calling argument, it is effectively used as the return value of the function, leaving it to the calling code to free the allocated memory. However, this causes a problem. Memory allocated with 'malloc' must be freed with a call to 'free'. Memory allocated with 'LocalAlloc' is not allocated via the C++ library and should be freed with a call to 'LocalFree', not with a call to 'free'. Nothing in what the function does gives the caller any indication of which must be used to free the allocated memory.

. Ed
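An illustrative sketch (not part of Ed's post) of the mismatch he describes: each allocator must be paired with its own release call, which is exactly what the caller of Roger's GetErrorString() cannot know. The caller function here is hypothetical:

#include <windows.h>
#include <stdlib.h>

void __fastcall GetErrorString(int nError, char *&szErrText);  // Roger's function

void AllocatorPairs()
{
    // C runtime heap: malloc() is released with free().
    char *p = (char*)malloc(255);
    free(p);

    // Win32 local heap: LocalAlloc() is released with LocalFree().
    char *q = (char*)LocalAlloc(LPTR, 255);
    LocalFree(q);
}

void Caller(int nError)
{
    char *szErrText = NULL;
    GetErrorString(nError, szErrText);
    // free(szErrText)?  LocalFree(szErrText)?  The caller cannot tell which
    // heap the block came from, so one of the two paths through
    // GetErrorString() will always be freed with the wrong call.
    free(szErrText);  // wrong whenever the default branch ran
}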
Roger
2007-04-11 06:22:02 AM
Re: Memory Leak?
Thanks, Ed
Remy Lebeau (TeamB)
2007-04-11 07:07:23 AM
Re: Memory Leak?
"Roger" <XXXX@XXXXX.COM>writes
> I have found this code and was wondering if there is a memory leak
> associated with szErrText?

No. The allocated memory is returned to the caller, so it is the caller's responsibility to free it when it is done using it.

> It seems to be allocated memory every time it is called but is only
> freed in the default case.

In the default case, the malloc()'ed block is freed and FormatMessage() is told (via FORMAT_MESSAGE_ALLOCATE_BUFFER) to allocate its own memory block instead. Which in itself is another bug, because now there are 2 different memory managers involved, and the caller has no way to know which one was actually used to allocate the memory block that is being returned. So there is a 50/50 risk that the caller cannot free the memory correctly.

There are three ways to fix this:

1) Change GetErrorString() to use LocalAlloc() instead of malloc(). FormatMessage() uses LocalAlloc() internally. This way, the caller can use LocalFree() regardless of which error number is processed:

void __fastcall GetErrorString(int nError, char* &szErrText)
{
    szErrText = (char*) LocalAlloc(LPTR, 256);
    if( !szErrText )
        return;

    switch( nError )
    {
        case LINEERR_ALLOCATED:
            strncpy(szErrText, "LINEERR_ALLOCATED", 255);
            break;
        case LINEERR_BADDEVICEID:
            strncpy(szErrText, "Line Error - Bad Device ID", 255);
            break;
        //...
        default:
            FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM |
                FORMAT_MESSAGE_IGNORE_INSERTS,
                NULL, nError, 0, szErrText, 255, NULL);
            break;
    }
}

{
    char *lpErrMsg;
    GetErrorString(SomeErrorCode, lpErrMsg);
    //...
    LocalFree(lpErrMsg);
}

2) Have GetErrorString() continue to use malloc(), but do not free it when calling FormatMessage(). Simply have FormatMessage() fill it in instead of allocating a new memory block. Then the caller can use free() regardless of the error code, ie:

void __fastcall GetErrorString(int nError, char* &szErrText)
{
    szErrText = (char*) malloc(256);
    if( !szErrText )
        return;
    memset(szErrText, 0, 256);

    switch( nError )
    {
        case LINEERR_ALLOCATED:
            strncpy(szErrText, "LINEERR_ALLOCATED", 255);
            break;
        case LINEERR_BADDEVICEID:
            strncpy(szErrText, "Line Error - Bad Device ID", 255);
            break;
        //...
        default:
            FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM |
                FORMAT_MESSAGE_IGNORE_INSERTS,
                NULL, nError, 0, szErrText, 255, NULL);
            break;
    }
}

{
    char *lpErrMsg;
    GetErrorString(SomeErrorCode, lpErrMsg);
    //...
    free(lpErrMsg);
}

3) Have the caller allocate its own memory buffer any way it wishes, and then have GetErrorString() simply fill it in. That way, the caller knows exactly how the memory was allocated and can free it accordingly, ie:

void __fastcall GetErrorString(int nError, char *szErrText, int iMaxLen)
{
    memset(szErrText, 0, iMaxLen);

    switch( nError )
    {
        case LINEERR_ALLOCATED:
            strncpy(szErrText, "LINEERR_ALLOCATED", iMaxLen);
            break;
        case LINEERR_BADDEVICEID:
            strncpy(szErrText, "Line Error - Bad Device ID", iMaxLen);
            break;
        //...
        default:
            FormatMessageA(FORMAT_MESSAGE_FROM_SYSTEM |
                FORMAT_MESSAGE_IGNORE_INSERTS,
                NULL, nError, 0, szErrText, iMaxLen, NULL);
            break;
    }
}

{
    char szErrMsg[256];
    GetErrorString(SomeErrorCode, szErrMsg, 256);
    //...
}

{
    char *szErrMsg = new char[256];
    GetErrorString(SomeErrorCode, szErrMsg, 256);
    //...
    delete[] szErrMsg;
}

Now, with that said, keep in mind that FormatMessage() is Unicode-enabled when the UNICODE condition is set for the project. You will notice in the examples above that I am calling FormatMessageA() directly instead of FormatMessage() generically. Since GetErrorString() is forcing the caller to always use a char* string, it should force FormatMessage() to do the same, or else the code will fail to compile when UNICODE is enabled, as FormatMessage() would be expecting a wchar_t* instead of a char*.
If you want to allow Unicode, though, then GetErrorString() should accept an LPTSTR parameter instead of a char* parameter, ie:

#ifdef UNICODE
    #define strncpy_t wcsncpy
#else
    #define strncpy_t strncpy
#endif

void __fastcall GetErrorString(int nError, LPTSTR szErrText, int iMaxLen)
{
    memset(szErrText, 0, sizeof(TCHAR) * iMaxLen);

    switch( nError )
    {
        case LINEERR_ALLOCATED:
            strncpy_t(szErrText, TEXT("LINEERR_ALLOCATED"), iMaxLen);
            break;
        case LINEERR_BADDEVICEID:
            strncpy_t(szErrText, TEXT("Line Error - Bad Device ID"), iMaxLen);
            break;
        //...
        default:
            FormatMessage(FORMAT_MESSAGE_FROM_SYSTEM |
                FORMAT_MESSAGE_IGNORE_INSERTS,
                NULL, nError, 0, szErrText, iMaxLen, NULL);
            break;
    }
}

{
    TCHAR szErrMsg[256];
    GetErrorString(SomeErrorCode, szErrMsg, 256);
    //...
}

{
    LPTSTR szErrMsg = new TCHAR[256];
    GetErrorString(SomeErrorCode, szErrMsg, 256);
    //...
    delete[] szErrMsg;
}

Gambit
Roger
2007-04-11 07:26:21 AM
Re: Memory Leak?
Wow! Remy, thanks for the comprehensive answer.
Roger
Vertuas
2007-04-16 05:24:07 AM
Re: Memory Leak?
"Alien" <XXXX@XXXXX.COM>writes
> Using Delphi 7 you need to free this memory.